TY - JOUR
AU - Häuselmann, Andreas
AB - Key Points
Affective Computing creates a new class of data (‘emotional data’) that is personal, sensitive, and intimate by nature, but not necessarily considered personal or sensitive from the legal point of view.
The information duties set out in the GDPR do not oblige controllers to inform data subjects what specific emotions they have detected about them.
Affective Computing creates tension with the accuracy principle enshrined in EU data protection law because of its disputed scientific foundation.
Data subjects seem to have few means to rectify inaccurate emotional data.
When Affective Computing is applied in the context of important decisions, the individuals concerned arguably do not have sufficient remedies to challenge the accuracy of emotional data that possibly influenced the decision in question.
Introduction
Because emotions form an important aspect of human intelligence and daily life, computer scientists have aimed to make them ‘machine-readable’ (eg to infer emotions from facial expressions and voice through computing) in order to develop computers that understand and even have emotional intelligence. This endeavour belongs to the scientific field of Affective Computing (AC), which is considered to be computing that relates to, arises from or influences emotion.1 AC is bound to be implemented, or is already utilized, in various fields, including recruiting,2 virtual assistants,3 advertising,4 and cars.5 By making emotions machine-readable, the developments in AC provide opportunities to automatically process a new type of data: emotional data.6 Naturally, the question arises of how such a new category of data and new technology fits into the framework of EU data protection law.7 Even though AC approaches are already being used in practice, the topic seems to be highly underrepresented in privacy and data protection scholarship and relevant studies.8 Hence, this article aims to provide a preliminary view, based on selected issues, of whether and to what extent EU data protection law can be applied to AC. Additionally, the article intends to raise awareness and stimulate further research. To start, the ‘Affective computing’ section outlines the basic concept of AC approaches that are currently used and describes some real-world applications and scenarios. Next, the ‘AC versus EU data protection law’ section discusses some of the implications that arise in the context of AC and EU data protection law.9 These implications include a discussion of emotional data in the light of the concepts of personal data and sensitive personal data, the tensions created by AC with regard to the transparency and accuracy principles, the right to rectification, and the right not to be subject to automated decision-making. Finally, the ‘Conclusion’ section draws conclusions and points to remaining issues and the need for further research.
Affective computing
Emotions play a pivotal role in functions considered essential to intelligence, including decision making, social interaction, perception, and learning.10 Emotional intelligence refers to the capacity to recognize and regulate emotions in an optimal manner.11 It has been claimed that if computers are to be genuinely intelligent, they need to have emotional intelligence.12 The field of AC refers to computing that relates to, arises from or influences emotion.13 AC is a scientific and engineering endeavour inspired by psychology, neuroscience, linguistics, and related areas.14 It can be considered a sub-field of Artificial Intelligence (AI) and has therefore recently been called ‘Emotional AI’.15 There is no commonly agreed definition of emotion in any of the disciplines that study this phenomenon.16 For the purpose of this article,17 the term emotion refers to the six most commonly used emotion categories18 in emotion research: anger, disgust, fear, happiness, sadness, and surprise.19 The human body responds to emotions through various signals, which are manifested in physical and physiological forms. The former include facial expressions, voice intonation, gestures, and movements, whereas the latter relate to respiration, pulse rate, skin conductance, body temperature, and blood pressure.20 Approaches in AC aim to detect emotions derived or inferred from such signals. In this article, the focus lies on AC approaches that detect emotions from facial expressions and speech, since they can be deployed relatively easily compared to more invasive approaches (eg those based on physiological factors such as pulse rate or skin conductance). The following sub-sections outline the concepts of emotion detection from facial expressions and voice (including speech).
Facial expressions
Facial expressions are likely the most natural way in which humans express their emotions.21 The leading approach to detecting and classifying emotions from facial expressions is the Facial Action Coding System (FACS), developed to measure facial activity and to classify facial motions into the six basic emotions previously mentioned: anger, disgust, fear, joy, sadness, and surprise.22 Computer scientists use computer vision and graphics to automatically analyse and synthesize facial expression in Automated Face Analysis (AFA) systems. AFA systems seek to detect emotions by typically following four steps: face detection, face registration, feature extraction, and classification or regression.23
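Purely by way of illustration for readers less familiar with the underlying technology, the sketch below (in Python) mirrors the general shape of these four steps. It is a heavily simplified toy example, not the pipeline of Kairos, Amazon Rekognition, HireVue or any other system mentioned in this article; the pre-trained classifier file ‘emotion_classifier.joblib’, its label mapping, and the input file name are hypothetical placeholders.

```python
# Illustrative toy AFA pipeline: detection, registration, feature extraction,
# classification. Not the implementation of any commercial system discussed here.
import cv2                      # OpenCV: face detection and image handling
import numpy as np
from joblib import load         # loads a (hypothetical) pre-trained classifier

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

# Step 1: face detection with a stock Haar cascade shipped with OpenCV.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def detect_face(gray):
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return faces[0] if len(faces) else None        # (x, y, w, h) of the first face

# Step 2: crude 'registration': crop the face and resize it to a canonical size.
def register(gray, box, size=48):
    x, y, w, h = box
    return cv2.resize(gray[y:y + h, x:x + w], (size, size))

# Step 3: feature extraction: here simply the normalised pixel intensities.
def extract_features(face):
    return (face.astype(np.float32) / 255.0).flatten().reshape(1, -1)

# Step 4: classification by a previously trained model (hypothetical file,
# assumed to map a feature vector to an index into EMOTIONS).
clf = load("emotion_classifier.joblib")

def infer_emotion(image_path):
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    box = detect_face(gray)
    if box is None:
        return None                                # no face found in the frame
    features = extract_features(register(gray, box))
    return EMOTIONS[int(clf.predict(features)[0])]

print(infer_emotion("frame.jpg"))                  # eg 'happiness', or None
```

Even this toy version makes clear that the emotion label is the output of a statistical classifier applied to extracted features rather than a direct observation of an inner state, a point that matters for the discussion of the accuracy principle below.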
The Emotion Analysis Application Programming Interface (API)24 from the company Kairos claims to understand emotion by analysing footage provided by the user (eg a retailer) through face analysis algorithms that return values for the six basic emotions.25 The Emotion Analysis API is offered to casinos, restaurants, retail merchants, and companies in the hospitality sector.26 The Amazon ‘Rekognition’ API offers similar image and video analysis for the purpose of emotion detection, arguably with improved accuracy.27 A notable EU-funded research project concerning an automated border control system called iBorderCtrl ‘analyses the micro-gestures of travellers to figure out if the interviewee is lying’.28 Lie detector systems are essentially based on AC technology.29 The HireVue video interview software claims to be able to evaluate a candidate’s employability, including personality traits, in less than 30 minutes30 by means of on-demand video interviews in which job candidates record responses to structured interview questions.31 The software analyses the emotions a candidate portrays during the video assessment32 based on AC and AFA components. Following the automated analysis of the candidate’s recorded responses, the software assigns the candidate an average rating and an average recommendation. More than a hundred companies have already used HireVue on more than a million applicants.33
Voice and speech
Emotions of a person may be inferred from speech signals.34 Speech recordings constitute a rich source of personal and sensitive data and can be used for emotion detection and classification.35 Research has demonstrated specific associations between emotions such as fear, anger, sadness, and joy and acoustic signal features of speech such as pitch, voice level, and speech rate.36 Such acoustic features are decomposed in order to associate them with the emotional state of the speaker.37 In other words, approaches in AC extract acoustic signal features that characterize emotional speech and map them to emotion representations.38 Speech Emotion Recognition (SER) systems aim to recognize the emotional aspects of speech, irrespective of the semantic contents.39 AudEERING’s devAIce claims to be able to identify several speakers and detect over fifty emotional states. It can be integrated into any platform and embedded into any kind of mobile device application, as well as wearables and hearables.40 Amazon patented technology that enables its virtual assistant Alexa to recognize the user’s emotional state derived from the user’s voice.41 SER applications are also used in call centres for in-call speaking guidance, for example by advising the call agent to speak with more empathy when the customer seems to be angry according to the automated voice analysis.42
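Again purely as an illustration, and not as a description of devAIce, Alexa or any call-centre product named above, the sketch below shows the general logic of such an SER pipeline: a recording is summarised as a small vector of acoustic statistics (spectral envelope, loudness, a crude voicing proxy), which a pre-trained classifier maps to one of the basic emotion categories. The model file ‘ser_classifier.joblib’ and the input file name are hypothetical placeholders, and real systems rely on far richer feature sets and models.

```python
# Illustrative toy SER pipeline: acoustic feature extraction followed by
# classification. Not the method of any commercial system discussed here.
import numpy as np
import librosa                  # widely used audio feature-extraction library
from joblib import load

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def acoustic_features(path):
    """Summarise a recording as a fixed-length vector of acoustic statistics."""
    y, sr = librosa.load(path, sr=16000)                 # mono waveform at 16 kHz
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)   # spectral envelope
    rms = librosa.feature.rms(y=y)                       # loudness/energy proxy
    zcr = librosa.feature.zero_crossing_rate(y)          # crude voicing proxy
    parts = [mfcc.mean(axis=1), mfcc.std(axis=1),
             rms.mean(axis=1), rms.std(axis=1),
             zcr.mean(axis=1), zcr.std(axis=1)]
    return np.concatenate(parts).reshape(1, -1)

# Hypothetical pre-trained classifier, assumed to map the feature vector
# to an index into EMOTIONS.
clf = load("ser_classifier.joblib")

def infer_emotion(path):
    return EMOTIONS[int(clf.predict(acoustic_features(path))[0])]

print(infer_emotion("call_segment.wav"))                 # eg 'anger'
```

As with facial analysis, the detected emotion is a model-based inference from proxies such as loudness and pitch-related statistics, which is why the accuracy concerns discussed later apply with equal force to speech-based approaches.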
AC versus EU data protection law
It is not clear to what extent EU data protection law, particularly the GDPR, is fit for purpose when applied to the processing of emotional data based on AC approaches. This claim is underpinned by several legal implications as described below.43
Emotional data and personal data
The GDPR only applies to the processing of personal data.44 Although emotions are felt as personal because they are related to a person’s values45 and express what a person cares about,46 such information is not necessarily considered personal data from a legal perspective.47 Processing of information related to emotions and feelings does not fall under the material scope of the GDPR where the individual concerned is neither identified nor identifiable. Thus, despite the personal nature of emotions and their inherent relationship with personhood,48 emotional data does not necessarily constitute personal data. Personal data is defined as a concept with four elements: (i) any information (ii) relating to (iii) an identified or identifiable (iv) natural person.49 Element (i) reflects the aim to assign a wide scope to the concept of personal data and potentially encompasses all kinds of information.50 The form of the information seems to be irrelevant, as the information may be available ‘in written form or be contained in, for example, a sound or image’.51 This is important because emotions or feelings may be expressed and detected not only in words, but also in speech and voice, facial expressions, gestures, and physiological information (eg measurement of skin conductance in order to determine anxiety, or measurement of heart rate). The second element (relating to) is also interpreted broadly by the Court of Justice of the European Union (CJEU) and is satisfied ‘where the information, by reason of its content, purpose or effect, is linked to a particular person’.52 Whether information related to emotions and feelings constitutes personal data thus ultimately depends on the third element, namely whether the person concerned is in fact identified or identifiable. Regarding this element, a flexible approach is taken.53 This is emphasized by the wording of Article 4(1) and recital 26, particularly the references to ‘singling out’, ‘directly or indirectly’, and ‘either by the controller or by another person’.54 With regard to identification, recital 26 states that account should be taken of ‘all the means reasonably likely to be used’. According to the CJEU, this criterion would not be met if the identification is prohibited by law (eg criminal law provisions that prohibit re-identification) ‘or practically impossible on account of the fact that it requires a disproportionate effort in terms of time, cost, and man-power, so that the risk of identification appears in reality to be insignificant’.55 The last element of the concept of personal data makes clear that data on corporations or other legal/juristic persons56 as well as artificial creatures (eg robots) are not protected by the GDPR. Overall, personal data seems to be a broad concept.57 However, when applied to AC, the concept appears to be rather vague with regard to the identification requirement.58 In other words, it is questionable whether the processing of data in the context of AC involves personal data.
By way of example, a billboard installed at Piccadilly Circus in London used AC and facial detection technology to broadcast adverts based on people’s age, gender, and mood.59 The corresponding privacy policy states that no images of the faces are stored by the system and that the gathered information (age, sex, and mood) is kept together with information from other people’s faces, without any possibility of singling out individuals. In short, Landsec, the owner of the screen, concludes that ‘we cannot, nor would never want to, use the technology to identify someone’s face’.60 If this holds true,61 the GDPR will not be applicable because passers-by are not identified or identifiable, although, arguably, rather personal and sensitive information (emotional data) is being processed. The same technique is being used by retailers to capture data about the age, gender, and observed emotions of customers.62 Here too, the provider of the solution argues that no personal data are processed because the facial analysis is performed on the device and only anonymous data such as age and gender are stored.63 Therefore, the captured emotions ‘belong to no one’ because they cannot be linked to individuals.64 In a similar use case, where a supermarket deployed software to analyse the faces of customers who looked at an ad screen in order to determine emotional states, age, and sex,65 the data protection supervisory authority in Bavaria stated that this does not amount to the processing of personal data because only metadata66 is collected within this process.67 The Irish Data Protection Commissioner took a similar view with regard to facial detection systems that can recognize the mood of individuals standing in front of a smart ad board.68 However, it could be argued that facial detection systems serving ads to individuals based on their emotional states ‘single out’ these individuals in the sense of recital 26 GDPR. Whereas singling out implies identification, and such identification would be fairly easy in an online marketing context,69 the same cannot be said with regard to an offline context such as the ones described above. Although such targeted ‘offline’ ads single out individuals, the controller that deploys the system is likely to argue that it does not have the means to identify the individuals concerned because only anonymous metadata will be stored.70 Hence, the material scope of the GDPR is not triggered although personal information, namely emotions, is processed.
Emotional data and sensitive data
By means of AC, machines may gain access to the emotional life of individuals, information that is highly personal, intimate, and private.71 In fact, all emotions are by definition personal, and revealing them could make an individual potentially more vulnerable.72 Emotions are felt as personal73 and express what a person cares about.74 Because there is an inherent relationship between emotions and personhood75 and privacy is considered to be fundamental to the maintenance of human dignity and the boundary of one’s personhood,76 information regarding emotions should be deemed to be of a sensitive and intimate nature.77 When emotional data constitute personal data because the data subject is identified or identifiable, the question arises whether such data are specifically protected.
Considering the special categories of personal data as defined in Article 9(1) GDPR and its corresponding recitals,78 data relating to emotion is arguably not protected as a special category of personal data under the GDPR, despite its sensitive and intimate nature.79 This also holds true when biometric data is used for AC in order to detect the emotions of the individual concerned, for example in the context of AFA systems and emotion detection based on an individual’s voice and speech as described above.80 Biometric data is only to be regarded as a special category of personal data if it is used for the purpose of uniquely identifying an individual,81 in other words ‘processed through a specific technical means allowing the unique identification or authentication of a natural person’.82 According to regulatory guidance, biometric identification typically involves ‘the process of comparing biometric data of an individual (acquired at the time of the identification) to a number of other biometric templates stored in a database (i.e. a one-to-many matching process)’.83 In the case of HireVue,84 the video interview software does not process facial expressions (i.e. biometric data) for the purpose of uniquely identifying the applicant as required by Article 9(1) GDPR, but to detect the emotions an applicant portrays during the video assessment. Identification is achieved through other means beforehand, namely when the candidate reveals his or her name or other identifying information. Likewise, an AC application that advises the call agent to speak with more empathy because the customer seems to be angry according to the automated voice analysis does not process biometric data for identification purposes. In some cases, the processing of emotional data could amount to the processing of a special category of personal data,85 for example, data concerning health. Health data does not only include information on physical or mental health, but also ‘any information (…) on the physiological or biomedical state of the data subject independent of its source’.86 AFA systems that try to detect depression by analysing an individual’s facial expressions in videos arguably process (mental) health data, even if the data subject concerned is completely healthy. Another example could be AC applications that derive emotional states of individuals from physiological data such as heart rate, blood pressure, and skin conductance. As outlined earlier, such data fall under the definition of health data and are protected as a special category of personal data according to the GDPR.87 Given the current developments in AC, a debate is needed on whether data relating to emotions merits specific protection given its sensitive and intimate nature and, if so, how the lacunae in EU data protection law might be filled. Interestingly, the draft of the ePrivacy Regulation, as proposed by the European Commission, assigns information relating to emotions a highly sensitive character.88 This implies that emotional data might be subject to different levels of protection, depending on the applicable laws in question. In a situation where both the material scope of the GDPR89 and that of the future ePrivacy Regulation are triggered, emotional data will be protected as sensitive data according to the ePrivacy Regulation, but not according to the GDPR.90 Such a situation would be utterly confusing and disadvantageous not only for data subjects, but also for companies that need to comply with both the GDPR and the ePrivacy Regulation.
In addition, regulating emotional data through different levels of protection does not seem to contribute to legal certainty.
The transparency principle
The transparency principle inherent in Article 5(1)(a) GDPR requires that it must be transparent to natural persons ‘that personal data concerning them are collected, used, consulted or otherwise processed’.91 It is further specified in Articles 12–14 GDPR in the form of obligations on the controller to provide certain information to the data subject. Article 13 GDPR applies when the personal data are collected from the data subject, and Article 14 applies when the personal data have not been obtained from the data subject (eg from third-party controllers, data brokers, or publicly available sources).92 It is important to note that the GDPR requires controllers to inform data subjects about the purposes of the processing for which the personal data are intended.93 With regard to AC applications, transparent processing would presuppose that the individual is able to see what emotion the machine recognized, a requirement advocated by the pioneer of the field of AC.94 As the analysis below shows, this does not seem to be the case in practice. With emotional data, the question arises whether such data must be considered as being collected from the data subject or rather as inferred from personal data provided by the data subject (eg from the facial expressions of the data subject). According to WP29, the former applies when a data subject consciously provides the personal data to the controller (eg by completing an online form) as well as when a data controller collects data by observation (eg through automated data capturing devices or software such as cameras, network equipment, RFID, or other types of sensors).95 Some scholars have argued that inferred data cannot be communicated to the data subject because it has not yet been created at the time of collection, and that notification duties under Article 13 GDPR will never be triggered because the data is not obtained from the data subject.96 Arguably, emotional data are not consciously provided by the data subject and, because highly complex computing is needed to detect them, they cannot be considered data that is simply observed. Rather, they must be considered to be inferred from personal data provided by the data subject.
With regard to inferred personal data, WP29 stated:
If the purpose includes the creation of inferred personal data, the intended purpose of creating and further processing such inferred personal data, as well as the categories of the inferred data processed, must always be communicated to the data subject at the time of collection, or prior to the further processing for a new purpose in compliance with Article 13.3 or Article 14.4.97
When this guidance is applied to processing by the AC-powered HireVue software, the employer would need to inform the candidate about the intended purpose of creating inferred data and the category of inferred data at the commencement phase of the processing cycle.98 Hence, the prospective employer need not inform the candidate about the specific emotions detected by the system after the video assessment has taken place, but could simply refer to the purpose of creating inferred data (eg to ‘assess the suitability of the candidate for the position’) and the category of ‘emotional data’ in the privacy notice.99 The same would apply if the prospective employer received the emotional data from an independent assessment provider (ie not obtained from the data subject), because Article 14, which would be applicable in such a scenario, simply requires the employer to inform the candidate about the purposes of processing and the ‘categories of personal data’ concerned.100 Here too, it would be sufficient if the employer referred to the category of ‘emotional data’, and the candidate would never be aware of what specific emotions the software detected.101 The same conclusion can be drawn for Speech Emotion Recognition (SER) applications that are used in call centres for customer emotion detection by means of automated speech analysis. Callers will not know what emotions have been detected by the system and to what extent such information has been used to influence them or the outcome of their request. In addition, it might be doubted whether it is in line with the fairness, purpose limitation, and data minimization principles102 to analyse the emotions an applicant portrays during a video assessment or to analyse customer emotions by means of automated speech analysis. Also, the principle of proportionality, which forms part of the requirement for a legitimate purpose103 and of the data minimization principle,104 should be taken into account. This leads to a significant loophole because the candidate and the caller do not know what specific emotions are being detected about them, and therefore cannot verify the accuracy of such data. I use the term ‘loophole’ because, in my view, data subjects should be able to see what emotion the machine recognized, particularly given the sensitive nature of emotional data.105 Without that knowledge, the candidate and the caller are hardly able to obtain the rectification of inaccurate data. Even if they exercise their right of access according to Article 15(1) GDPR, a generic answer in the form of a general description of the categories of personal data (again, ‘emotional data’) provided by the controller would be sufficient, leading to the same outcome, namely no specific information regarding the emotions detected by the systems. The option of last resort would be to request a copy of the personal data undergoing processing according to Article 15(3) GDPR,106 which could entail a copy of the report generated by the HireVue software or, alternatively, the detected emotional data.
It seems unlikely that the employer would in fact be obliged to provide a copy of the report, because Article 15(3) does not require the provision of a copy of the document itself.107 The CJEU found that there is no right to a copy of the document containing the personal data,108 and in a recent case in the Netherlands the court likewise pointed out that the GDPR does not grant a right to obtain a copy of documents, but rather a right to obtain a copy of personal data.109 The information to be provided must enable the data subject to verify the correctness of the personal data undergoing processing.110 Applied to HireVue, the employer should provide the candidate with the detected emotion category in order to allow him or her to verify the correctness of the detected emotional data. However, with regard to emotion detection from speech, it is questionable whether in-call speaking guidance automatically creates a report including the detected emotions, and whether the call is recorded in the first place. If the AC system used is interactive and responds in real time to emotions without recording the call, there is no control for the data subject and there are few possibilities for supervisory authorities. Moreover, the right to request a copy is not an absolute right111 because it must not adversely affect the rights or freedoms of others, including trade secrets or IP rights.112 Hence, where the employer or the operator of the call centre can convincingly demonstrate that the disclosure of the reports would infringe their trade secrets or IP rights, neither the candidate nor the caller will receive the information sought.
The accuracy principle
The GDPR states that personal data being processed must be accurate.113 The accuracy principle intends to protect the individual concerned from being irrationally or unfairly treated on the basis of wrong and inaccurate representations.114 According to European data protection authorities, accurate means ‘accurate as to a matter of fact’.115 Similarly, the UK’s data protection supervisory authority states that controllers must ‘ensure personal data are of adequate precision in light of the context of processing’.116 Personal data generated by algorithms, including by means of AC techniques, are subject to this accuracy principle.117 According to the case law of the Court of Justice of the EU, the term ‘any information’ encompasses both objective and subjective information, including opinions and assessments, provided it ‘relates’ to the data subject.118 Opinions may include inferences, guesses, and value-judgements,119 and consequently also encompass information regarding emotions of data subjects inferred by AC systems. The level of precision required in order to assess the accuracy of personal data depends on the context, namely on the purpose of the processing.120 It therefore seems reasonable to argue that where AC-powered software is used in the context of decisions with a significant impact on the lives of the data subjects concerned,121 the accuracy threshold should be higher than where AC is applied for the purpose of direct marketing. However, different studies have challenged the idea that a person’s emotional state can accurately be inferred from his or her facial movements.122 A recent and extensive study on the topic suggests that facial movements are not diagnostic displays that reliably and specifically signal particular emotional states regardless of context, person, and culture.
According to this study, it is not possible to confidently infer happiness from a smile, anger from a scowl, or sadness from a frown because these emotion categories are more variable in their facial expressions.123 Another study revealed that the accuracy levels of eight commercial automatic classifiers used for facial affect recognition were consistently lower when applied to spontaneous affective behaviour compared to ‘posed’ affective behaviour. Recognition accuracy rates of the tested classifiers varied from 48 per cent to 62 per cent.124 Other means of detecting emotions, for example those based on physiological measurements or voice, have also been challenged due to a lack of scientific consensus on whether such methods can ensure accurate or even valid results.125 While humans can efficiently recognize emotional aspects of speech, doing so automatically is still an ongoing subject of research. Research in this context has been restricted to laboratory conditions with noise-free, uncompressed, and full-bandwidth audio recordings. Recent studies, however, indicate that speech compression, filtering, band reduction, and the addition of noise reduce the accuracy significantly.126 Nevertheless, Speech Emotion Recognition (SER) is already being applied ‘in the wild’ and emotions are inferred from speech recorded or streamed in natural and daily-life environments, likely with significantly lower accuracy rates. The AI Now Institute at New York University described AC as being based on ‘debunked pseudoscience’127 and recommended that ‘regulators should ban the use of affect recognition in important decisions that impact people’s lives and access to opportunities’.128 Processing of personal data by means of AC applications therefore seems to be in direct contravention of the accuracy principle as set out in the GDPR, or at least creates severe tensions with this principle. Moreover, companies acting upon such arguably inaccurate data and treating individuals in a certain way could have severe implications for the individuals concerned.129
The right to rectification
The data subject has the right to obtain the rectification of inaccurate personal data concerning him or her or to have incomplete personal data completed.130 With regard to accuracy, personal data could be divided into three categories: (i) factual data that accurately reflect a known reality about an individual, (ii) counter-factual data that inaccurately reflect a known reality about an individual, and (iii) data that cannot be described as completely falling under the former or the latter.131 I argue that emotional data inferred by AC systems usually fall into the last category. Arguably, emotional data cannot be factual or counter-factual data because emotions can only be a known reality for the natural person who has these emotions (and not for other parties or entities). The reason for this is that every individual has her or his own personal experience of emotion.132 Emotional data derived from AC-powered applications represent unproven and factually uncertain inferences about the emotions of an individual. As described in the section concerning the accuracy principle, it is likely that emotional data are inaccurate. Hence, the data subject concerned should, at least in principle, have the right to rectify such inaccurate data. However, two obstacles must be overcome in order to exercise this right: (i) the scope of the right according to CJEU case law and (ii) the need to sufficiently demonstrate that emotional data are inaccurate.
First, it has been argued that inferred data, even if considered personal data, ‘cannot be rectified under data protection law and can only be contested if there is a procedure in place to contest the evaluation’.133 In other words, it has been claimed that the right to rectification is limited to assessing the accuracy and completeness of the input data, and excludes the output data, including opinions.134 This point of view is based on standing jurisprudence of the European Court of Justice with regard to the remit of data protection law.135 In its case law, the CJEU arguably limited the remit of data protection law and excluded the assessment of the accuracy of inferential analytics from its scope.136 Applied to the accuracy of emotional data inferred by AC-powered systems, the result would be that inaccurate emotional data cannot be rectified under data protection law. In practice, this means that the data subject whose emotional data is being processed cannot rectify the opinion of an AC-powered system that deemed the data subject to be sad, even if the data subject was in fact not sad at all.137 However, one must bear in mind the context of the cited cases. For example, the right to rectification should obviously not result in situations where a candidate would be allowed to correct his answers in an exam retroactively138 or where an individual could rectify the content of a legal analysis in the context of an immigration case.139 Considering the CJEU’s teleological approach to interpreting data subject rights,140 it seems reasonable to argue that the CJEU could rule differently when a data subject requests the rectification of inaccurate emotional data processed by a private company.141 Furthermore, regulatory guidance states that derived or inferred data constitute (new) personal data142 and that the right to rectification applies not only to the ‘input personal data’ (the face/picture or voice/speech used for emotion detection) but also to the ‘output data’ (new data: the emotion detected by the AC system).143 Second, the term rectification implicitly relies upon the notion of verification, in the sense that a record may demonstrably be shown to be inaccurate or incomplete and consequently be corrected by the individual concerned. This might be a straightforward task when the personal data in question are verifiable (such as a name, date of birth, or email address).144 With regard to inferred data, it is generally impossible for data subjects to prove that such data are wrong without access to the tools used to infer the data.145 In the case of emotional data, the question arises if and how such data can in fact be verified. Regarding the former (if), only the individual concerned is capable of verifying whether or not emotional data is accurate, because it relates to her or his personal experience of the emotion in question. Hence, the question regarding the ‘if’ can be answered in the affirmative. But how could the individual demonstrate that she or he has experienced the emotion differently than detected by the AC system? I argue that a basic principle would be needed for this, namely the principle that the data subject’s perception of the emotion in question serves as the ‘ground truth’.
The data subject’s perception of the emotion in question should serve as the ‘ground truth’ because every individual has his or her own personal experience of emotions.146 Based on this principle, it should be sufficient if the data subject simply contests the accuracy of the emotion detected by the AC system. Otherwise, the right to rectification would not satisfy its legitimate aim, leading to the processing of inaccurate emotional data without sufficient remedies for the individuals concerned. This would also be desirable from a normative perspective. Making the applicability of the right to rectification dependent on whether or not the inferred personal data is verifiable seems to be the wrong approach, because the verifiability of an inference does not determine its effect on the data subject concerned.147 Particularly with regard to decisions that are important for the data subject, such as hiring decisions, arguably ‘unverifiable’ emotional data that are incorrect from the data subject’s perspective could have significant harmful impacts.
AC and automated decision-making
In Article 22(1), the GDPR grants individuals the right ‘not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her’. The provisions in Article 22 GDPR are long and complex, and the inherent rules suffer from significant weaknesses.148 In particular, the requirement that a decision must be based ‘solely’ on an automated process excludes from the scope of Article 22 GDPR applications and systems that ‘only’ provide decisional support.149 For example, the AC-powered HireVue software analyses the emotions a candidate portrays during the video assessment150 and automatically scores the candidate with an average rating and recommendation. Subsequently, the recruiter has the discretion to decide, more precisely, to select one of the recommended candidates. In such a scenario, Article 22 GDPR will not apply because the decision-making process is not ‘solely’ automated, given the human involvement of the recruiter.151 Nevertheless, this conclusion seems problematic when a life decision about a person152 is influenced by or based on possibly inaccurate data. In the scenario described, possibly inaccurate emotional data supported the decision in some way. The person concerned seems to have little or no control over the quality of the decisional outcome and the inferences153 that the employer drew with regard to the emotional data detected by the software. More precisely, the candidate will have no right to express his or her point of view and to contest the decision according to Article 22(3) GDPR because the whole provision is not applicable. Furthermore, it could be difficult for the candidate to assess the accuracy of the emotional data detected by the software during the video assessment and, where necessary, to rectify said data, due to the issues discussed above (the right to rectify emotional data).
Within the iBorderCtrl system, an ‘Automatic Deception Detection System’ quantifies the probability of deceit in interviews by analysing interviewees’ non-verbal micro-gestures.154 There is an inherent risk of inaccuracy, namely false positives that wrongly identify the interviewee as being deceptive, which might lead to stigmatization of or prejudice against the interviewee, for example when talking to the human border guard.155 The final decision will be made by the human border guard, which renders Article 22 GDPR inapplicable.156 Here too, the person taking the decision could be unduly influenced by the possibly inaccurate output of the iBorderCtrl system.157 Additionally, interviewees who are deemed to be lying will most likely not be informed accordingly, let alone informed about the functionality and accuracy of the system.158
Conclusion
This article argued that EU data protection law is not necessarily fit for purpose when applied to the processing of emotional data based on AC approaches. It underpinned this claim by discussing selected issues that AC applications already in use today raise in the context of EU data protection law. Emotional data, an arguably new class of data created by AC applications, is not necessarily considered personal data or sensitive personal data from a legal perspective, despite its sensitive and personal nature. Individuals subject to AC processing will not be aware of the specific emotions detected by the system because the GDPR does not oblige controllers to provide specific information, but only general information (eg the category ‘emotional data’). AC creates tension with the accuracy principle enshrined in European data protection law because its accuracy is heavily disputed. It might prove very difficult for data subjects to rectify inaccurate data, both because of the scope of the right according to CJEU case law and because of the challenge of sufficiently demonstrating that inferred emotional data is inaccurate, given that emotional data is not easily verifiable or provable. When AC is applied in the context of important decisions, such as whether or not someone is hired, the individuals concerned do not have sufficient remedies to challenge the accuracy of emotional data that possibly influenced the decision in question, because the relevant provision of the GDPR is not applicable where the AC component only provides decisional support. Technology is neither good nor bad; nor is it neutral.159 The effects of new technologies such as AC cannot easily be predicted until they are extensively deployed. But once they have been deployed, they are difficult to change.160 Despite the fact that AC approaches are already being used by companies, the potential issues in the context of EU data protection law have rarely been discussed in relevant legal scholarship. This article intended to raise awareness and stimulate further research regarding the matter of AC and EU data protection law. Such further research could, for instance, explore the role of the still elusive notion of fairness in data protection law, because the fairness principle has the potential to help provide a basis for the new legal norms161 that are necessary in the context of new technologies such as AC.
Besides the apparent need for research with regard to EU data protection law, it would be beneficial to discuss the social consequences of the use of AC technology.162 Furthermore, ethical aspects of AC should also be discussed, particularly with regard to bias and discrimination,163 personal autonomy, human dignity, and chilling effects. Moreover, AC might also be relevant for research in the domain of consumer law.164 Footnotes 1 Rosalind Picard, ‘Affective Computing’ (1995) MIT Media Laboratory Perceptual Computing Section Technical Report No 321, 1 accessed 25 February 2021. 2 HireVue (n 30) below. 3 Amazon’s Alexa, Google Home. 4 Affdex accessed 25 February 2021. 5 Toyota accessed 25 February 2021 BMW and Porsche accessed 25 February 2021 Volvo accessed 25 February 2021. 6 New in the sense that AC allows the processing of emotional data by automated means. For the purpose of this article, I define emotional data as information related to emotions of an individual. Emotions refer to the six ‘basic emotions’ as described in the section ‘Affective Computing’. 7 When referring to EU data protection law, this article exclusively considers the GDPR and hence excludes other data protection laws that could be applicable (eg the ePrivacy Directive and EU Member State Laws). 8 Surprisingly, AC has not been discussed in a study conducted on behalf of the European Parliament. Giovanni Sartor and Francesca Lagioia, ‘The Impact of the General Data Protection Regulation (GDPR) on Artificial Intelligence’ (2020) accessed 25 February 2021. 9 These implications are only a preliminary selection and shall by no means provide a comprehensive overview of all the possible issues AC creates in the context of EU data protection law. For additional implications, see (n 43) below. 10 Rosalind W Picard, Affective Computing (MIT Press, Cambridge 1997) 47. 11 Aaron Ben-Ze'Ev, The Subtlety of Emotions (MIT Press, Cambridge 2000) 179. 12 Picard (n 10) preface page x. 13 Picard (n 1). 14 Rafael Calvo et al, ‘Introduction to Affective Computing’ in Rafael Calvo et al (eds), The Oxford Handbook of Affective Computing (OUP, New York 2015) 2. 15 See for example, Andrew McStay, ‘Emotional AI, Soft Biometrics and the Surveillance of Emotional Life: An Unusual Consensus on Privacy’ (2020) Vol 7 Iss 1 Big Data & Society 1–12. 16 Kevin Mulligan and Klaus R. Scherer, ‘Toward a Working Definition of Emotion’ (2012) 4(4) Emotion Review 345–537. 17 Because most of the currently available commercial applications of AC are based on those emotion categories. 18 These six emotions refer to research conducted by psychologists in the early seventies that developed the methodology of ‘basic emotions’, see Paul Ekman and Wallace v Friesen, ‘Constants across Cultures in the Face and Emotion’ (1971) 17 (2) Journal of Personality and Social Psychology 124. 19 Lisa Feldman Barrett et al, ‘Emotional Expressions Reconsidered’ (2019) 20 (1) Psychological Science in the Public Interest 1, 52. 20 Eiman Kanjo et al, ‘Emotions in Context: Examining Pervasive Affective Sensing Systems, Applications, and Analyses’ (2015) 19 Personal and Ubiquitous Computing 1197 accessed 25 February 2021. 21 Calvo et al (n 14) 4. 22 Eiman Kanjo et al (n 20). 23 Michael Valstar, ‘Automatic Facial Expression Analysis’ in Manas K Mandal and Avinash Awasthi (eds) Understanding Facial Expressions in Communication (Springer 2015) 144. 
24 A set of functions and procedures allowing the creation of applications that access features or data of an operating system, application or other service, see accessed 25 February 2021. 25 See accessed 25 February 2021. 26 Luana Pascu, ‘New Kairos Facial Recognition Camera Offers Customer Insights’ Biometric Update (Toronto, 11 September 2019) accessed 25 February 2021. 27 See accessed 25 February 2021. 28 Note that the system is only a research project funded by the EU under the H2020 programme and it remains to be seen whether it will be used at the border in the future. European Commission, ‘Smart Lie-detection System to Tighten EU’s Busy Borders’ (24 October 2018) accessed 25 February 2021. 29 Angela Chen and Karen Hao, ‘Emotion AI Researchers say Overblown Claims give their Work a Bad Name’ MIT Technology Review (Cambridge 14 February 2020) accessed 25 February 2021. 30 Nathan Mondragon, Clemens Aicholzer and Kiki Leutner, ‘The Next Generation of Assessments’ (HireVue 2019) accessed 25 February 2021. 31 See accessed 25 February 2021. 32 Mondragon et al (n 30). 33 Angela Chen, ‘The AI Hiring Industry is Under Scrutiny - but it'll be Hard to Fix’ MIT Technology Review (Cambridge 07 November 2019) accessed 25 February 2021. 34 Eiman Kanjo et al (n 20). 35 Andreas Nautsch et al, ‘Preserving Privacy in Speaker and Speech Characterisation’ (2019) 58 Computer Speech & Language 441, 442 accessed 25 February 2021. 36 Christina Sobin and Murray Alpert, ‘Emotion in Speech: The Acoustic Attributes of Fear, Anger, Sadness, and Joy’ (1999) 28 (4) Journal of Psycholinguistic Research 347. 37 Eiman Kanjo et al (n 20). 38 Chi-Chun Lee et al, ‘Speech in Affective Computing’ in Rafael Calvo et al (eds), The Oxford Handbook of Affective Computing (OUP, New York 2015) 173, 177. 39 Margaret Lech et al, ‘Real-Time Speech Emotion Recognition Using a Pre-trained Image Classification Network: Effects of Bandwidth Reduction and Computing’ (2020) 2 Frontiers in Computer Science 1, 3 accessed 25 February 2021. 40 See accessed 25 February 2021. 41 Huafeng Jin and Shuo Wang ‘Voice-Based Determination of Physical and Emotional Characteristics of Users’ US Patent Number US 10096319 B1 (Assignee: Amazon Technologies, Inc.) October 2018 accessed 25 February 2021. 42 See accessed 25 February 2021. 43 Note that AC also seems to be problematic with regard to necessity and proportionality, fundamental principles in EU data protection law, as well as data protection principles such as purpose limitation, data minimization, and accountability. However, these issues will not be discussed within this article. 44 Art 2 (1) EU General Data Protection Regulation (GDPR): Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), OJ 2016 L 119/1 further on referred to as ‘GDPR’. 45 Heather C Lench and Zakari Koebel Capenter, ‘What Do Emotions Do for Us?’ in Heather C Lench (ed) The Function of Emotions (Springer, Cham 2018) 1, 142. 46 Giovanni Stanghellini and René Rosfort, Emotions and Personhood: Exploring Fragility – Making Sense of Vulnerability (OUP, Oxford 2013) 142. 47 McStay reaches the same conclusion. McStay (n 15) 1, 3–4. 48 Stanghellini and Rosfort (n 46) 149. 49 Art 4(1) GDPR. 50 Case C-434/16 Nowak v Data Protection Commissioner [2017] ECR I-994 (ECLI:EU:C:2017:994) para 34. 
51 Joined Cases C-141/12 & C-372/12, YS, M and S v Minister voor Immigratie, Integratie en Asiel [2014] ECR I-2081 (ECLI:EU:C:2013:838), Opinion of AG Sharpston, para 45 (emphasis added). 52 Case C-434/16 Nowak v Data Protection Commissioner [2017] ECR I-994 (ECLI:EU:C:2017:994) para 35. 53 Lee A Bygrave and Luca Tosoni, Commentary of Article 4(1) in Christopher Kuner, Lee A Bygrave, Christopher Docksey (eds), The EU General Data Protection Regulation: A Commentary (OUP, Oxford 2020) 109. 54 Ibid 110. 55 Case C-582/14, Breyer v Bundesrepublik Deutschland [2016] ECR I-779 (ECLI:EU:C:2016:779) para 46. 56 Bygrave and Tosoni (n 53) 111. 57 Purtova takes the view that, in the near future, everything will be or will contain personal data due to the rapid developments in technology. Nadezhda Purtova, ‘The Law of Everything. Broad Concept of Personal Data and Future of EU Data Protection Law’ (2018) 10 (1) Law, Innovation and Technology 40, 74–75 accessed 25 February 2021. 58 See also Michèle Finck and Frank Pallas, ‘They who must not be Identified- Distinguishing Personal from Non-personal Data under the GDPR’ (2020) 10 (1) International Data Privacy Law 11, 14. 59 Matthew Moore, ‘Personalised Ads Delivered by the Billboard that's got its Eye on you’, The Times (London, 18 October 2017) accessed 25 February 2021. 60 See Privacy Policy Piccadilly Lights accessed 25 February 2021. 61 When considering the combination of location, time, posture, age, gender, and mood that could be used to identify the individual, it seems possible to reach a different conclusion. 62 See for example Hella Aglaia accessed 11 June 2020. 63 See accessed 25 February 2021. 65 Note that the software used also claims to be able to estimate the four basic emotions happiness, sadness, surprise, and anger accessed 25 February 2021. 66 Metadata is commonly understood to denote ‘data about data’ or ‘data that defines and describes other data’, for example within ISO standards. Jonathan Furner, ‘Definitions of "Metadata": A Brief Survey of International Standards’ (2019) 71 (6) Journal of the Association for Information Science and Technology 33. 67 Heike Anger, ‘Gesichtsscan im Supermarkt ist unbedenklich’, Handelsblatt (Duesseldorf, 12 June 2017) accessed 25 February 2021. 68 Purtova (n 57) 74–75. 69 Frederik Zuiderveen Borgesius, ‘Singling Out People Without Knowing Their Names – Behavioural Targeting, Pseudonymous Data, and the new Data Protection Regulation’ (2016) 32 Computer Law & Security Review 256, 262. 70 In line with the views of the Irish and Bavarian Supervisory Authorities. Note that these views could be rebutted when considering the combination of all data elements that can be used to identify individuals, see for example, Purtova (n 57). 71 Picard (n 10) 118. 72 Aaron Ben-Ze'Ev, The Subtlety of Emotions (MIT Press, Cambridge 2000) 183. 73 Lench and Capenter (n 45) 1, 142. 74 Stanghellini and Rosfort (n 46) 142. 75 Ibid 149. 76 William S Brown, ‘Technology, Workplace Privacy and Personhood’ (1996) 15 Journal of Business Ethics 1237, 1243. 77 McStay (n 15) 1, 4. 78 Recitals 51, 52, 53 GDPR. 79 Contrary to Clifford’s view that argues this ‘will clearly result in the processing of sensitive personal data’, see Damian Clifford, ‘Citizen-consumers in a personalised Galaxy: Emotion Influenced Decision-making, a True Path to the Dark Side?’ (2017) CiTiP Working Paper 31/2017, 21 accessed 25 February 2021. 
80 Note that Art 29 WP considered voice as biometric data, Art 29 Working Party, ‘Opinion 4/2007 on the concept of personal data’ (WP 136, 20 June 2007) at 8. 81 Art 9 (1) GDPR. 82 Recital 51 GDPR, the same recital states that processing of photographs should not systematically be considered to be processing of special categories of personal data. 83 Art 29 Working Party, ‘Opinion 3/2012 on developments in biometric technologies’ (WP 193, 27 April 2012) at 5. 84 See Mondragon (n 30). 85 Note that art 9 GDPR solely refers to special categories of personal data, but recitals 51 and 53–54 also mention sensitive data. Special categories of personal data are defined as ‘data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person’s sex life or sexual orientation’. 86 Recital 35 GDPR (emphasis added). 87 Art 9(1) GDPR. 88 Proposal for a Regulation of the European Parliament and of the Council concerning the respect for private life and the protection of personal data in electronic communications and repealing Directive 2002/58/EC (Regulation on Privacy and Electronic Communications) Recital 2 accessed 25 February 2021. 89 Namely where emotional data must be considered personal data because the data subject is identified or identifiable. 90 Provided that the proposed ePrivacy Regulation will not be amended with regard to the sensitivity of emotional data. 91 Recital 39 GDPR. 92 Art 29 Working Party, ‘Guidelines on Transparency under Regulation 2016/679’ (WP 260 rev.01, 11 April 2018) at 15. 93 Art 13(1) lit c and art 14(1) lit c GDPR. 94 Picard (n 10) 122. 95 Art 29 Working Party, ‘Guidelines on Transparency under Regulation 2016/679’ (WP 260 rev.01, 11 April 2018) at 15. 96 Sandra Wachter and Brent Mittelstadt, ‘A Right to Reasonable Inferences: Re-Thinking Data Protection Law in the Age of Big Data and AI’ (2019) 2019 (2) Columbia Business Law Review 494, 543 and 545. 97 Art 29 Working Party, ‘Guidelines on transparency under Regulation 2016/679’ (WP 260 rev.01, 11 April 2018) at 15 (emphasis added). 98 Ibid 14. 99 Note that if emotional data is deemed to be collected directly from the data subject (and not inferred), the employer is arguably not required to communicate the category of personal data processed to the candidate, since an explicit obligation to do so is not present in art 13 GDPR. 100 Art 14(1) lit b GDPR. 101 Based on a strict reading of the law. It can also be argued that it will not be sufficient to only indicate the category of personal data (‘emotional data’) in response to an access rights request as this will not enable the data subject to effectively amend, object or restrict the use of their data. This very much depends on local guidance and local case law. 102 Art 5(1) lit a, b and c GDPR. 103 Cécile de Terwangne, Commentary of art 5 in Christopher Kuner, Lee A Bygrave and Christopher Docksey (eds), The EU General Data Protection Regulation: A Commentary (OUP, Oxford 2020) 313. 104 The wording ‘relevant and limited to what is necessary’ in art 5(1) lit c should be seen as a manifestation of the proportionality principle; see also Lee A Bygrave, Data Privacy Law: An International Perspective (OUP, New York 2014) 148. 105 I derive this requirement from the underlying ideas of the transparency and fairness principle. 
106 Which I read as a separate component of the right to access different from the information to be provided according to art 15(1) GDPR. 107 Gabriela Zanfir-Fortuna, Commentary of art 15 in Kuner et al (n 103) 464. 108 Joined Cases C-141/12 & C-372/12, YS, M and S v Minister voor Immigratie, Integratie en Asiel [2014] ECR I-2081 (ECLI:EU:C:2014:2081) paras 58–59. 109 Rechtbank Den Haag, C/09/572633/HA RK 19-295 ECLI:NL:RBDHA:2019:13029, para 4.5. 110 Ibid para 4.5. 111 Wachter and Mittelstadt (n 96) 494, 546. 112 Recital 63 GDPR, see also Custers arguing that algorithms used to infer data might be considered as trade secrets: Bart Custers, ‘Profiling as Inferred Data. Amplifier Effects and Positive Feedback Loops’ in Emre Bayamlıoğlu et al (eds), Being Profiled: Cogitas Ergo Sum (Amsterdam University Press, Amsterdam 2018). 113 Art 5(1) lit d GDPR. 114 Dara Hallinan and Frederik Zuiderveen Borgesius, ‘Opinions can be Incorrect (in our Opinion)! On Data Protection Law’s Accuracy Principle’ (2020) 10 (1) IDPL 1, 9. 115 Art 29 Working Party, ‘Guidelines on the Implementation of the Court of Justice of the European Union Judgment on Google Spain and inc v. Agencia Española de Protección De Datos (AEPD) and Mario Costeja González C-131/12’ (WP 225, 26 November 2014), at 15. accessed 25 February 2021. 116 See accessed 25 February 2021. 117 The authors convincingly outline why this is the case: Hallinan and Borgesius (n 114). 118 Case C-434/16 Nowak v Data Protection Commissioner [2017] ECR I-994 (ECLI:EU:C:2017:994) para 34. 119 Hallinan and Borgesius (n 114) 1, 5. 120 Case C-434/16 Nowak v Data Protection Commissioner [2017] ECR I-994 (ECLI:EU:C:2017:994) para 53. 121 For example, whether or not getting a job based on assessments provided by HireVue as referred to in footnote 30 above. 122 Barrett et al. (n 19) 1; Sara Preto, ‘Emotion-reading Algorithms cannot Predict Intentions Via Facial Expressions’ USC News (Los Angeles, 4 September 2019) accessed 25 February 2021. 123 Barrett et al (n 19) 1, 46. 124 Damian Dupré et al, ‘A Performance Comparison of Eight Commercially Available Automatic Classifiers for Facial Affect Recognition’ (2020) 15 (4) PLoS ONE 1, 10 accessed 25 February 2021. 125 Kate Crawford et al, ‘AI Now Report’ (2019) AI Now Institute 12 accessed 25 February 2021. 126 Lech et al (n 39) 1, 3. 127 Crawford et al (n 125). 128 Ibid. 129 Particularly in the context of important decisions, such as employment or border controls (see Section ‘AC and automated decision-making’). 130 Art 16 GDPR. 131 Hallinan and Borgesius (n 114) 1, 4-5. 132 Jennifer Healey, ‘Physiological Sensing of Emotion’ in Rafael Calvo et al (eds), The Oxford Handbook of Affective Computing (OUP, New York 2015), 213, 214. 133 Wachter and Mittelstadt (n 96) 550–51, see also 549, 590. 134 Ibid 550, 590. 135 Case C-434/16 Nowak v Data Protection Commissioner [2017] ECR I-994 (ECLI:EU:C:2017:994); Case C-553/07 College van burgemeester en wethouders van Rotterdam v MEE Rijkeboer [2009] ECR I-03889 (ECLI:EU:C:2009:293); Case C-28/08 P, European Commission v Bavarian Lager [2010] ECR I-06055 (ECLI:EU:C:2010:378); Joined Cases C-141/12 & C-372/12, YS, M and S v Minister voor Immigratie, Integratie en Asiel [2014] ECR I-2081 (ECLI:EU:C:2014:2081). 136 Case C-434/16 Nowak v Data Protection Commissioner [2017] ECR I-994 (ECLI:EU:C:2017:994) para 57; Case C-553/07 College van burgemeester en wethouders van Rotterdam v MEE Rijkeboer [2009] ECR I-03889 (ECLI:EU:C:2009:293) para 49; Wachter and Mittelstadt (n 96) 494, 537. 
137 The term opinion is used because emotional data could constitute an opinion and case law indicates that inferred data, including opinions, cannot be rectified. According to the CJEU, the right to rectification is not intended to apply to content of subjective opinions and assessments; see last reference in previous footnote, 590. 138 Case C-434/16 Nowak v Data Protection Commissioner [2017] ECR I-994 (ECLI:EU:C:2017:994) para 54. 139 Joined Cases C-141/12 & C-372/12, YS, M and S v Minister voor Immigratie, Integratie en Asiel [2014] ECR I-2081(ECLI:EU:C:2014:2081) para 45. 140 Case C-434/16 Nowak v Data Protection Commissioner [2017] ECR I-994 (ECLI:EU:C:2017:994) para 53. 141 Eg a company that uses HireVue as discussed in footnote 30 above. 142 Art 29 Working Party ‘Guidelines on Automated Individual Decision-Making and Profiling for the Purposes of Regulation 2016/679’ (WP251rev.01, 6 February 2018) at 8–9. 143 Ibid 17–18. 144 Wachter and Mittelstadt (n 96) 494, 548. 145 Custers (n 112) 115. 146 Although not indicated by case law, I derive the concept of ‘ground truth’ from the normative aspect of the accuracy principle that aims to protect data subjects from being subject to misrepresentation. See Hallinan and Borgesius (n 114) 1, 10. 147 Wachter and Mittelstadt (n 96) 494, 549. 148 Lee A Bygrave, ‘Minding the Machine v2.0: The EU General Data Protection Regulation and Automated Decision Making’ (2019) University of Oslo Faculty of Law Legal Studies Research Paper Series No. 2019-01, at 1 and 4 accessed 25 February 2021. 149 Ibid 7. 150 Nathan Mondragon, Clemens Aicholzer, Kiki Leutner, ‘The Next Generation of Assessments’ (HireVue 2019) accessed 25 February 2021. 151 WP guidance refers to ‘solely’ automated decision making when there is no human involvement in the decision process. However, if someone routinely applies automated decisions without any influence on the result, this would still be considered ‘solely’ automated. See art 29 Working Party ‘Guidelines on Automated Individual Decision-Making and Profiling for the Purposes of Regulation 2016/679’, (WP251rev.01, 6 February 2018) 20–21. 152 Tim Lewis, ‘AI can read your Emotions. Should it?’, The Guardian (London 17 August 2019) accessed 25 February 2021. 153 Bygrave (n 148) 12. 154 See accessed 25 February 2021. 155 See accessed 25 February 2021. 156 Assuming that the border guard does not blindly follow the system and ‘rubber-stamp’ its decision. 157 See accessed 25 February 2021. 158 Lewis (n 152). 159 Melvin Kranzberg, ‘Technology and History: Kranzberg's Laws’ (1995) 15 (1) Bulletin of Science, Technology & Society 5. 160 David Collingridge, The Social Control of Technology (St Martin’s Press, New York 1980). 161 Bygrave (n 148). 162 See also Andrew McStay, ‘The Right to Privacy in the Age of Emotion AI’ (2018) accessed 25 February 2021. 163 A study provided evidence that facial recognition software used for emotion detection interpreted black faces consistently as angrier than white faces. Lauren Rhue, ‘Racial Influence on Automated Perceptions of Emotions’ (2018) Wake Forest University 1 accessed 25 February 2021. 164 See for example, Ryan Calo who asks ‘what if an advertiser had a way to determine your present emotional state?’ Ryan Calo, ‘Digital Market Manipulation’ (2014) 82 (4) The George Washington Law Review 995, 996. 64 Barbara Woolsey, ‘Emotion Recognition Technology could Transform Retail Advertising’, Handelsblatt (Duesseldorf, 24 January 2018) accessed 25 February 2021. 
This article forms part of the author’s external PhD research at eLaw – Center for Law and Digital Technologies, Leiden University in the Netherlands. The author would like to thank Joke Bodewits (Hogan Lovells International LLP Amsterdam), Professor Bart Custers, Professor Gerrit-Jan Zwenne, and Santiago Ramírez López for their valuable comments and inputs.
© The Author(s) 2021. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com
This article is published and distributed under the terms of the Oxford University Press, Standard Journals Publication Model (https://academic.oup.com/journals/pages/open_access/funder_policies/chorus/standard_publication_model)
TI - Fit for purpose? Affective Computing meets EU data protection law
JF - International Data Privacy Law
DO - 10.1093/idpl/ipab008
DA - 2021-03-12
UR - https://www.deepdyve.com/lp/oxford-university-press/fit-for-purpose-affective-computing-meets-eu-data-protection-law-pkoY4JT17F
SP - 1
EP - 1
VL - Advance Article
IS -
DP - DeepDyve
ER -