Article 22
Automated individual decision-making, including profiling

Recitals of the Regulation related to Article 22

(60) The principles of fair and transparent processing require that the data subject be informed of the existence of the processing operation and its purposes. The controller should provide the data subject with any further information necessary to ensure fair and transparent processing taking into account the specific circumstances and context in which the personal data are processed. Furthermore, the data subject should be informed of the existence of profiling and the consequences of such profiling. Where the personal data are collected from the data subject, the data subject should also be informed whether he or she is obliged to provide the personal data and of the consequences, where he or she does not provide such data. That information may be provided in combination with standardised icons in order to give in an easily visible, intelligible and clearly legible manner, a meaningful overview of the intended processing. Where the icons are presented electronically, they should be machine-readable.

(71) The data subject should have the right not to be subject to a decision, which may include a measure, evaluating personal aspects relating to him or her which is based solely on automated processing and which produces legal effects concerning him or her or similarly significantly affects him or her, such as automatic refusal of an online credit application or e-recruiting practices without any human intervention. Such processing includes ‘profiling’ that consists of any form of automated processing of personal data evaluating the personal aspects relating to a natural person, in particular to analyse or predict aspects concerning the data subject's performance at work, economic situation, health, personal preferences or interests, reliability or behaviour, location or movements, where it produces legal effects concerning him or her or similarly significantly affects him or her. However, decision-making based on such processing, including profiling, should be allowed where expressly authorised by Union or Member State law to which the controller is subject, including for fraud and tax-evasion monitoring and prevention purposes conducted in accordance with the regulations, standards and recommendations of Union institutions or national oversight bodies and to ensure the security and reliability of a service provided by the controller, or necessary for the entering or performance of a contract between the data subject and a controller, or when the data subject has given his or her explicit consent. In any case, such processing should be subject to suitable safeguards, which should include specific information to the data subject and the right to obtain human intervention, to express his or her point of view, to obtain an explanation of the decision reached after such assessment and to challenge the decision. Such measure should not concern a child.

In order to ensure fair and transparent processing in respect of the data subject, taking into account the specific circumstances and context in which the personal data are processed, the controller should use appropriate mathematical or statistical procedures for the profiling, implement technical and organisational measures appropriate to ensure, in particular, that factors which result in inaccuracies in personal data are corrected and the risk of errors is minimised, secure personal data in a manner that takes account of the potential risks involved for the interests and rights of the data subject and that prevents, inter alia, discriminatory effects on natural persons on the basis of racial or ethnic origin, political opinion, religion or beliefs, trade union membership, genetic or health status or sexual orientation, or that result in measures having such an effect. Automated decision-making and profiling based on special categories of personal data should be allowed only under specific conditions.

(72) Profiling is subject to the rules of this Regulation governing the processing of personal data, such as the legal grounds for processing or data protection principles. The European Data Protection Board established by this Regulation (the ‘Board’) should be able to issue guidance in that context.

Recitals of the Directive related to Article 22

(41) Whereas any person must be able to exercise the right of access to data relating to him which are being processed, in order to verify in particular the accuracy of the data and the lawfulness of the processing; whereas, for the same reasons, every data subject must also have the right to know the logic involved in the automatic processing of data concerning him, at least in the case of the automated decisions referred to in Article 15 (1); whereas this right must not adversely affect trade secrets or intellectual property and in particular the copyright protecting the software; whereas these considerations must not, however, result in the data subject being refused all information;

The GDPR

Article 22 of the Regulation somewhat clarifies the former provision of the Directive (Article 15).

Like the Directive, it provides that the data subject has the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her. The article expressly covers profiling, namely any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person's performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements (see the definition in Article 4(4) of the Regulation).

However, the provision extends the possible exceptions to the prohibition. In addition to the case where the decision is necessary for entering into, or for the performance of, a contract between the data subject and the controller, the article provides an exception where the decision is authorised by Union or Member State law to which the controller is subject; finally, the prohibition may be derogated from where the decision is based on the explicit consent of the data subject.

In all three cases, appropriate measures safeguarding the rights, freedoms and legitimate interests of the data subject must be provided, either by the law which permits such a decision or by the controller, who must at least grant the data subject the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision.

The prohibition is, however, strengthened for decisions based on the processing of sensitive data within the meaning of Article 9(1) of the Regulation. Such decisions are prohibited unless the data subject has given his or her explicit consent within the meaning of Article 9(2)(a), provided that Union or Member State law does not exclude such consent, or unless the processing is necessary for reasons of substantial public interest on the basis of Union or Member State law (see Article 9(2)(g)), which must provide appropriate and specific safeguards for the legitimate interests of the data subject. In any case, appropriate measures to protect the rights, freedoms and legitimate interests of the data subject must be implemented.

The Directive

Article 15 of the Directive already recognised the right of individuals not to be subject to a decision which produces legal effects concerning them or significantly affects them and which is based solely on automated processing of data intended to evaluate certain personal aspects relating to them, such as their performance at work, creditworthiness, reliability or conduct. However, exceptions were provided, under conditions, where the decision was taken in the course of entering into or performing a contract, or was authorised by a law laying down safeguards for the legitimate interests of the data subject.

Potential issues

What is particularly striking on reading the new provision is that, while it expands the exceptions to the prohibition, it relies on appropriate and/or specific safeguards that remain unspecified. Recitals 71 and 72 certainly offer examples (specific information to the data subject, the right to obtain human intervention, measures to minimise the risk of errors, the obligation to carry out a data protection impact assessment in certain cases, etc.), but the margin of manoeuvre left to Member States and controllers risks being too broad to guarantee these safeguards in practice and, in any event, to harmonise them across the Union.

Summary

European Union


European Data Protection Board (EDPB)

Guidelines on the targeting of social media users - 8/2020 (13 April 2021)

A significant development in the online environment over the past decade has been the rise of social media. More and more individuals use social media to stay in touch with family and friends, to engage in professional networking or to connect around shared interests and ideas. For the purposes of these guidelines, social media are understood as online platforms that enable the development of networks and communities of users, among which information and content is shared. Key characteristics of social media include the ability for individuals to register in order to create “accounts” or “profiles” for themselves, to interact with one another by sharing user-generated or other content and to develop connections and networks with other users.

As part of their business model, many social media providers offer targeting services. Targeting services make it possible for natural or legal persons (“targeters”) to communicate specific messages to the users of social media in order to advance commercial, political, or other interests.

 A distinguishing characteristic of targeting is the perceived fit between the person or group being targeted and the message that is being delivered. The underlying assumption is that the better the fit, the higher the reception rate (conversion) and thus the more effective the targeting campaign (return on investment). Mechanisms to target social media users have increased in sophistication over time. Organisations now have the ability to target individuals on the basis of a wide range of criteria. Such criteria may have been developed on the basis of personal data which users have actively provided or shared, such as their relationship status. Increasingly, however, targeting criteria are also developed on the basis of personal data which has been observed or inferred, either by the social media provider or by third parties, and collected (aggregated) by the platform or by other actors (e.g. data brokers) to support ad-targeting options. In other words, the targeting of social media users involves not just the act of “selecting” the individuals or groups of individuals that are the intended recipients of a particular message (the ‘target audience’), but rather it involves an entire process carried out by a set of stakeholders which results in the delivery of specific messages to individuals with social media accounts.

The combination and analysis of data originating from different sources, together with the potentially sensitive nature of personal data processed in the context of social media, creates risks to the fundamental rights and freedoms of individuals. From a data protection perspective, many risks relate to the possible lack of transparency and user control. For the individuals concerned, the underlying processing of personal data which results in the delivery of a targeted message is often opaque. Moreover, it may involve unanticipated or undesired uses of personal data, which raise questions not only concerning data protection law, but also in relation to other fundamental rights and freedoms. Recently, social media targeting has gained increased public interest and regulatory scrutiny in the context of democratic decision making and electoral processes.



Article 29 Working Party

Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679 - wp251rev.01 (6 February 2018)

The General Data Protection Regulation (the GDPR), specifically addresses profiling and automated individual decision-making, including profiling.

Profiling and automated decision-making are used in an increasing number of sectors, both private and public. Banking and finance, healthcare, taxation, insurance, marketing and advertising are just a few examples of the fields where profiling is being carried out more regularly to aid decision-making.

Advances in technology and the capabilities of big data analytics, artificial intelligence and machine learning have made it easier to create profiles and make automated decisions with the potential to significantly impact individuals’ rights and freedoms.

The widespread availability of personal data on the internet and from Internet of Things (IoT) devices, and the ability to find correlations and create links, can allow aspects of an individual’s personality or behaviour, interests and habits to be determined, analysed and predicted.

Profiling and automated decision-making can be useful for individuals and organisations, delivering benefits such as:

  • increased efficiencies; and
  • resource savings.

They have many commercial applications, for example, they can be used to better segment markets and tailor services and products to align with individual needs. Medicine, education, healthcare and transportation can also all benefit from these processes.

However, profiling and automated decision-making can pose significant risks for individuals’ rights and freedoms which require appropriate safeguards.

These processes can be opaque. Individuals might not know that they are being profiled or understand what is involved.

Profiling can perpetuate existing stereotypes and social segregation. It can also lock a person into a specific category and restrict them to their suggested preferences. This can undermine their freedom to choose, for example, certain products or services such as books, music or newsfeeds. In some cases, profiling can lead to inaccurate predictions. In other cases it can lead to denial of services and goods and unjustified discrimination.

The GDPR introduces new provisions to address the risks arising from profiling and automated decision-making, notably, but not limited to, privacy. The purpose of these guidelines is to clarify those provisions.

This document covers:

  • Definitions of profiling and automated decision-making and the GDPR approach to these in general – Chapter II
  • General provisions on profiling and automated decision-making – Chapter III
  • Specific provisions on solely automated decision-making defined in Article 22 – Chapter IV
  • Children and profiling – Chapter V
  • Data protection impact assessments and data protection officers – Chapter VI

The Annexes provide best practice recommendations, building on the experience gained in EU Member States.

The Article 29 Data Protection Working Party (WP29) will monitor the implementation of these guidelines and may complement them with further details as appropriate.



Summary

European Union


CJEU caselaw

C-634/21, OQ v. Land Hessen (7 December 2023)

Article 22(1) of Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation)

must be interpreted as meaning that the automated establishment, by a credit information agency, of a probability value based on personal data relating to a person and concerning his or her ability to meet payment commitments in the future constitutes ‘automated individual decision-making’ within the meaning of that provision, where a third party, to which that probability value is transmitted, draws strongly on that probability value to establish, implement or terminate a contractual relationship with that person.


Regulation

Art. 22

1.   The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.

2.   Paragraph 1 shall not apply if the decision:

(a) is necessary for entering into, or performance of, a contract between the data subject and a data controller;

(b) is authorised by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject's rights and freedoms and legitimate interests; or

(c) is based on the data subject's explicit consent.

3.   In the cases referred to in points (a) and (c) of paragraph 2, the data controller shall implement suitable measures to safeguard the data subject's rights and freedoms and legitimate interests, at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision.

4.   Decisions referred to in paragraph 2 shall not be based on special categories of personal data referred to in Article 9(1), unless point (a) or (g) of Article 9(2) applies and suitable measures to safeguard the data subject's rights and freedoms and legitimate interests are in place.

1st proposal

Art. 20

1. Every natural person shall have the right not to be subject to a measure which produces legal effects concerning this natural person or significantly affects this natural person, and which is based solely on automated processing intended to evaluate certain personal aspects relating to this natural person or to analyse or predict in particular the natural person's performance at work, economic situation, location, health, personal preferences, reliability or behaviour.

2. Subject to the other provisions of this Regulation, a person may be subjected to a measure of the kind referred to in paragraph 1 only if the processing:

(a)     is carried out in the course of the entering into, or performance of, a contract, where the request for the entering into or the performance of the contract, lodged by the data subject, has been satisfied or where suitable measures to safeguard the data subject's legitimate interests have been adduced, such as the right to obtain human intervention; or

(b)     is expressly authorized by a Union or Member State law which also lays down suitable measures to safeguard the data subject's legitimate interests; or

(c)     is based on the data subject's consent, subject to the conditions laid down in Article 7 and to suitable safeguards.

3. Automated processing of personal data intended to evaluate certain personal aspects relating to a natural person shall not be based solely on the special categories of personal data referred to in Article 9.

4. In the cases referred to in paragraph 2, the information to be provided by the controller under Article 14 shall include information as to the existence of processing for a measure of the kind referred to in paragraph 1 and the envisaged effects of such processing on the data subject.

5. The Commission shall be empowered to adopt delegated acts in accordance with Article 86 for the purpose of further specifying the criteria and conditions for suitable measures to safeguard the data subject's legitimate interests referred to in paragraph 2.

 

2nd proposal

Art. 4

For the purposes of this Regulation:

(…)

(12a) 'profiling' means any form of automated processing of personal data consisting of using those data to evaluate personal aspects relating to a natural person, in particular to analyse and predict aspects concerning performance at work, economic situation, health, personal preferences, or interests, reliability or behaviour, location or movements;

Art. 20

1. The data subject shall have the right not to be subject to a decision (...) based solely on automated processing, including profiling, which produces legal effects concerning him or her or significantly affects him or her.

1a. Paragraph 1 shall not apply if the decision: (...)

(a) is necessary for entering into, or performance of, a contract between the data subject and a data controller (...) ; or

(b) is (...) authorized by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject's rights and freedoms and legitimate interests; or

(c) is based on the data subject's explicit consent (...).

1b. In cases referred to in paragraph 1a (a) and (c) the data controller shall implement suitable measures to safeguard the data subject's rights and freedoms and legitimate interests, at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision.

2. (...)

3. Decisions referred to in paragraph 1a shall not (...) be based on special categories of personal data referred to in Article 9(1), unless points (a) or (g) of Article 9(2) apply and suitable measures to safeguard the data subject's rights and freedoms and legitimate interests are in place.

4. (...)

5. (...)

Directive

Art. 15

1. Member States shall grant the right to every person not to be subject to a decision which produces legal effects concerning him or significantly affects him and which is based solely on automated processing of data intended to evaluate certain personal aspects relating to him, such as his performance at work, creditworthiness, reliability, conduct, etc.

2. Subject to the other Articles of this Directive, Member States shall provide that a person may be subjected to a decision of the kind referred to in paragraph 1 if that decision:

(a) is taken in the course of the entering into or performance of a contract, provided the request for the entering into or the performance of the contract, lodged by the data subject, has been satisfied or that there are suitable measures to safeguard his legitimate interests, such as arrangements allowing him to put his point of view; or

(b) is authorized by a law which also lays down measures to safeguard the data subject's legitimate interests.

Article 11
(6) No decision of the controller or processor which interferes with the legal or legally protected interests of the data subject may be issued or made solely on the basis of automated processing of personal data, without verification. This shall not apply where such a decision is made in favour of the data subject and at his or her request.
