Editorial of March 2024 – Official Blog of UNIO – Cyber Tech

By Alessandra Silveira

On inferred personal data and the difficulties of EU law in dealing with this matter

The right not to be subject to automated decisions was considered for the first time before the Court of Justice of the European Union (CJEU) in the recent SCHUFA judgment. Article 22 GDPR (on individual decisions based solely on automated processing, including profiling) has always raised many doubts among legal scholars:[1] i) what would a decision taken “solely” on the basis of automated processing be?; ii) does this Article provide for a right or, rather, a general prohibition whose application does not require the party concerned to actively invoke a right?; iii) to what extent does such an automated decision produce legal effects or similarly significantly affect the data subject?; iv) do the provisions of Article 22 GDPR apply only where there is no relevant human intervention in the decision-making process?; v) if a human being examines and weighs other factors when making the final decision, will it not be made “solely” on the basis of the automated processing? [and, in this situation, will the prohibition in Article 22(1) GDPR not apply]?

To these doubts a German court has added a few more. SCHUFA is a private company under German law which provides its contractual partners with information on the creditworthiness of third parties, in particular consumers. To that end, it establishes a prognosis of the probability of a future behaviour of a person (‘score’), such as the repayment of a loan, based on certain characteristics of that person, on the basis of mathematical and statistical procedures. The establishment of scores (‘scoring’) rests on the assumption that, by assigning a person to a group of other persons with comparable characteristics who have behaved in a certain way, similar behaviour can be predicted.[2]
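To make the mechanics of such scoring more concrete, here is a minimal, purely illustrative sketch in Python. SCHUFA’s actual method is a trade secret, so the features, the data, and the model choice (a simple logistic regression) are all invented assumptions rather than a description of any real scoring system.

```python
# Purely illustrative: SCHUFA's actual scoring method is a trade secret.
# A toy "scoring" model: a statistical model is fitted on the past behaviour
# of persons with comparable characteristics, then used to predict the
# repayment probability ("score") of a new applicant.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented historical data: [age, annual_income_k_eur, open_credit_lines]
X_train = np.array([
    [25, 28, 1],
    [40, 55, 2],
    [35, 42, 4],
    [52, 75, 1],
    [29, 31, 5],
    [47, 60, 2],
])
y_train = np.array([1, 1, 0, 1, 0, 1])  # 1 = repaid on time, 0 = defaulted (invented)

model = LogisticRegression().fit(X_train, y_train)

# The "score": a probability value inferred from the behaviour of the group
# the model associates the applicant with - not data the applicant provided.
applicant = np.array([[33, 38, 3]])
score = model.predict_proba(applicant)[0, 1]
print(f"Predicted repayment probability: {score:.2f}")
```

The point of the sketch is that the output is an inference drawn from other people’s behaviour, which is precisely why the judgment treats it as personal data deserving protection.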

SCHUFA provided a financial entity with a score for the applicant OQ, which served as the basis for refusing to grant the credit for which the latter had applied. That citizen subsequently requested SCHUFA to erase the entry concerning her and to grant her access to the corresponding data. However, SCHUFA merely informed her of the relevant score and, in general terms, of the principles underlying the method of calculating the score, without informing her of the specific data included in that calculation, or of the relevance attributed to them in that context, arguing that the method of calculation was a trade secret.

However, according to the referring court, it is ultimately the credit score established by credit information agencies that actually decides whether and how a financial entity/bank enters into a contract with the data subject. The referring court proceeds on the assumption that the establishment of a score by a credit information agency does not merely serve to prepare that bank’s decision, but constitutes an independent “decision” within the meaning of Article 22(1) of the GDPR.[3]

As we have highlighted on this blog,[4] this case law is particularly relevant because profiling is often used to make predictions about individuals. It involves collecting information about a person and assessing their characteristics or patterns of behaviour in order to place them in a particular category or group and to draw on that inference or prediction – whether of their ability to perform a task, their interests or presumed behaviour, etc. To this extent, such automated inferences demand protection as inferred personal data, since they also make it possible to identify someone by association of ideas, characteristics, or contents. The crux of the matter is that people are increasingly losing control over such automated inferences and over how they are perceived and evaluated by others.

In the SCHUFA case the CJEU was called upon to clarify the scope of the regulatory powers that certain provisions of the GDPR bestow on the national legislature, namely the exception to the prohibition in Article 22(2)(b) GDPR – according to which that prohibition does not apply if the decision is authorised by European Union or Member State law to which the controller is subject. This is relevant because, if Article 22(1) GDPR were to be interpreted as meaning that the establishment of a score by a credit information agency is an independent decision within the meaning of Article 22(1) of the GDPR, that activity would be subject to the prohibition of automated individual decisions. Consequently, it would require a legal basis under Member State law within the meaning of Article 22(2)(b) of the GDPR.

So, what is new about this ruling? Firstly, the CJEU ruled that Article 22(1) of the GDPR provides for a prohibition tout court whose violation does not have to be invoked individually by the data subject. In other words, this provision rules out the possibility of the data subject being the object of a decision taken solely on the basis of automated processing, including profiling, and clarifies that active behaviour by the data subject is not necessary to make this prohibition effective.[5] In any case, the prohibition will not be applicable when the conditions established under Article 22(2) and Recital 71 of the GDPR apply. That is to say, the adoption of a decision based solely on automated processing is authorised only in the cases referred to in Article 22(2), namely when: i) it is necessary for entering into, or performance of, a contract between the data subject and a data controller [paragraph a)]; ii) it is authorised by Union or Member State law to which the controller is subject [paragraph b)]; or iii) it is based on the data subject’s explicit consent [paragraph c)].[6]

Secondly, the CJEU clarified the extent to which Member State law is allowed to establish exceptions to the prohibition under Article 22(2)(b) of the GDPR. According to the CJEU, it follows from the very wording of this provision that national law authorising the adoption of an automated individual decision must provide for appropriate measures to safeguard the rights and freedoms and the legitimate interests of the data subject. In light of Recital 71 of the GDPR, such measures should include appropriate mathematical or statistical procedures for the profiling, implementing technical and organisational measures appropriate to ensure, in particular, that factors which result in inaccuracies in personal data are corrected and the risk of errors is minimised, and securing personal data in a manner that takes account of the potential risks involved for the interests and rights of the data subject and that prevents, inter alia, discriminatory effects on natural persons. The SCHUFA case also made clear that the data subject has the right to i) obtain human intervention; ii) express his or her point of view; and iii) challenge the decision. The CJEU has thus dispelled any doubts as to whether the national legislator is bound by the rights provided for in Article 22(3) of the GDPR, despite the somewhat equivocal wording of this provision, which textually refers only to Article 22(2)(a) and (c), seemingly excluding Member States from that obligation.[7] The CJEU also added that Member States may not adopt, under Article 22(2)(b) of the GDPR, rules that authorise profiling in violation of the principles and legal bases imposed by Articles 5 and 6 of the GDPR, as interpreted by CJEU case law.[8]

Lastly, the CJEU acknowledged the broad scope of the concept of “decision” within the meaning of the GDPR, ruling that a profile may in itself be an exclusively automated decision within the meaning of Article 22 of the GDPR. The CJEU explained that there would be a risk of circumventing Article 22 of the GDPR and, consequently, a lacuna in legal protection if a restrictive interpretation of that provision were adopted, according to which the establishment of the probability value must be considered only a preparatory act and only the act adopted by the third party can, where appropriate, be classified as a “decision” within the meaning of Article 22(1).[9] Indeed, in that scenario, the establishment of a probability value such as that at issue in the main proceedings would escape the specific requirements provided for in Article 22(2) to (4) of the GDPR, even though that procedure is based on automated processing and produces effects significantly affecting the data subject, to the extent that the action of the third party to whom that probability value is transmitted draws strongly on it. This would also result in the data subject not being able to assert, against the credit information agency which establishes the probability value concerning him or her, his or her right of access to the specific information referred to in Article 15(1)(h) of the GDPR, in the absence of automated decision-making by that company. Even assuming that the act adopted by the third party falls within the scope of Article 22(1) insofar as it fulfils the conditions for application of that provision, that third party would not be able to provide that specific information because it generally does not have it.[10]

In short, the fact that the determination of a probability value is covered by Article 22(1) of the GDPR results in its prohibition, unless one of the exceptions set out in Article 22(2) of the GDPR applies – including authorisation by the law of the Member State, a possibility which the CJEU has interpreted restrictively – and the specific requirements set out in Article 22(3) and (4) of the GDPR are complied with.

However, the CJEU’s decision in SCHUFA still leaves many questions without a clear response. Considering the specific request for a preliminary ruling, the CJEU answered that Article 22(1) of the GDPR must be interpreted as meaning that the automated establishment, by a credit information agency, of a probability value based on personal data relating to a person and concerning his or her ability to meet payment commitments in the future constitutes “automated individual decision-making” within the meaning of that provision, where a third party, to which that probability value is transmitted, draws strongly on that probability value to establish, implement or terminate a contractual relationship with that person (our italics).[11]

Despite the fact that the CJEU’s answer follows the specific wording of the question referred for a preliminary ruling – as written by the national judge, who is the “master” of the referral – the question remains as to the extent of the CJEU’s answer. Did the CJEU perhaps admit that profiling is, in itself, an exclusively automated decision – and, in principle, prohibited – but only when the probability value is decisive for the decision on the contractual relationship? Would this not confirm the idea, rejected by the CJEU in paragraph 61 of the SCHUFA judgment, that the determination of the probability value would be a mere preparatory act? And if the probability value is not decisive for the decision on the contractual relationship, does the prohibition in Article 22 of the GDPR then not apply?

As we have previously argued on this blog, the problem should be seen as profiling itself, regardless of whether or not it is decisive for the decision of a third party. When profiling produces legal effects or similarly significantly affects a data subject, it should be regarded as an automated decision according to Article 22 of the GDPR. The purpose of Article 22 of the GDPR is to protect individuals against the specific risks to their rights and freedoms arising from the automated processing of personal data, including profiling – as the CJEU explains in paragraph 57 of the judgment in question. And that processing involves, as is clear from Recital 71 of the GDPR, the evaluation of personal aspects relating to the natural person affected by that processing, in particular the analysis and prediction of aspects concerning that person’s performance at work, economic situation, health, personal preferences or interests, reliability or behaviour, location or movements – as the CJEU rightly explains in paragraph 58 of the judgment in question.

It is important to remember that profiling always includes inferences and predictions about the individual, regardless of whether a third party then applies automated individual decisions based on that profiling. To create a profile it is necessary to go through three distinct phases: i) data collection; ii) automated analysis to identify correlations; and iii) applying the correlations to an individual to identify present or future behavioural characteristics (a schematic sketch follows below). If automated individual decisions based on profiling were then taken, these would also be subject to the GDPR – whether exclusively automated or not. That is, profiling is not limited to the mere categorisation of the individual, but also includes inferences and predictions about the person. However, the effective application of the GDPR to inferred data faces several obstacles. This has to do with the fact that the GDPR was designed for data provided directly by the data subject – and not for data inferred by digital technologies such as AI systems. This is the challenge behind this judgment.
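The three phases can be illustrated schematically. The following Python sketch is hypothetical: the attributes, the data, and the behavioural trait being inferred (risk of abandoning a service) are invented for illustration, but the structure mirrors the phases described above – collection, correlation analysis, and application to an individual.

```python
# Schematic sketch of the three profiling phases; all data and thresholds
# are invented for illustration.
import numpy as np

# i) Data collection: observed attributes of past individuals (invented).
#    Columns: hours_active_per_day, purchases_per_month
observations = np.array([
    [2.0, 1], [6.5, 9], [1.5, 0], [7.0, 12], [5.5, 8], [2.5, 2],
])
churned = np.array([1, 0, 1, 0, 0, 1])  # 1 = stopped using the service

# ii) Automated analysis: identify a correlation between an attribute
#     and the behaviour of interest across the collected group.
corr = np.corrcoef(observations[:, 0], churned)[0, 1]
print(f"Correlation between daily activity and churn: {corr:.2f}")

# iii) Application: project the group-level correlation onto a new
#      individual to infer a present or future behavioural trait.
new_individual_activity = 2.2  # hours per day (invented)
inferred_risk = "high" if new_individual_activity < 4.0 else "low"
print(f"Inferred churn risk for the individual: {inferred_risk}")
```

Note that the inference in phase iii) is produced about the individual without any input from them – precisely the kind of inferred personal data over which, as argued above, people are losing control.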


[1] See Alessandra Silveira, Profiling and cybersecurity: a perspective from fundamental rights’ protection in the EU, in Francisco Andrade/Joana Abreu/Pedro Freitas (eds.), “Legal developments on cybersecurity and related fields”, Springer International Publishing, Cham/Switzerland, 2024.

[2] See Judgment SCHUFA, paragraph 14.

[3] See Request for a preliminary ruling of 1 October 2021, Case C-634/21, paragraph 23.

[4] See Alessandra Silveira, Finally, the ECJ is interpreting Article 22 GDPR (on individual decisions based solely on automated processing, including profiling),

[5] See Judgment SCHUFA, paragraph 52.

[6] See Judgment SCHUFA, paragraph 53.

[7] See Judgment SCHUFA, paragraphs 65 and 66.

[8] See Judgment SCHUFA, paragraph 68. See also the ECJ decision in Joined Cases C‑26/22 and C‑64/22.

[9] See Judgment SCHUFA, paragraph 61.

[10] See Judgment SCHUFA, paragraph 63.

[11] See Judgment SCHUFA, paragraph 73.

Image credit: Photo by Pixabay on Pexels.com.
