I. Background
Schufa-Holding GmbH (“Schufa”), a private credit agency, provides its contractual partners with information on the creditworthiness of third parties and, for this purpose, creates so-called score values. To determine such a value, mathematical-statistical methods are used to predict the probability of a person’s future behavior, such as the repayment of a loan, on the basis of certain characteristics. These characteristics, both positive (e.g., the contractual fulfillment of a claim) and negative (e.g., late payment), are reported to Schufa by its contractual partners about their customers.
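Schufa’s actual calculation method is confidential and is not part of the file; purely as an illustration of what such a mathematical-statistical prediction can look like, the following sketch derives a probability of non-payment from a few weighted characteristics in the style of a logistic regression and maps it to a score value. All feature names, weights and the score scale are invented for the example and are not taken from Schufa’s model.

```python
import math

# Hypothetical feature weights of a logistic-regression-style scoring model.
# Positive characteristics lower the predicted default risk, negative ones raise it.
WEIGHTS = {
    "contracts_fulfilled": -0.8,  # e.g. claims settled in accordance with the contract
    "late_payments": 1.5,         # e.g. reported payment defaults
    "credit_inquiries": 0.3,      # e.g. recent applications for credit
}
BIAS = -2.0


def default_probability(features: dict) -> float:
    """Predicted probability of future non-payment (illustrative only)."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))


def score_value(features: dict) -> int:
    """Map the predicted probability to an invented score scale from 0 to 1000."""
    return round((1.0 - default_probability(features)) * 1000)


# A customer with a clean record receives a higher score than one with reported defaults.
print(score_value({"contracts_fulfilled": 3, "late_payments": 0, "credit_inquiries": 1}))
print(score_value({"contracts_fulfilled": 1, "late_payments": 2, "credit_inquiries": 4}))
```

The sketch also illustrates the distinction that becomes central below: the score value is a statistical prediction communicated to a third party, not yet the credit decision itself.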
II. Facts of Case C-634/21
The reference for a preliminary ruling in Case C-634/21 concerns a dispute between a citizen (the plaintiff) and the State of Hesse, represented by the Hessian Commissioner for Data Protection and Freedom of Information (“HBDI”) (VG Wiesbaden – 6 K 788/20.WI). Schufa supports the HBDI as an intervener. In line with its activities described above, Schufa provided a credit institution with a score value for the plaintiff, which served as the basis for denying the loan she had applied for. The plaintiff requested that Schufa delete the entry in question and grant her access to the corresponding data. Schufa, however, merely informed her of the score value and, in general terms, of the principles underlying its calculation method. It did not tell her which specific information had been included in the calculation or what weight had been attached to it, explaining that the calculation method was subject to commercial confidentiality.
The questions which the Wiesbaden Administrative Court referred to the ECJ for a preliminary ruling are as follows:
1. Is Art. 22(1) GDPR to be interpreted as meaning that the automated generation of a probability value concerning the ability of a data subject to service a loan in the future already constitutes a decision based exclusively on automated processing – including profiling – which produces legal effects vis-à-vis the data subject or similarly significantly affects him or her, if that value, determined by means of personal data relating to the data subject, is communicated by the controller to a third-party controller and that third party takes that value as a material basis for its decision on the establishment, performance or termination of a contractual relationship with the data subject?
2. If the answer to the first question referred for a preliminary ruling is in the negative, must Article 6(1) and Article 22 of the GDPR be interpreted as precluding a national rule under which the use of a probability value relating to certain future conduct of a natural person for the purpose of deciding on the establishment, performance or termination of a contractual relationship with that person (scoring) is permitted only subject to certain additional conditions?
On March 16, 2023, Advocate General Priit Pikamäe presented his Opinion on the assessment of the above questions, parts of which leave one somewhat puzzled.
III. The first question referred for a preliminary ruling
Art. 22 GDPR
Automated individual decision-making, including profiling
(1) The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.
(2) Paragraph 1 shall not apply if the decision
(a) is necessary for the conclusion or performance of a contract between the data subject and the controller,
(b) is permitted by Union or Member State law to which the controller is subject and that law contains suitable measures to safeguard the rights and freedoms and legitimate interests of the data subject; or
(c) is carried out with the explicit consent of the data subject.
(3) In the cases referred to in paragraph 2(a) and (c), the controller shall take reasonable measures to safeguard the rights and freedoms as well as the legitimate interests of the data subject, which shall include, at least, the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision.
(4) Decisions under paragraph 2 shall not be based on special categories of personal data referred to in Article 9(1), unless Article 9(2)(a) or (g) applies and appropriate measures have been taken to protect the rights and freedoms and legitimate interests of the data subject.
1. Art. 22 GDPR
a) Ostensibly addressing the first question, although not covered by its wording, the Advocate General first takes a differentiated view of whether Art. 22 GDPR lays down a prohibition subject to a reservation of permission or merely establishes a right of objection for the data subject, taking into account the rights to information granted in the case of automated data processing “pursuant to Article 22(1) and (4)” and the existing information obligations:
“Article 22(1) of the GDPR has a special feature compared to the other restrictions on data processing in this Regulation, as it enshrines a ‘right’ of the data subject not to be subject to a decision based solely on automated processing, including profiling. Notwithstanding the terminology used, the application of Article 22(1) GDPR does not require that the data subject actively invokes the right. Indeed, an interpretation in light of the 71st recital of this Regulation and taking into account the scheme of this provision, in particular its paragraph 2, which lists the cases in which automated processing is exceptionally allowed, leads to the conclusion that this provision imposes a general prohibition on decisions of the kind described above.”
However, instead of addressing the logical follow-up question of whether “scoring” also constitutes an “automated decision”, the Advocate General examines the question of whether “the” decision has legal effect vis-à-vis the data subject or similarly significantly affects the data subject and explains the following:
“Although the GDPR does not further define ‘legal effect’ or the phrase ‘similarly significantly affects’, it is clear from the choice of words that this provision only covers serious effects. In this respect, it should first be noted that the 71st recital of the GDPR explicitly mentions the ‘automatic rejection of an online credit application’ as a typical example of a decision that ‘significantly’ affects the data subject.
Next, it must be taken into account, on the one hand, that since the processing of a credit application is a step preceding the conclusion of a loan agreement, the rejection of such an application may have a ‘legal effect’ on the data subject, since he or she can no longer benefit from a contractual relationship with the financial institution at issue. On the other hand, it should be noted that such a refusal may also have an impact on the financial situation of the person concerned. Therefore, it is logical to conclude that, in any case, this person will be ‘similarly’ affected within the meaning of this provision. […]”
Viewed in isolation, the rejection of a credit application may indeed produce legal effects or similarly significantly affect the data subject, and the automatic rejection of an online credit application may fall within the scope of Article 22(1) GDPR. However, neither constellation corresponds to the case at issue here, in which the entity performing the automated, machine-based processing and the entity granting the credit are not the same.
The principle required under Article 22 GDPR – as Schulz correctly points out – that the machine-processed data must lead directly to a computer-aided decision is thus not observed.
Accordingly, it has also been convincingly argued to date that credit reports from a credit agency may themselves be based on an automated processing operation but do not constitute automated decisions within the meaning of Article 22 GDPR. They lack the constitutive element of a decision, namely a choice between alternative courses of action with conclusive effect.
The terms used by the Advocate General in paragraph 37 of the Opinion, such as a “view” or “opinion” “on a particular matter”, therefore already fail to satisfy this definition of a decision.
With regard to the decisive question of where the relevant decision, based exclusively on automated processing, is to be located in the present constellation (namely the granting or rejection of the loan by the credit institution or the scoring by Schufa), the Advocate General refers to an examination of the individual case. For this purpose, the way in which the decision-making process for granting credit is structured is of essential importance. Accordingly, it is fundamentally a question of fact to be assessed by the national courts.
However, “experience from data protection supervision by the authorities” is said to have shown that “score values play a decisive role in the granting of credit and the structuring of its conditions”. Accordingly, the Opinion concludes, in qualified terms, in para. 47:
“The foregoing considerations seem to me to indicate, subject to the assessment of the facts which is for each national court to make in each individual case, that the score value established by a credit reference agency and communicated to a financial institution is generally intended to predetermine the latter’s decision to grant or refuse credit to the person concerned in such a way that it must be assumed that that position is of a purely formal nature in the context of the procedure.(20) […] It follows that the score value itself is to be considered a ‘decision’ within the meaning of Article 22(1) of the GDPR.”
If the decision of the financial institution to grant or refuse the credit is already predetermined by the scoring carried out by the credit agency, it must be assumed that the scoring itself is the “decision” within the meaning of Article 22(1) GDPR.
This conclusion, the Advocate General argues, is only logical, since any other interpretation would call into question the objective pursued by the Union legislature with this provision, namely the protection of the rights of data subjects.
In the view taken here, however, there is no basis for this conclusion, nor for the feared regulatory gap, if Article 22 GDPR is applied consistently and the required separation is maintained between the automated processing and the actual decision – whether that decision is taken with the involvement of a natural person or in an automated manner.
2. Right to information under Art. 15(1)(h) GDPR
Art. 15 GDPR
Right to information of the data subject
(1) The data subject shall have the right to obtain from the controller confirmation as to whether personal data concerning him or her are being processed; if this is the case, he or she shall have a right of access to such personal data and to the following information:
[…]
(h) the existence of automated decision-making, including profiling, in accordance with Article 22(1) and (4) and, at least in those cases, meaningful information about the logic involved and the scope and intended effects of such processing for the data subject.
Likewise not covered by the subject matter of the question referred for a preliminary ruling, the Advocate General subsequently addresses the scope of the right to information under Article 15(1)(h) GDPR in the case of automated data processing, stating:
“Since SCHUFA has refused to provide the applicant with certain information on the calculation method, on the grounds that it constitutes a business secret, it seems appropriate to specify the scope of the right of access provided for in Article 15(1)(h) of the GDPR, in particular as regards the obligation to provide ‘meaningful information on the logic involved’. In my opinion, this provision should be interpreted as also covering, in principle, the calculation method used by a credit agency to determine a score value, provided that there are no conflicting interests worthy of protection. In this respect, reference should be made to the 63rd recital of the GDPR, which states, among other things, that ‘[the right of access] should not prejudice the rights and freedoms of other persons, such as trade secrets or intellectual property rights and in particular copyright in software’.”
As a result of the balancing of the conflicting interests, a minimum of information must be provided on the calculation method so that the essence of the right to protection of personal data is not impaired.
With regard to the information to be provided, the Advocate General attempts to specify
“that it includes sufficiently detailed explanations on the method for calculating the score value and on the reasons that have led to a certain result. In general, the controller should provide the data subject with general information, in particular on factors taken into account in the decision-making process and their weighting at an aggregated level, which is also useful for challenging ‘decisions’ within the meaning of Article 22(1) GDPR on the part of the data subject.”
While “meaningful information about the logic involved” certainly supports the view that the calculation method for the score value is covered by the data subject’s right to information, the above statements leave considerable room for interpretation.
On this basis, it appears to be sufficient for Schufa to disclose the parameters that generally go into the calculation of the score value, together with their general weighting; information specifically relating to the person requesting access is apparently not required.
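To illustrate what a disclosure limited to parameters and their weighting at an aggregated level might look like, here is a minimal sketch; the factors and percentage shares are invented and carry no reference to an individual applicant.

```python
# Hypothetical example of an aggregated disclosure under Art. 15(1)(h) GDPR:
# the factors considered and their approximate weighting, with no reference
# to the individual data subject. All names and shares are invented.
AGGREGATED_WEIGHTS = {
    "payment history": 0.40,
    "existing obligations": 0.25,
    "length of credit history": 0.20,
    "recent credit inquiries": 0.15,
}


def access_response() -> str:
    """Render the factors and their weighting at an aggregated level."""
    lines = ["Factors included in the score calculation (aggregated weighting):"]
    for factor, share in sorted(AGGREGATED_WEIGHTS.items(), key=lambda kv: -kv[1]):
        lines.append(f"- {factor}: approx. {share:.0%}")
    return "\n".join(lines)


print(access_response())
```

Whether such generic information is in fact “useful for challenging ‘decisions’ within the meaning of Article 22(1) GDPR”, as the Advocate General suggests, is precisely what the room for interpretation noted above leaves open.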
IV. The second question referred for a preliminary ruling
Section 31 BDSG
Protection of economic transactions in the case of scoring and credit rating information
(1) The use of a probability value about a certain future behavior of a natural person for the purpose of deciding on the establishment, performance or termination of a contractual relationship with this person (scoring) shall only be permissible if
1. the provisions of data protection law have been complied with,
2. the data used to calculate the probability value are demonstrably relevant for calculating the probability of the specific behavior, based on a scientifically recognized mathematical-statistical procedure,
3. the calculation of the probability value was not based exclusively on address data, and
4. in the case of the use of address data, the data subject has been informed of the intended use of this data prior to the calculation of the probability value; the information shall be documented.
(2) The use of a probability value determined by credit agencies regarding the solvency and willingness to pay of a natural person is, where information on claims is included, permissible only if the conditions set out in paragraph 1 are met and only claims relating to a performance owed which has not been rendered despite being due are taken into account, […].
The permissibility under general data protection law of the processing, including the determination of probability values, of other data relevant to creditworthiness shall remain unaffected.
Art. 6(1) GDPR
(1) Processing is only lawful if at least one of the following conditions is met:
(a) the data subject has given consent to the processing of personal data concerning him or her for one or more specific purposes;
(b) processing is necessary for the performance of a contract to which the data subject is party or for the implementation of pre-contractual measures taken at the data subject’s request;
(c) processing is necessary for compliance with a legal obligation to which the controller is subject;
(d) processing is necessary in order to protect the vital interests of the data subject or another natural person;
(e) processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller;
(f) processing is necessary for the purposes of the legitimate interests of the controller or of a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require the protection of personal data, in particular where the data subject is a child. […]
(1) In the context of the second question referred for a preliminary ruling, the Advocate General first examines whether, pursuant to Article 22(2)(b) GDPR, an exception could apply to the prohibition of automated decision-making he assumes to exist, should the ECJ agree with his conclusion that scoring is covered by Article 22(1) GDPR. According to that provision, the prohibition does not apply “if the decision […] is authorized by Union or Member State law to which the controller is subject”.
The Advocate General concludes that Article 22(2)(b) GDPR cannot serve as a legal basis for Section 31 BDSG, since the national provision also covers, in an undifferentiated manner, data processing activities that fall outside the scope of Article 22 GDPR and merely regulates the use of probability values by economic operators, but not their creation.
Since Section 31 BDSG thus regulates a material scope of application that deviates from Article 22(1) GDPR, Article 22(2)(b) GDPR likewise cannot serve as a legal basis for the adoption of this provision.
(2) This leaves the provisions of the GDPR itself, which give the Member States certain regulatory powers.
According to Art. 6(1) GDPR, the processing of personal data is permissible only on the grounds exhaustively listed therein. Under Art. 6(2) GDPR, Member States are empowered to maintain or introduce more specific provisions to adapt the application of the GDPR’s rules, but only with regard to processing on the grounds mentioned in Art. 6(1)(c) and (e).
While Article 6(1)(b) GDPR legitimizes a credit institution’s request for information from the credit agency for the purpose of a credit assessment, the provision, according to the Advocate General’s accurate findings, says little about the collection and the manner of processing of personal data on the part of Schufa. Nor does the “creation” of a score value take place within the framework of a “legal obligation” imposed on Schufa (cf. Art. 6(1)(c) GDPR). Although Schufa, by providing information on the creditworthiness of certain persons, contributes in the public interest to the protection of consumers and the stability of the financial system, the creation of a score value is not necessary to fulfill such a task.
Accordingly, if there is no opening clause in the present case that would allow the Member States to give concrete form to the GDPR provisions – as Section 31 BDSG purports to do – it must be assumed that Section 31 BDSG is not compatible with the GDPR. The Advocate General’s corresponding conclusion therefore appears justifiable.
V. Conclusion
The Opinion of the Advocate General is not binding on the ECJ. It is the task of the Advocate General to submit to the Court of Justice, in complete independence, a proposal for a decision in the case in question.
Against this background, it would be desirable for the ECJ to reach a different result, in particular with regard to Art. 22 GDPR. However, the past has shown that the ECJ rarely deviates far from the findings of the Advocate General.
A date for the judgment has not yet been announced.
Until the ECJ hands down its decision, therefore, not only Schufa but also Schufa’s contractual partners would be well advised to use the time to critically review their processes for creditworthiness checks once again.
We would be happy to advise you on this.