ECLI:EU:C:2023:220

OPINION OF ADVOCATE GENERAL

PIKAMÄE

delivered on 16 March 2023 (1)

Case C-634/21

OQ

v

Land Hesse,

Joined party:

SCHUFA Holding AG

(Request for a preliminary ruling from the Verwaltungsgericht Wiesbaden (Administrative Court, Wiesbaden, Germany))

(Reference for a preliminary ruling – Protection of natural persons with regard to the processing of personal data – Regulation (EU) 2016/679 – Article 6(1) – Lawfulness of processing – Article 22 – Automated individual decision-making – Profiling – Private credit information agencies – Establishment of a probability value concerning the creditworthiness of a natural person (‘scoring’) – Transmission to third parties deciding to establish, implement or terminate a contractual relationship with that person on the basis of that value)






I.      Introduction

1.        The present request for a preliminary ruling from the Verwaltungsgericht Wiesbaden (Administrative Court, Wiesbaden, Germany) under Article 267 TFEU concerns the interpretation of Article 6(1) and Article 22(1) of Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (‘the GDPR’). (2)

2.        The request has been made in proceedings between the applicant, OQ, a natural person, and Land Hesse, Germany, represented by the Hessischer Beauftragter für Datenschutz und Informationsfreiheit (Data Protection and Freedom of Information Commissioner for Hesse; ‘HBDI’), concerning protection of personal data. SCHUFA Holding AG (‘SCHUFA’), an agency governed by private law, is supporting the HBDI as an intervener. As part of its economic activity, which consists in providing its clients with information on the creditworthiness of third parties, SCHUFA provided a credit institution with a score for the applicant, which served as the basis for the refusal to grant the credit for which the applicant had applied. The applicant subsequently requested SCHUFA to erase the entry concerning her and to grant her access to the corresponding data, but SCHUFA merely informed her of the relevant score and, in broad outline, of the principles underlying the calculation method for the score, without informing her of the specific data included in that calculation or of the relevance accorded to them in that context, asserting that the calculation method is a trade secret.

3.        In so far as the applicant claims that the refusal of her application by SCHUFA is contrary to data protection rules, the Court will be called upon to take a view on the restrictions which the GDPR imposes on the economic activity of reporting agencies in the financial sector, in particular in data management, and on the effect to be accorded to trade secrets. Likewise, the Court will have to clarify the scope of the regulatory powers which certain provisions of the GDPR confer on the national legislature by way of derogation from the general objective of harmonisation pursued by that legal act.

II.    Legal framework

A.      European Union law

4.        Article 4(4) of the GDPR provides:

‘For the purposes of this Regulation:

(4)      “profiling” means any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements’.

5.        Article 6 of the GDPR, entitled ‘Lawfulness of processing’, states:

‘1.      Processing shall be lawful only if and to the extent that at least one of the following applies:

(a)      the data subject has given consent to the processing of his or her personal data for one or more specific purposes;

(b)      processing is necessary for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract;

(c)      processing is necessary for compliance with a legal obligation to which the controller is subject;

(d)      processing is necessary in order to protect the vital interests of the data subject or of another natural person;

(e)      processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller;

(f)      processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child.

Point (f) of the first subparagraph shall not apply to processing carried out by public authorities in the performance of their tasks.

2.      Member States may maintain or introduce more specific provisions to adapt the application of the rules of this Regulation with regard to processing for compliance with points (c) and (e) of paragraph 1 by determining more precisely specific requirements for the processing and other measures to ensure lawful and fair processing including for other specific processing situations as provided for in Chapter IX.

3.      The basis for the processing referred to in point (c) and (e) of paragraph 1 shall be laid down by:

(a)      Union law; or

(b)      Member State law to which the controller is subject.

The purpose of the processing shall be determined in that legal basis or, as regards the processing referred to in point (e) of paragraph 1, shall be necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller. That legal basis may contain specific provisions to adapt the application of rules of this Regulation, inter alia: the general conditions governing the lawfulness of processing by the controller; the types of data which are subject to the processing; the data subjects concerned; the entities to, and the purposes for which, the personal data may be disclosed; the purpose limitation; storage periods; and processing operations and processing procedures, including measures to ensure lawful and fair processing such as those for other specific processing situations as provided for in Chapter IX. The Union or the Member State law shall meet an objective of public interest and be proportionate to the legitimate aim pursued.

4.      Where the processing for a purpose other than that for which the personal data have been collected is not based on the data subject’s consent or on a Union or Member State law which constitutes a necessary and proportionate measure in a democratic society to safeguard the objectives referred to in Article 23(1), the controller shall, in order to ascertain whether processing for another purpose is compatible with the purpose for which the personal data are initially collected, take into account, inter alia:

(a)      any link between the purposes for which the personal data have been collected and the purposes of the intended further processing;

(b)      the context in which the personal data have been collected, in particular regarding the relationship between data subjects and the controller;

(c)      the nature of the personal data, in particular whether special categories of personal data are processed, pursuant to Article 9, or whether personal data related to criminal convictions and offences are processed, pursuant to Article 10;

(d)      the possible consequences of the intended further processing for data subjects;

(e)      the existence of appropriate safeguards, which may include encryption or pseudonymisation.’

6.        Article 15 of the GDPR, entitled ‘Right of access by the data subject’, provides:

‘1.      The data subject shall have the right to obtain from the controller confirmation as to whether or not personal data concerning him or her are being processed, and, where that is the case, access to the personal data and the following information:

…

(h)      the existence of automated decision-making, including profiling, referred to in Article 22(1) and (4) and, at least in those cases, meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject.

…’

7.        Article 21 of the GDPR, entitled ‘Right to object’, states:

‘1.      The data subject shall have the right to object, on grounds relating to his or her particular situation, at any time to processing of personal data concerning him or her which is based on point (e) or (f) of Article 6(1), including profiling based on those provisions. The controller shall no longer process the personal data unless the controller demonstrates compelling legitimate grounds for the processing which override the interests, rights and freedoms of the data subject or for the establishment, exercise or defence of legal claims.

…’

8.        Under Article 22 of the GDPR, entitled ‘Automated individual decision-making, including profiling’:

‘1.      The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.

2.      Paragraph 1 shall not apply if the decision:

(a)      is necessary for entering into, or performance of, a contract between the data subject and a data controller;

(b)      is authorised by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests; or

(c)      is based on the data subject’s explicit consent.

3.      In the cases referred to in points (a) and (c) of paragraph 2, the data controller shall implement suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests, at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision.

4.      Decisions referred to in paragraph 2 shall not be based on special categories of personal data referred to in Article 9(1), unless point (a) or (g) of Article 9(2) applies and suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests are in place.’

B.      German law

9.        Paragraph 31 of the Bundesdatenschutzgesetz (Federal Law on data protection) of 30 June 2017, (3) as amended by the Law of 20 November 2019 (4) (BDSG), entitled ‘Protection of trade and commerce in the context of scoring and credit reports’, provides:

‘(1)      The use of a probability value regarding specific future behaviour of a natural person for the purpose of deciding on the establishment, implementation or termination of a contractual relationship with that person (‘scoring’) shall be permissible only if

1.      the provisions of data protection law have been complied with,

2.      the data used to calculate the probability value are demonstrably relevant to the calculation of the probability of the specific behaviour, on the basis of a scientifically recognised mathematical statistical method,

3.      the data used for the calculation of the probability value were not exclusively address data, and

4.      where address data are used, the data subject has been notified of the intended use of such data before the calculation of the probability value; the notification must be documented.

(2)      The use of a probability value determined by credit information agencies in relation to a natural person’s ability and willingness to pay shall, in the case where information about claims against that person is taken into account, be permissible only if the conditions under subparagraph 1 are met and claims relating to a performance owed but not rendered despite falling due are taken into account only if they are claims

1.      which have been established by a judgment which has become final or has been declared provisionally enforceable or for which there is a debt instrument pursuant to Paragraph 794 of the Zivilprozessordnung (Code of Civil Procedure),

2.      which have been established in accordance with Paragraph 178 of the Insolvenzordnung (Insolvency Code) and not contested by the debtor at the meeting for verification of claims,

3.      which the debtor has expressly acknowledged,

4.      in respect of which

(a)      the debtor has been given formal notice in writing at least twice after the claim fell due,

(b)      the first formal notice was given at least four weeks previously,

(c)      the debtor has been informed in advance, but at the earliest at the time of the first formal notice, of the possibility that the claim might be taken into account by a credit information agency and

(d)      the debtor has not contested the claim, or

5.      whose underlying contractual relationship may be terminated without notice on the ground of arrears in payment and in respect of which the debtor has been informed in advance of the possibility that account might be taken of them by a credit information agency.

The permissibility of the processing, including the determination of probability values and of other data relevant to creditworthiness, under general data protection law remains unaffected’.

III. The facts giving rise to the dispute, the main proceedings and the questions referred for a preliminary ruling

10.      It is apparent from the order for reference that the applicant in the main proceedings was not granted credit as a result of a credit assessment carried out by SCHUFA. The latter is a company governed by private law, which operates a credit information service, providing its clients with information on the creditworthiness of consumers. To that end, SCHUFA carries out credit assessments for which it produces, from certain characteristics of a person, a prediction on the basis of a mathematical statistical method of the probability of future behaviour, such as the repayment of credit.
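
By way of illustration only, and without any claim to reflect SCHUFA’s actual, undisclosed calculation method, a probability value of the kind described in the preceding point could hypothetically be derived from a handful of personal characteristics by a simple logistic (mathematical statistical) model, as in the following sketch; the characteristic names, weights and intercept are assumptions made purely for the purposes of illustration.

```python
import math

# Hypothetical illustration of a 'probability value' (score) produced by a
# mathematical statistical method, as described in point 10 above. The
# characteristics, weights and intercept are invented for illustration and
# do not reflect any real credit information agency's model.

FEATURE_WEIGHTS = {
    "years_at_current_address": 0.08,
    "number_of_open_credit_lines": -0.15,
    "past_payment_defaults": -1.20,
    "income_to_debt_ratio": 0.90,
}
INTERCEPT = 0.5


def probability_of_repayment(features):
    """Map a data subject's characteristics to a value between 0 and 1."""
    # Linear combination of the characteristics ...
    z = INTERCEPT + sum(FEATURE_WEIGHTS[name] * value for name, value in features.items())
    # ... transformed into a probability by the logistic function.
    return 1.0 / (1.0 + math.exp(-z))


if __name__ == "__main__":
    applicant = {
        "years_at_current_address": 3,
        "number_of_open_credit_lines": 2,
        "past_payment_defaults": 1,
        "income_to_debt_ratio": 0.4,
    }
    print(f"Probability value: {probability_of_repayment(applicant):.2f}")
```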

11.      The applicant requested SCHUFA to erase incorrect data concerning her and to grant her access to the entries concerning her. SCHUFA notified the applicant inter alia of the score calculated for her and the basic functioning of its calculation, but it did not inform her of the weighting of the various data in the calculation. SCHUFA considers that it is not obliged to disclose its calculation methods as they are covered by professional and commercial secrecy. In addition, it asserts that it provides information only to its clients, which make the actual decisions on the credit agreements.

12.      The applicant lodged a complaint against SCHUFA’s report with the defendant, the data protection supervisory authority, requesting the defendant to order the agency to provide and erase information in accordance with her request. In the decision on the complaint, the HBDI held that there was no need to take further action against the agency, on the ground that the agency met the requirements laid down by the German Federal Law on data protection.

13.      The referring court is hearing an action brought by the applicant against the decision of the defendant. It considers that, in order to give a ruling in the case in the main proceedings, it must be determined whether the activity of providers of credit information services by which they establish scores in relation to persons and transmit them to third parties, without making any further recommendation or comment, falls within the scope of Article 22(1) of the GDPR.

14.      The Verwaltungsgericht Wiesbaden (Administrative Court, Wiesbaden) decided to stay the proceedings and to refer the following questions to the Court of Justice for a preliminary ruling:

‘(1)      Is Article 22(1) of the [GDPR] to be interpreted as meaning that the automated establishment of a probability value concerning the ability of a data subject to service a loan in the future already constitutes a decision based solely on automated processing, including profiling, which produces legal effects concerning the data subject or similarly significantly affects him or her, where that value, determined by means of personal data of the data subject, is transmitted by the controller to a third-party controller and the latter draws strongly on that value for its decision on the establishment, implementation or termination of a contractual relationship with the data subject?

(2)      If Question 1 is answered in the negative, are Articles 6(1) and 22 of the [GDPR] to be interpreted as precluding national legislation under which the use of a probability value – in the present case, in relation to a natural person’s ability and willingness to pay, in the case where information about claims against that person is taken into account – regarding specific future behaviour of a natural person for the purpose of deciding on the establishment, implementation or termination of a contractual relationship with that person (scoring) is permissible only if certain further conditions, which are set out in more detail in the grounds of the request for a preliminary ruling, are met?’

IV.    Procedure before the Court

15.      The order for reference dated 1 October 2021 was received at the Court Registry on 15 October 2021.

16.      The parties to the main proceedings, SCHUFA, the Danish, Portuguese and Finnish Governments and the European Commission submitted written observations within the period prescribed by Article 23 of the Statute of the Court of Justice of the European Union.

17.      At the hearing on 26 January 2023, oral argument was presented by the legal representatives of the parties to the main proceedings and of SCHUFA and the agents of the German and Finnish Governments and of the Commission.

V.      Legal analysis

A.      Preliminary remarks

18.      Since mutual confidence forms the basis for any contractual commitment in a market economy, it is understandable, in principle, from a business point of view, that providers of goods and services wish to know about their clients and the risks inherent in such contractual commitment. Credit information agencies can help to establish this mutual confidence using statistical methods that allow undertakings to determine whether certain relevant criteria, including the creditworthiness of their clients, are met in a specific case. By doing so, they help undertakings to comply with various provisions of EU law which impose precisely that obligation on them for certain categories of contracts, in particular credit agreements. (5) Some of the methods used may be based on clients’ personal data which are collected and processed by automated means using information technology. This interest contrasts with the data subjects’ interest in knowing the manner in which those data are managed and recorded and the methods employed by undertakings to take decisions regarding their clients.

19.      The GDPR, which has applied since 25 May 2018, established a legal framework which seeks to take account of the abovementioned interests throughout the European Union, in particular by imposing certain restrictions on the processing of personal data. Thus, specific restrictions apply to automated processing which is likely to produce legal effects concerning a natural person or significantly affects him or her. Those restrictions are justified in the context of profiling, that is, an evaluation of personal aspects with a view to analysing or predicting a natural person’s economic situation, reliability or behaviour. Note should be taken in this connection of the restrictions laid down in Article 22 of the GDPR – which are relevant in this case and seek to protect human dignity – preventing the data subject being subject to a decision based solely on automated processing, without any human intervention capable of verifying, if necessary, that the decision was correct, fair and not discriminatory. (6) The human intervention to be envisaged in this type of automated data processing ensures that the data subject has the opportunity to express his or her point of view, to obtain an explanation of the decision reached after such assessment and to challenge it in the event of disagreement with the decision. It is precisely the extent of the restrictions under Article 22 of the GDPR that is the subject of the first question referred for a preliminary ruling.

20.      Although the GDPR created a comprehensive regulatory framework for the protection of personal data which is, in principle, full, certain provisions make it possible for Member States to lay down additional, stricter or derogating national rules, which leave them a margin of discretion as to the manner in which those provisions may be implemented (‘opening clauses’), provided those rules do not undermine the content and objectives of the GDPR. (7) The scope of that regulatory power remaining with the Member States lies at the heart of the second question to be examined in this Opinion.

B.      Admissibility

21.      The HBDI and SCHUFA dispute the admissibility of the request for a preliminary ruling. SCHUFA asserts in this regard that the reference is neither necessary for the outcome of the dispute nor accompanied by a sufficient statement of reasons; moreover, it would open up a second remedy and would be inconsistent with other requests for a preliminary ruling made and then withdrawn by the same referring court.

1.      The decisive character of the questions referred for the outcome of the dispute

22.      First of all, it must be borne in mind that, according to the Court’s settled case-law, in the context of the cooperation between the Court and the national courts provided for in Article 267 TFEU, it is solely for the national court before which a dispute has been brought, and which must assume responsibility for the subsequent judicial decision, to determine in the light of the particular circumstances of the case both the need for a preliminary ruling in order to enable it to deliver judgment and the relevance of the questions which it submits to the Court. Consequently, where the questions submitted concern the interpretation or the validity of a rule of EU law, the Court is in principle bound to give a ruling. It follows that questions relating to EU law enjoy a presumption of relevance. The Court may refuse to give a ruling on a question referred by a national court only where it is quite obvious that the interpretation, or the determination of validity, of a rule of EU law that is sought bears no relation to the actual facts of the main action or its purpose, where the problem is hypothetical, or where the Court does not have before it the factual or legal material necessary to give a useful answer to the questions submitted to it. (8)

23.      Those conditions are not satisfied in the present case as it is clear from paragraph 40 of the order for reference that the outcome of the proceedings depends on the first question referred for a preliminary ruling. The national court explains that if Article 22(1) of the GDPR were to be interpreted as meaning that scoring by a credit information agency is an independent decision within the meaning of Article 22(1) of that regulation, that agency – or more precisely its main activity – would be subject to the prohibition of automated individual decisions. Consequently, it would require a legal basis under Member State law within the meaning of Article 22(2)(b) of the GDPR, in respect of which only Paragraph 31 of the BDSG enters into consideration. However, the referring court expresses serious concerns as to the compatibility of that national provision with Article 22(1) of the GDPR. According to the referring court, SCHUFA would not only act without a legal basis, but also violate the prohibition laid down in the latter provision. As a result, the applicant would have a claim against the HBDI by which she could seek its (further) involvement in her case in a supervisory capacity. It is therefore clear that an answer to the questions asked by the referring court is decisive for the outcome of the dispute.

24.      SCHUFA further asserts that the request for a preliminary ruling is inadmissible on the ground that the applicant has already been informed of the logic involved in the score. However, it is clear from the order for reference that the applicant wished to be informed as fully as possible about all the data collected and the method used to determine the score. In so far as SCHUFA informed the applicant, in broad outline, of the basic functioning of its score calculation process, but not of the individual pieces of information included in the calculation or of their weighting, it is evident that SCHUFA has not complied with that request for information. Consequently, the applicant has a legitimate interest in having the rights of a data subject vis-à-vis a credit information agency such as SCHUFA clarified by means of a preliminary ruling.

2.      The existence of two parallel remedies

25.      As regards the alleged risk of opening up a second remedy for the data subject, it should be noted that, contrary to the assertion made by SCHUFA in its observations, the fact that the applicant first brought the matter before the ordinary courts before taking the case to the referring administrative court does not preclude the admissibility of the request for a preliminary ruling. By its two actions, the applicant availed herself of the remedies provided for in Articles 78 and 79 of the GDPR, which guarantee a right to an effective judicial remedy against the supervisory authority and the data controller respectively. In so far as these remedies coexist independently, neither remedy being subsidiary to the other, they may be exercised in parallel. (9) It cannot therefore be alleged that the applicant committed any irregularity in defending her rights protected by the GDPR.

26.      Furthermore, according to the Court’s settled case-law, in the absence of EU rules in the field, it is for the national legal system of each Member State to designate the courts and tribunals having jurisdiction and to lay down the detailed procedural rules governing actions for safeguarding rights which individuals derive from EU law. (10) Consequently, there is no objective reason to call into question a Member State’s system of remedies, except where the effectiveness of EU law would be jeopardised, for example if national courts and tribunals were deprived of the right or exempted from the obligation under Article 267 TFEU to refer questions concerning the interpretation or the validity of EU law to the Court.

27.      In the present case, there is nothing to suggest that the existence of two remedies which permit an effective judicial remedy against the supervisory authority and the data controller respectively jeopardises the effectiveness of EU law or deprives national courts and tribunals of the right to have recourse to the procedure laid down in Article 267 TFEU. On the contrary, as I have already explained, having regard to Articles 78 and 79 of the GDPR, it is clear that the GDPR does not preclude such a system of remedies, but accords the Member State responsibility for designating the courts and tribunals having jurisdiction and for laying down the detailed procedural rules governing actions in full compliance with the principle of procedural autonomy. This has been recognised by the Court in its recent case-law. (11)

28.      Lastly, SCHUFA merely criticises the remedies available under German law without explaining precisely how a judgment delivered by the Court in these preliminary ruling proceedings would be irrelevant to the outcome of the dispute. As the referring court states, an interpretation of Article 22(1) of the GDPR by the Court would enable the defendant to exercise its supervisory powers over SCHUFA in conformity with EU law. Consequently, it would seem that SCHUFA’s argument challenging the admissibility of the request for a preliminary ruling is unfounded.

3.      The alleged failure to state reasons for the request for a preliminary ruling

29.      These considerations are sufficient, in principle, to reject SCHUFA’s argument. In the interest of gaining a better understanding of the present case, I nevertheless think that it is necessary to address the argument concerning an alleged failure to state reasons for the request for a preliminary ruling. Contrary to the assertion made by SCHUFA, the order for reference gives a sufficiently full statement of the issues in the present case to satisfy the requirements of Article 94(c) of the Rules of Procedure. More specifically, the referring court explains that the applicant intends to assert her rights vis-à-vis SCHUFA. According to the referring court, unless it is given a restrictive interpretation, Article 22(1) of the GDPR is in principle such as to confer protection on her in respect of automated processing of her personal data.

30.      The referring court considers that, given the importance which some undertakings attach to the score established by credit information agencies for the prognostic evaluation of a natural person’s financial capacity, the score can be considered an independent ‘decision’ within the meaning of the abovementioned provision. An interpretation along these lines would be necessary in order to fill the legal lacuna stemming from the fact that the data subject would otherwise be unable to obtain the necessary information under Article 15(1)(h) of the GDPR. In the light of this detailed statement of reasons, I consider that SCHUFA’s argument should be rejected and that the request for a preliminary ruling should be found to be admissible.

C.      Substance

1.      The first question referred for a preliminary ruling

(a)    The general prohibition under Article 22(1) of the GDPR

31.      Article 22(1) of the GDPR has a distinct feature compared with the other restrictions on the processing of data contained in that regulation in that it establishes a ‘right’ for the data subject not to be subject to a decision based solely on automated processing, including profiling. Despite the terminology used, the application of Article 22(1) of the GDPR does not require the data subject to actively invoke the right. An interpretation in the light of recital 71 of that regulation, taking account of the scheme of that provision, particularly its paragraph 2, which specifies the cases in which such automated processing is exceptionally permitted, suggests instead that that provision establishes a general prohibition on decisions of the kind described above. This being said, that prohibition applies only in very specific circumstances, namely to a decision ‘which produces legal effects concerning him or her’ or ‘similarly significantly affects him or her’.

32.      By its first question, the referring court asks whether the calculation of a score by a credit information agency falls within the scope of Article 22(1) of the GDPR where the score obtained is transmitted to an undertaking which draws strongly on it in adopting a decision on the establishment, implementation or termination of a contractual relationship with the data subject. In other words, it is asked whether that provision applies to credit information agencies which make scores available to financial undertakings. In examining this question, it must be ascertained whether the conditions laid down in Article 22(1) of the GDPR are satisfied in the present case.

(b)    The applicability of Article 22(1) of the GDPR

(1)    The existence of ‘profiling’ within the meaning of Article 4(4) of the GDPR

33.      Article 22(1) requires, first of all, that there is automated processing of personal data, ‘profiling’ being regarded as a sub-category, judging by the wording of the provision. (12) It should be noted in this regard that the scoring carried out by SCHUFA is covered by the legal definition contained in Article 4(4) of the GDPR, since that procedure uses personal data to evaluate certain aspects relating to natural persons to analyse or predict aspects concerning their economic situation, reliability and probable behaviour. It is clear from the case file that the method used by SCHUFA provides a score based on certain criteria, that is to say, a result from which conclusions can be drawn regarding the creditworthiness of the data subject. Lastly, I would stress that none of the interested parties contests the classification of the procedure at issue as ‘profiling’, and this condition can therefore be considered to be satisfied in the present case.

(2)    The decision must produce ‘legal effects’ concerning the data subject or ‘similarly significantly affect him or her’

34.      The application of Article 22(1) of the GDPR requires that the decision at issue produces ‘legal effects’ concerning the data subject or ‘similarly significantly affects him or her’. The GDPR thus recognises that automated decision-making, including profiling, can have serious consequences for data subjects. Although the GDPR does not define the term ‘legal effects’ or the expression ‘similarly significantly’, the wording used nevertheless makes clear that only effects having a serious impact will be covered by that provision. In this regard, it should be noted at the outset that recital 71 of the GDPR explicitly mentions ‘automatic refusal of an online credit application’ as a typical example of a decision which ‘significantly’ affects the data subject.

35.      Account should then be taken of the fact that, first, in so far as the processing of a credit application constitutes a step to be taken prior to the conclusion of a loan agreement, the refusal of such an application may have ‘legal effects’ on the data subject, who can no longer benefit from a contractual relationship with the financial institution in question. Second, it should be noted that such a refusal is also likely to have an impact on the financial situation of the data subject. It is therefore logical to conclude that the data subject will in any case be ‘similarly’ affected within the meaning of that provision. The EU legislature appears to have been aware of this in drafting recital 71 of the GDPR, in the light of which Article 22(1) of that regulation must be interpreted. I therefore take the view that, having regard to the situation of the applicant, the conditions of that provision are satisfied in the present case, whether emphasis is placed on the legal or the economic consequences of the refusal to grant credit.

(3)    The ‘decision’ must be ‘based solely on automated processing’

36.      Two additional conditions must be satisfied. First, it is necessary that an act having the nature of a ‘decision’ is taken in respect of the data subject. Second, the decision at issue must be ‘based solely on automated processing’. As far as the latter condition is concerned, there is nothing in the statement of facts in the order for reference to suggest that, alongside the mathematical and statistical procedure applied by SCHUFA, an individual evaluation and assessment by a human being is drawn on strongly in establishing the score. Consequently, scoring as an act carried out by SCHUFA must be considered to be ‘based solely on automated processing’. Having said that, it should be borne in mind that the financial institution to which SCHUFA transmits the score is required to adopt an act that is deemed independent in respect of the data subject, namely to grant or refuse to grant credit. The question thus arises which of these two acts can be regarded as a ‘decision’ within the meaning of Article 22(1) of the GDPR, and this question will be examined below.

37.      As regards the former condition, it must be established, first, what the legal nature of a ‘decision’ is and what form a decision must take. From an etymological point of view, the concept implies an ‘opinion’ or ‘position’ on a certain situation. It must also have ‘binding character’ in order to distinguish it from mere ‘recommendations’, which in principle have no legal or factual consequences. (13) Having said this, contrary to the assertion made by the HBDI, an analogy with the fourth paragraph of Article 288 TFEU does not seem to be relevant in the present context, especially since it does not have any basis in the provisions of the GDPR.

38.      The absence of a legal definition indicates that the EU legislature opted for a broad concept which can include a number of acts capable of affecting the data subject in many ways. As I have already stated, a ‘decision’ within the meaning of Article 22(1) of the GDPR can either have ‘legal effects’ or ‘similarly’ affect the data subject, which means that the ‘decision’ in question may have an impact that is not necessarily legal but rather economic and social. Since Article 22(1) of the GDPR seeks to protect natural persons against the potentially discriminatory and unfair effects of automated processing of data, it seems that particular vigilance is required and must also be reflected in the interpretation of that provision.

39.      Lastly, in so far as acts attributable to a private financial institution can also have serious consequences for the independence and the freedom of action of the data subject in a market economy, particularly in relation to certifying the creditworthiness of a credit applicant, (14) I cannot see any objective reason to confine the concept of ‘decision’ to the strictly public sphere, that is to say, the relationship between the State and the citizen, as is implicitly suggested by the analogy proposed by the HBDI. Classification of a position adopted in respect of the data subject as a ‘decision’ would seem to require an examination of the individual case, taking into account the specific circumstances and the seriousness of the effects on the legal, economic and social status of the data subject. (15)

40.      On the basis of the criteria set out in the preceding points, it must be determined, second, what the relevant ‘decision’ is in the case at issue. As was mentioned above, there is, on the one hand, the act by which a bank agrees or refuses to grant credit to the applicant and, on the other, the score derived from a profiling procedure conducted by SCHUFA. In my view, it is impossible to give a categorical answer to this question because the characterisation depends on the circumstances of each particular case. More specifically, the way in which the decision-making process is structured is of crucial importance. That process typically includes several phases, such as profiling, the establishment of the score and the actual decision on the grant of credit.

41.      It seems clear that, although a financial institution can carry out this process itself, there is nothing to prevent it from assigning certain tasks, such as profiling and scoring, by contract to a credit information agency. Article 22(1) of the GDPR lays down no requirement as to whether those tasks are performed by a single body or by several bodies. That being said, the possibility of assigning certain powers to an external service provider would not seem to play a crucial role in the analysis, as such assignment is generally prompted by economic and organisational considerations, which are likely to vary from one case to the next.

42.      By contrast, the aspect which seems to play a crucial role is whether the decision-making process is conceived in such a way that the scoring by the credit information agency predetermines the decision by the financial institution to grant or refuse to grant credit. If the scoring were carried out without any human intervention that could, where necessary, verify its result and the fairness of the decision to be taken in respect of the credit applicant, it would seem logical for the scoring itself to be considered to constitute the ‘decision’ under Article 22(1) of the GDPR.

43.      To suppose otherwise because the decision to grant or refuse credit formally rests with the financial institution not only would represent excessive formalism, but would hardly do justice to the specific circumstances of such a scenario, particularly since Article 22(1) of the GDPR does not lay down any requirement for the ‘decision’ to take a particular form. The crucial factor is the effect that the ‘decision’ has on the data subject. Because a negative score can, in itself, produce detrimental effects for data subjects, namely by limiting significantly the exercise of their freedoms or even by stigmatising them in society, it seems justified to classify it as a ‘decision’ for the purposes of the abovementioned provision where a financial institution accords it paramount importance in the decision-making process. (16) In such circumstances, credit applicants are affected from the stage of the evaluation of their creditworthiness by the credit information agency and not only at the final stage of the refusal to grant credit, where the financial institution is merely applying the result of that evaluation to the specific case. (17)

44.      Given that, first, Article 22(1) of the GDPR requires the decision in question to be based ‘solely’ on automated processing (18) and, second, the wording of a provision generally marks the limit of any interpretation, it would appear necessary that the automated processing remain the only element on which the financial institution’s approach vis-à-vis the credit applicant is based. That would still be the case if there were human participation in the process which was not capable of influencing the causal link between the automated processing and the final decision. The credit information agency would, in effect, have to take the final decision de facto on behalf of the financial institution. Whether that is so depends on the internal rules and practices of the financial institution in question, which, as a rule, must leave it no latitude in applying the score to a credit application.
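
The following sketch is purely hypothetical and is intended only to illustrate the scenario just described, in which the institution’s internal rules leave it no latitude: the credit decision reduces to the mechanical application of the transmitted score against a fixed threshold, so that the score de facto determines the outcome. The threshold value and the function name are assumptions.

```python
# Hypothetical illustration of the scenario discussed in point 44: the financial
# institution's internal rule leaves it no latitude, so the credit decision is
# mechanically determined by the score transmitted by the credit information
# agency. The threshold is an assumption made purely for illustration.

MINIMUM_ACCEPTABLE_SCORE = 0.65


def decide_credit_application(transmitted_score):
    """Apply the institution's (assumed) fixed internal rule to the agency's score."""
    if transmitted_score < MINIMUM_ACCEPTABLE_SCORE:
        # An insufficient score leads to refusal in every case; under the assumed
        # rule, no human assessment can alter this outcome.
        return "refused"
    # A sufficient score merely opens the way to further checks (e.g. collateral).
    return "eligible for further review"


if __name__ == "__main__":
    print(decide_credit_application(0.58))  # refused
    print(decide_credit_application(0.81))  # eligible for further review
```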

45.      This is in essence a question of fact which, in my view, can best be assessed by the national courts and tribunals. I therefore propose that it be left to the referring court to establish for itself, in the light of the abovementioned criteria, the extent to which the financial institution is generally bound by the scoring carried out by a credit information agency such as SCHUFA. (19) In order to give the national court a useful answer in the present case, I will rely on the information contained in the order for reference.

46.      In this regard, I should first point out that, according to the referring court, even though human intervention is in principle still possible at that stage of the decision-making process, the decision to enter into a contractual relationship with the data subject ‘is in practice determined by the score transmitted by credit agencies to such a considerable extent that the score penetrates through the decision of the third-party controller’. According to the referring court, ‘it is ultimately the score established by the credit information agency on the basis of automated processing that actually decides whether and how the third-party controller enters into a contract with the data subject’. The referring court also explains that although the third-party controller does not have to make his or her decision dependent solely on the score, the fact remains that ‘he or she usually does so to a significant extent’. It adds that ‘while it is true that, even where a score is in principle sufficient, a loan may be refused (for other reasons, such as a lack of collateral or doubts as to the success of an investment to be financed), an insufficient score will lead to the refusal of a loan, at least in the context of consumer loans, in almost every case and even if, for instance, an investment otherwise appears to be worthwhile’. Lastly, the referring court states that ‘experience from the data protection supervision carried out by the authorities shows that the score plays the decisive role in the granting of loans and in determining the conditions under which they are granted’.

47.      The above considerations would seem to suggest, subject to the assessment of the facts which any national court or tribunal is required to make in each individual case, that the score established by a credit information agency and transmitted to a financial institution generally tends to predetermine the financial institution’s decision to grant or refuse to grant credit to the data subject, with the result that the financial institution’s own position in the decision-making process must be regarded as purely formal. (20) It follows that the score itself must be regarded as a ‘decision’ within the meaning of Article 22(1) of the GDPR.

48.      It would seem reasonable to draw this conclusion since any other interpretation would undermine the objective pursued by the EU legislature through that provision, which is to protect the rights of data subjects. As the referring court correctly explained, a strict reading of Article 22(1) of the GDPR would give rise to a lacuna in legal protection: the credit information agency from which the data subject should obtain the required information is not obliged to provide it under Article 15(1)(h) of the GDPR since, on that reading, it does not carry out its own ‘automated decision-making’ within the meaning of that provision, while the financial institution, which takes its decision on the basis of the score established by automated means and which is therefore obliged to provide the information required under Article 15(1)(h) of the GDPR, cannot provide it because it does not have that information.

49.      Consequently, the financial institution would be unable to review the evaluation of the creditworthiness of the credit applicant if the decision is contested, as is required by Article 22(3) of the GDPR, or to ensure fair, transparent and non-discriminatory processing by appropriate mathematical or statistical procedures and appropriate technical and organisational measures, as is required by the sixth sentence of recital 71 of the GDPR. (21) In order to avoid such a situation, which would clearly run counter to the legislative objective mentioned in the preceding point, I propose an interpretation of Article 22(1) of the GDPR which takes account of the real impact of scoring on the situation of the data subject.

50.      This approach seems logical as the credit information agency should, in general, be the only entity capable of responding to requests from the data subject based on rights also guaranteed by the GDPR, namely the right to rectification under Article 16 of the GDPR, where personal data used to carry out scoring prove to be inaccurate, and the right to erasure under Article 17 of the GDPR, where those data have been unlawfully processed. In so far as the financial institution is not generally involved either in the collection of those data or in the profiling where those tasks are assigned to third parties, it cannot reasonably be expected to be in a position to ensure effective respect for those rights. The data subject should not have to suffer the detrimental consequences of such an assignment of tasks.

51.      Holding the credit information agency responsible based on the establishment of the score – and not by virtue of its further use – would seem to be the most effective way to ensure the protection of the fundamental rights of the data subject, namely the right to protection of personal data under Article 8 of the Charter of Fundamental Rights of the European Union (‘the Charter’), but also the right to respect for private life under Article 7 of the Charter, as that activity ultimately represents the ‘source’ of any potential damage. Given the likelihood that the scoring by the credit information agency is used by multiple financial institutions, it appears reasonable to allow data subjects to assert their rights directly vis-à-vis that agency.

52.      For the reasons explained above, I consider that the conditions laid down in Article 22(1) of the GDPR are satisfied, with the result that that provision is applicable in circumstances such as those in the main proceedings.

(c)    The scope of the right to information under Article 15(1)(h) of the GDPR

53.      Against this background, the importance of the controller complying fully with his or her information obligations in respect of the data subject cannot be emphasised strongly enough. Under Article 15(1)(h) of the GDPR, the data subject has the right to obtain from the controller not only confirmation as to whether or not personal data concerning him or her are being processed, but also other information, such as on the existence of automated decision-making, including profiling, referred to in Article 22 of the GDPR, meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject.

54.      Since SCHUFA refused to disclose to the applicant certain information concerning the calculation method on the ground that it was a trade secret, it seems relevant to clarify the scope of the right to information under Article 15(1)(h) of the GDPR, in particular as regards the obligation to provide ‘meaningful information about the logic involved’. In my view, that provision must be interpreted as also covering, in principle, the calculation method used by a credit information agency to establish a score, provided there are no conflicting interests that are worthy of protection. Mention should be made in this regard of recital 63 of the GDPR, which states inter alia that ‘the right of access to personal data … should not adversely affect the rights or freedoms of others, including trade secrets or intellectual property and in particular the copyright protecting the software’ (my emphasis).

55.      A number of conclusions can be drawn from an interpretation of Article 15(1)(h) of the GDPR read in the light of recitals 58 and 63 of that regulation. First, it is apparent that the EU legislature was fully aware of the conflicts that could arise between the right to protection of personal data guaranteed by Article 8 of the Charter and the right to protection of intellectual property under Article 17(2) of the Charter. Second, it is clear that its intention was not to sacrifice one fundamental right for another. On the contrary, further analysis of the provisions of the GDPR indicates that it wished to ensure a fair balance between rights and responsibilities.

56.      In my view, the EU legislature’s insistence in recital 63 of the GDPR that ‘the result of those considerations should not be a refusal to provide all information to the data subject’ (22) means that a minimum amount of information must be provided in any event so as not to compromise the essence of the right to protection of personal data. It follows that while protection of trade secrets or intellectual property in principle constitutes a legitimate reason for a credit information agency to refuse to disclose the algorithm used to calculate the score for the data subject, it cannot under any circumstances justify an absolute refusal to provide information, a fortiori where there are appropriate means of communication which aid understanding while guaranteeing a degree of confidentiality.

57.      Article 12(1) of the GDPR, under which ‘the controller shall take appropriate measures to provide … any communication under [Article 15] relating to processing to the data subject in a concise, transparent, intelligible and easily accessible form, using clear and plain language’, (23) seems particularly relevant in this context. That provision supports the reasoning set out above in so far as it follows from it that the real objective of Article 15(1)(h) of the GDPR is to ensure that data subjects obtain information in an intelligible and accessible form in accordance with their needs. In my view, those requirements exclude any obligation to disclose the algorithm, given its complexity. The benefit of communicating a particularly complex formula without providing the necessary explanations for it would be questionable. Regard should be had in this connection to recital 58 of the GDPR, according to which compliance with the abovementioned requirements is of particular relevance ‘in situations where … the technological complexity of practice [makes] it difficult for the data subject to know and understand whether, by whom and for what purpose personal data relating to him or her are being collected’. (24)

58.      For the reasons explained, I consider that the obligation to provide ‘meaningful information about the logic involved’ must be understood to include sufficiently detailed explanations of the method used to calculate the score and the reasons for a certain result. In general, the controller should provide the data subject with general information, notably on factors taken into account for the decision-making process and on their respective weight on an aggregate level, which is also useful for him or her to challenge any ‘decision’ within the meaning of Article 22(1) of the GDPR. (25)
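
Purely by way of illustration of the kind of aggregate explanation contemplated in the preceding point, and without suggesting that any particular agency in fact proceeds in this way, the following sketch shows how a controller might communicate the factors taken into account and their respective weight at an aggregate level without disclosing the underlying formula; the factor names and percentages are invented.

```python
# Hypothetical sketch of 'meaningful information about the logic involved'
# (Article 15(1)(h) GDPR) provided at an aggregate level, without disclosing
# the underlying algorithm. The factor names and shares are invented.

AGGREGATE_WEIGHTS = {
    "payment history (defaults, arrears)": 0.40,
    "existing credit obligations": 0.25,
    "length and stability of credit history": 0.20,
    "other contractual data": 0.15,
}


def describe_scoring_logic():
    """Return a plain-language summary of the factors and their approximate weight."""
    lines = ["Factors taken into account and their approximate weight:"]
    for factor, share in AGGREGATE_WEIGHTS.items():
        lines.append(f"- {factor}: about {share:.0%} of the overall score")
    lines.append("The exact calculation formula is not disclosed (trade secret).")
    return "\n".join(lines)


if __name__ == "__main__":
    print(describe_scoring_logic())
```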

(d)    Answer to the first question referred for a preliminary ruling

59.      In the light of the above considerations, I take the view that Article 22(1) of the GDPR is to be interpreted as meaning that the automated establishment of a probability value concerning the ability of a data subject to service a loan in the future constitutes a decision based solely on automated processing, including profiling, which produces legal effects concerning the data subject or similarly significantly affects him or her, where that value, determined by means of personal data of the data subject, is transmitted by the controller to a third-party controller and the latter, in accordance with consistent practice, draws strongly on that value for its decision on the establishment, implementation or termination of a contractual relationship with the data subject.

2.      The second question referred for a preliminary ruling

60.      Although the second question is asked only if the first question is answered in the negative, it seems that it would also be relevant if the Court were to conclude that scoring is covered by the prohibition of automated decision-making under Article 22(1) of the GDPR. In that case, as the Finnish Government rightly states, it would then be necessary to consider whether an exception to that prohibition is possible under Article 22(2)(b) of the GDPR. Under that provision, the prohibition does not apply ‘if the decision is authorised by Union or Member State law to which the controller is subject’.

61.      That provision confers explicit regulatory latitude on the Member States where the automated profiling to which it refers is based on EU or Member State law. If it is based on Member State law, national law must provide for specific safeguards for the data subject. The Court would then have to determine whether such ‘authorisation’ may be inferred from Article 6 of the GDPR itself or from a national provision adopted on a legal basis provided for therein. This latter scenario would seem to call for close analysis, given that the referring court raises the question of the conformity of Paragraph 31 of the BDSG with Articles 6 and 22 of the GDPR, which presupposes that those articles are capable of constituting appropriate legal bases.

(a)    The existence of a legal basis in the GDPR conferring regulatory powers on Member States

62.      The national court expresses doubts on precisely this point since, according to it, none of the provisions referred to in Articles 6 and 22 of the GDPR can serve as a legal basis for the adoption of a national provision such as Paragraph 31 of the BDSG which lays down certain rules on scoring. The question thus arises whether the national legislature correctly applied the provisions of the GDPR which give it latitude in laying down national rules on processing of personal data under the GDPR or even, in certain circumstances, in derogating from them. The answer to this question proves to be complicated, as the German legislature does not mention any opening clause in the explanatory memorandum for the law. (26) However, it is all the more relevant as Paragraph 31(1)(1) of the BDSG expressly provides that scoring is ‘permissible only if the provisions of data protection law have been complied with’ (my emphasis). As I will explain below, several arguments lead me to answer this question in the negative.

(1)    The applicability of Article 22(2)(b) of the GDPR

63.      Mention should be made at the outset of Article 22(2)(b) of the GDPR, which confers on Member States the power to authorise a decision based solely on automated data processing, including profiling, and which could therefore be invoked as a legal basis. It should be noted, however, that paragraph 2 would not be applicable if the Court were to rule that scoring is not covered by the prohibition under paragraph 1 of Article 22 of the GDPR. As is clear from the wording and the general scheme of Article 22 of the GDPR, the application of paragraph 2 presupposes that the conditions laid down in paragraph 1 are met. Accordingly, it is clear that an application of Article 22(2)(b) of that regulation would have to be ruled out if the first question had to be answered in the negative.

64.      For the sake of completeness and in view of my proposed affirmative answer to the first question, I consider it necessary to examine whether that provision is capable of being applied in the event that the Court were to take the view that the scoring carried out by a credit information agency constitutes a ‘decision’ within the meaning of paragraph 1 of Article 22. However, even in this case, doubts remain, for a number of reasons, as to whether Article 22(2)(b) of the GDPR is capable of serving as a legal basis.

65.      First, Article 22 of the GDPR concerns only decisions based ‘solely’ on automated data processing whereas, according to the referring court, Paragraph 31 of the BDSG also covers, without any differentiation, non-automated decisions, while nevertheless regulating the permissibility of the use of data processing for scoring. In other words, Paragraph 31 of the BDSG has a much broader scope ratione materiae than Article 22 of the GDPR. (27) It is therefore doubtful that Article 22(2)(b) of the GDPR is capable of serving as a legal basis.

66.      Second, as the referring court states, Paragraph 31 of the BDSG regulates the ‘use’ of a probability value by economic actors, but not the ‘establishment’ of that value by credit information agencies, an aspect which is nevertheless the subject of the first question referred for a preliminary ruling. This can also be inferred from the statements made by the German Government at the hearing. As I have explained in some detail in examining that question, Article 22(1) of the GDPR also applies at the stage of the ‘establishment’ of the score and not only at the stage of its ‘use’ by a financial institution, provided certain conditions are met. (28) In other words, it appears that Paragraph 31 of the BDSG seeks to regulate a situation other than the one falling within the scope ratione materiae of Article 22 of the GDPR, which must be confirmed by the referring court. In the light of the above remarks, I consider that Article 22(2)(b) of the GDPR must be ruled out as a legal basis for the adoption of a national legislative measure such as Paragraph 31 of the BDSG.

(2)    The applicability of Article 6(2) and (3) of the GDPR

(i)    Clauses conferring regulatory powers on Member States

67.      Under Article 6(1) of the GDPR, processing of personal data is lawful only if one of the grounds set out therein applies. As the Court has ruled, this is an exhaustive and restrictive list of the cases in which such processing can be regarded as lawful. (29) Therefore, in order to be considered lawful, the establishment of a score by a credit information agency must be covered by one of the cases mentioned in that provision.

68.      In a case such as the one at issue in the main proceedings, points (b), (c) and (f) of Article 6(1) of the GDPR could be applicable in principle. Under paragraph 2 of that article, Member States may maintain or introduce more specific provisions to adapt the application of the rules of the GDPR. However, that provision applies only with regard to processing for compliance with points (c) and (e) of paragraph 1. Similarly, paragraph 3 of Article 6 provides that the basis for the processing is to be laid down by the Member State law to which the controller is subject, provided the processing concerns the cases referred to in points (c) and (e) of paragraph 1.

69.      It follows that Member States may adopt more specific rules where processing is ‘necessary for compliance with a legal obligation to which the controller is subject’ or ‘necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller’. The effect of these conditions is to establish a strict framework for Member States’ regulatory power, thus precluding arbitrary use of the opening clauses laid down in the GDPR, which could frustrate the objective of harmonising the law on protection of personal data.

70.      Furthermore, it should be noted in this context that, in so far as some of those clauses use terminology which is specific to the GDPR without making any express reference to the law of the Member States, the terms used must be given an autonomous and uniform interpretation. (30) It is against this background that it is necessary to ascertain below whether the rules contained in Paragraph 31 of the BDSG are covered by one of the grounds set out in Article 6(1) of the GDPR.

(ii) Lawfulness of processing of data

–       The ground under Article 6(1)(b) of the GDPR

71.      As far as point (b) is concerned, that provision states that processing is lawful only if and in so far as processing is necessary for ‘the performance of a contract to which the data subject is party’ or ‘in order to take steps at the request of the data subject prior to entering into a contract’. It should be pointed out in this regard that services provided by information agencies are used only in exceptional cases at the contract performance stage. The most important phase is the precontractual phase during which credit information is generally obtained. The transmission of an inquiry to a credit information agency for the purposes of a credit check would seem to be authorised under that provision. (31) It should be stated, however, that this provision covers only authorisation for potential contractual partners, creditors and/or providers of legal services to make credit inquiries, and thus establishes the preconditions for the lawful collection of data by credit information agencies. On the other hand, that provision does not appear to be sufficient in itself to serve as a legal basis to legitimise the activities of a credit information agency in general. (32)

72.      It should also be noted that although Article 6(1)(b) of the GDPR codifies a ground for lawfulness of processing of personal data, it does not cover any of the situations provided for in paragraphs 2 and 3, under which Member States enjoy a regulatory power. It follows that, in so far as Article 6 of the GDPR sets out those situations exhaustively, a national provision such as Paragraph 31 of the BDSG cannot be adopted solely on the basis of Article 6(1)(b) of the GDPR.

–       The ground under Article 6(1)(c) of the GDPR

73.      The ground under Article 6(1)(c) of the GDPR concerns processing based on a ‘legal obligation’ to which the controller is subject. This means requirements imposed by the State itself. That provision does not, however, encompass obligations stemming from civil law contracts, for example a contract concluded between a financial institution and a credit information agency. Nevertheless, financial institutions which must satisfy themselves of the creditworthiness of their clients by virtue of their obligations under national law may rely on this legal basis for the purposes of the corresponding inquiries so as to guarantee, from the point of view of a credit information agency, that such inquiries are fully lawful. By contrast, the ‘establishment’ of a score by the credit information agency cannot be considered to be a measure taken in performance of a ‘legal obligation’ imposed on it, given that no such obligation appears to exist in national law. Consequently, point (c) could not be legitimately invoked as a legal basis in order to classify scoring as lawful processing.

–       The ground under Article 6(1)(e) of the GDPR

74.      The question then arises whether Article 6(1)(e) of the GDPR could be invoked as a legal basis for the adoption of Paragraph 31 of the BDSG. This would be the case if processing was ‘necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller’. On the one hand, it could be claimed, as is evident from the legislative objective of Paragraph 31 of the BDSG, as reflected in the title of that national provision (‘Protection of trade and commerce in the context of scoring and credit reports’), and from the travaux préparatoires, (33) that credit information agencies contribute to the smooth functioning of a country’s economy. (34)

75.      In so far as those agencies provide information on the creditworthiness of specified persons, they help to protect consumers by avoiding the risk of over-indebtedness, (35) but also undertakings which sell them goods or grant them credit. Such agencies ensure the stability of the financial system by preventing credit being granted irresponsibly to borrowers posing high risks of default. (36) Without a sound credit assessment system, a large section of the population would be practically prevented from obtaining loans on account of the incalculable risks, economic transactions in the information age would be much more difficult and attempted fraud would go undetected. With this in mind, it is possible to concur with the reasons which seem to have led the German legislature to adopt Paragraph 31 of the BDSG.

76.      On the other hand, even though legal persons governed by private law can certainly act in the public interest, it seems clear that not just any ‘legitimate interest’ can justify the application of Article 6(1)(e) of the GDPR. The link to the ‘exercise of official authority’ and recitals 45, 55 and 56 of the GDPR suggest that this provision refers, in the first place, to public authorities in the strict sense and to legal persons vested with a measure of official authority and, in the second place, to legal persons governed by private law carrying out processing for public service purposes, for example in ‘public health’, ‘social protection’ and the ‘management of health care services’, which are expressly mentioned in recital 45. In other words, that provision concerns traditional tasks of the State.

77.      Similarly, recitals 55 and 56 of the GDPR refer to ‘officially recognised religious associations’ and to ‘political parties’, that is to say, organisations which, according to the criteria applied by the EU legislature, pursue activities in the public interest and carry out processing of personal data to that end. In the light of this, it is doubtful that this provision could also encompass the activities of credit information agencies, including scoring. Such an interpretation would broaden the scope of the provision considerably and make it particularly difficult to identify the limits of the opening clause. (37)

78.      Aside from the above considerations, it should be noted in this context that although the aim of Paragraph 31 of the BDSG is to protect trade and commerce, that provision does not mention any specific tasks of those agencies. (38) As I have already stated in this Opinion with reference to the clarifications given by the referring court regarding the national legal framework, that provision concerns the ‘use’ of the score by economic actors and not its ‘establishment’ by credit information agencies. (39) It is the lawfulness of this activity that lies at the heart of the dispute in the main proceedings. For the reasons set out above, I consider that Article 6(1)(e) of the GDPR is not an appropriate legal basis.

–       The ground under point (f) of the first subparagraph of Article 6(1) of the GDPR

79.      It should then be examined whether that activity falls within the scope of point (f) of the first subparagraph of Article 6(1) of the GDPR. According to the Court’s case-law, (40) the provision in question lays down three cumulative conditions in order for the processing of personal data to be lawful, namely, first, the pursuit of a legitimate interest by the data controller or by the third party or third parties to which the data are communicated; second, the need to process personal data for the purposes of the legitimate interests pursued; and third, that the fundamental rights and freedoms of the person concerned by the data protection do not take precedence over that legitimate interest.

80.      First, with regard to the pursuit of a ‘legitimate interest’, I would point out that the GDPR and the case-law recognise a wide range of interests considered legitimate, (41) while specifying that, according to Article 13(1)(d) of the GDPR, it is the responsibility of the controller to indicate the legitimate interests pursued under point (f) of the first subparagraph of Article 6(1) of the GDPR. As I have already stated in this Opinion, the legislative objective of Paragraph 31 of the BDSG is to ensure the lawfulness of activities carried out by credit information agencies given that, in the view of the German legislature, they contribute to the smooth functioning of a country’s economy. (42) In so far as those activities guarantee the protection of various economic actors against the risks inherent in insolvency, with serious consequences for the stability of the financial system, it can be stated at this point in the analysis that the abovementioned national provision pursues an economic objective which is capable of constituting a ‘legitimate interest’ within the meaning of point (f) of the first subparagraph of Article 6(1) of the GDPR.

81.      Next, as regards the condition relating to the need to process personal data for the purposes of the legitimate interests pursued, according to the case-law of the Court of Justice, derogations and limitations in relation to the protection of personal data must apply only in so far as is strictly necessary. (43) A close link must therefore exist between the processing and the interest pursued, and there must be no alternatives that are more data-protection friendly, since it is not enough for the processing merely to be of use to the controller. It should be noted in this regard that, while the referring court expresses some doubts as to the lawfulness of scoring in the light of the provisions of the GDPR, it does not provide any information to suggest the possible existence of alternatives that are more data-protection friendly. In the absence of information to the contrary, I am inclined to recognise a degree of latitude in the choice of appropriate measures to attain the objective pursued.

82.      As regards, lastly, the balancing of the interests of the controller and the interests or fundamental rights and freedoms of the data subject, it must be stated that the various interests at stake were balanced in this case through legislation. In adopting Paragraph 31 of the BDSG, the German legislature allowed economic interests to take precedence over the right to protection of personal data. That approach would be possible only if point (f) of the first subparagraph of Article 6(1) of the GDPR laid down a clause permitting Member States to maintain or introduce more specific provisions to adapt the rules of that regulation with regard to processing. That is not the case, however, as I will explain below.

83.      As is clear from the wording of Article 6(2) and (3) of the GDPR, the maintenance or introduction of more specific provisions is permitted only for the situations referred to in points (c) and (e) of paragraph 1. The above analysis showed that Paragraph 31 of the BDSG does not cover any of the circumstances which might come under those situations, which logically means that Article 6(2) and (3) of the GDPR cannot be invoked as a legal basis. Applying those provisions to the situation referred to in point (f) of paragraph 1 would not only be contrary to their wording, but would also run counter to the will of the EU legislature, as can be inferred from the drafting history of those provisions.

84.      I would like to point out in this regard that under Article 5 of Directive 95/46/EC (44) – the legal act preceding the GDPR – it was for the Member States to ‘determine more precisely the conditions under which the processing of personal data [was] lawful’. The Court interpreted that provision by concluding that there was nothing to preclude Member States, in the exercise of their discretion laid down in Article 5 of Directive 95/46, from establishing ‘guidelines’ in respect of the balancing which was necessary pursuant to Article 7(f) of that directive, which corresponds to point (f) of the first subparagraph of Article 6(1) of the GDPR. It should be noted, however, that the GDPR no longer confers any such power on Member States. The fact that there is no equivalent provision in the GDPR means that Member States may no longer establish guidelines in their national law in order to specify the ‘legitimate interest’ within the meaning of point (f) of the first subparagraph of Article 6(1) of that regulation. (45)

85.      The effect of the reference to ‘provisions relating to specific processing situations’ in Chapter IX, contained in paragraphs 2 and 3, is not to broaden the scope of Article 6 of the GDPR. Rather it is a reference to provisions authorising Member States to adopt more specific rules in defined areas, namely where the processing is necessary for compliance with a ‘legal obligation’ in accordance with point (c) of paragraph 1 or for the performance of a ‘task carried out in the public interest or in the exercise of official authority’ in the case referred to in point (e) of that paragraph. (46) As I have already stated, however, these areas have no connection with the circumstances in which Paragraph 31 of the BDSG applies.

86.      It should also be noted in this connection that although the Commission’s proposal for a regulation conferred on it the power to ‘adopt delegated acts … for the purpose of further specifying the conditions referred to in [point (f) of the first subparagraph of Article 6(1)] for various sectors and data processing situations, including as regards the processing of personal data related to a child’, that proposal was not adopted by the EU legislature. An analysis of the development of the text shows that Member States’ regulatory powers were reduced in the interest of further harmonisation with a view to ensuring a consistent and homogeneous application of the rules on protection of personal data, as is confirmed by recitals 3, 9 and 10 of the GDPR. (47) This must be taken into account in interpreting Article 6 of the GDPR.

87.      Lastly, in my view, even if point (f) of the first subparagraph of Article 6(1) of the GDPR were applicable, a national provision such as Paragraph 31 of the BDSG cannot be considered to be consistent with EU law. I should point out that the Court has interpreted Article 7(f) of Directive 95/46 as meaning that ‘Member States cannot definitively prescribe, for certain categories of personal data, the result of the balancing of the opposing rights and interests, without allowing a different result by virtue of the particular circumstances of an individual case’. (48) Since that provision was drafted with almost identical wording to the provision which replaced it, namely point (f) of the first subparagraph of Article 6(1) of the GDPR, this interpretation would still appear to be valid. (49) As the referring court suggests, it seems that the national legislature had precisely this aim in view given that, in so far as Paragraph 31 of the BDSG authorises the use of scores in the financial sector, the economic interests of the financial sector are given precedence over the right to protection of personal data, without account being taken of the particular circumstances of the individual case. Such an approach would effectively broaden the scope of Article 6 of the GDPR to an unacceptable degree.

88.      In the light of the foregoing, I consider that point (f) of the first subparagraph of Article 6(1) of the GDPR cannot be legitimately invoked as a legal basis for the adoption of a national provision such as Paragraph 31 of the BDSG.

–       The ground under Article 6(4) in conjunction with Article 23(1) of the GDPR

89.      The referring court states that the combined provisions of Article 6(4) and Article 23(1) of the GDPR were invoked as legal bases in the legislative procedure which led to the adoption of Paragraph 31 of the BDSG. However, the idea of relying on those provisions was subsequently abandoned. The referring court takes the view that those provisions are not applicable in this case.

90.      It is not possible, in the absence of more detailed information, to take a view on the potential applicability of those provisions. Nor does there seem to be any need to do so if they did not play any role during the legislative procedure, as the referring court asserts. (50)

(iii) Interim conclusion

91.      In the preceding points I examined whether Articles 6 and 22 of the GDPR can serve as a legal basis for the adoption of a national provision such as Paragraph 31 of the BDSG in order to justify the lawfulness of scoring as part of the activities carried out by credit information agencies. A number of reasons, explained in detail in my analysis, reinforce my conviction that this possibility must be ruled out. In short, I consider that, in the absence of opening clauses or exemptions authorising Member States to adopt more precise rules or to derogate from the rules of the GDPR in order to regulate the abovementioned activity, and in view of the degree of harmonisation pursued by that regulation, which under Article 288 TFEU is binding in its entirety and directly applicable in all Member States, that national provision must be considered to be inconsistent with the GDPR.

92.      Because the Court does not have the competence to interpret national law or to rule on its consistency with EU law in preliminary ruling proceedings under Article 267 TFEU, the arguments addressed in this Opinion must be understood as guidance for the interpretation of the relevant provisions of the GDPR with a view to enabling the referring court, if appropriate, to exercise that competence after having itself examined Paragraph 31 of the BDSG in the light of the provisions of that regulation, in particular having regard to the possibility of interpreting the national legislation in compliance with the requirements of EU law.

93.      The principle of the primacy of EU law establishes the pre-eminence of EU law over the law of the Member States. That principle therefore requires all Member State bodies to give full effect to the various provisions of EU law, and the law of the Member States may not undermine the effect accorded to those various provisions in the territory of those States. In the light of that principle, where it is unable to interpret national legislation in compliance with the requirements of EU law, the national court which is called upon within the exercise of its jurisdiction to apply provisions of EU law is under a duty to give full effect to those provisions, if necessary refusing of its own motion to apply any conflicting provision of national legislation, even if adopted subsequently, and it is not necessary for that court to request or await the prior setting aside of such provision by legislative or other constitutional means. (51)

(b)    Answer to the second question referred for a preliminary ruling

94.      In answer to the second question referred for a preliminary ruling, I consider that Article 6(1) and Article 22 of the GDPR must be interpreted as not precluding national legislation on profiling where such profiling is not covered by Article 22(1) of that regulation. However, in that case the national legislation must comply with the conditions laid down in Article 6 thereof. In particular, it must have an appropriate legal basis, which must be verified by the referring court.

VI.    Conclusion

95.      In the light of the above considerations, I propose that the Court answer the questions referred for a preliminary ruling by the Verwaltungsgericht Wiesbaden (Administrative Court, Wiesbaden, Germany) as follows:

(1)      Article 22(1) of Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation)

must be interpreted as meaning that the automated establishment of a probability value concerning the ability of a data subject to service a loan in the future constitutes a decision based solely on automated processing, including profiling, which produces legal effects concerning the data subject or similarly significantly affects him or her, where that value, determined by means of personal data of the data subject, is transmitted by the controller to a third-party controller and the latter, in accordance with consistent practice, draws strongly on that value for its decision on the establishment, implementation or termination of a contractual relationship with the data subject.

(2)      Article 6(1) and Article 22 of Regulation 2016/679

must be interpreted as not precluding national legislation on profiling where such profiling is not covered by Article 22(1) of that regulation. However, in that case the national legislation must comply with the conditions laid down in Article 6 thereof. In particular, it must have an appropriate legal basis, which must be verified by the referring court.


1      Original language: French.


2      OJ 2016 L 119, p. 1.


3      BGBl. 2017 I, p. 2097.


4      BGBl. 2019 I, p. 1626.


5      These are specifically Articles 18 and 21 of Directive 2014/17/EU of the European Parliament and of the Council of 4 February 2014 on credit agreements for consumers relating to residential immovable property and amending Directives 2008/48/EC and 2013/36/EU and Regulation (EU) No 1093/2010 (OJ 2014 L 60, p. 34), and Articles 8 and 9 of Directive 2008/48/EC of the European Parliament and of the Council of 23 April 2008 on credit agreements for consumers and repealing Council Directive 87/102/EEC (OJ 2008 L 133, p. 66).


6      According to the ‘Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679’, adopted on 3 October 2017 by the Article 29 Data Protection Working Party, profiling and automated decision-making can pose significant risks for individuals’ rights and freedoms. In particular, profiling can perpetuate existing stereotypes and social segregation. Furthermore, in so far as data subjects may be limited in their freedom to choose certain products or services, profiling can lead to denial of goods and services and unjustified discrimination.


7      See judgment of 28 April 2022, Meta Platforms Ireland (C‑319/20, EU:C:2022:322, paragraphs 57 and 60).


8      See judgment of 28 October 2020, Pegaso and Sistemi di Sicurezza (C‑521/18, EU:C:2020:867, paragraphs 26 and 27).


9      See Opinion of Advocate General Richard de la Tour in Nemzeti Adatvédelmi és Információszabadság Hatóság (C‑132/21, EU:C:2022:661, point 43 et seq.).


10      See judgment of 22 June 2010, Melki and Abdeli (C‑188/10 and C‑189/10, EU:C:2010:363, paragraph 45).


11      See judgment of 12 January 2023, Nemzeti Adatvédelmi és Információszabadság Hatóság (C‑132/21, EU:C:2023:2, paragraph 57).


12      See the Spanish (‘tratamiento automatizado, incluida la elaboración de perfiles’), Danish (‘automatisk behandling, herunder profilering’), German (‘einer automatisierten Verarbeitung – einschließlich Profiling’), Estonian (‘automatiseeritud töötlusel, sealhulgas profiilianalüüsil’), English (‘automated processing, including profiling’), French (‘un traitement automatisé, y compris le profilage’) and Polish (‘zautomatyzowanym przetwarzaniu, w tym profilowaniu’) versions (my emphasis).


13      See, to that effect, Bygrave, L.A., ‘Article 22. Automated individual decision-making, including profiling’, The EU General Data Protection Regulation (GDPR), Kuner, C., Bygrave, L.A., Docksey, C. (eds.), Oxford, 2020, p. 532.


14      Abel, R., ‘Automatisierte Entscheidungen im Einzelfall gem. Art. 22 DS-GVO – Anwendungsbereich und Grenzen im nicht-öffentlichen Bereich’, Zeitschrift für Datenschutz, 7/2018, p. 307, considers that this provision relates to ‘decisions’ which have an impact on the legal situation of data subjects or cause lasting disruption to their economic or personal development.


15      Helfrich, M. and Sydow, G., DS-GVO/BDSG, 2nd edition, Baden-Baden, 2018, Article 22, paragraph 51, consider that it is crucial to ascertain whether the exercise of rights and freedoms by data subjects is affected, for example, by the consequences of an observation and an evaluation which could affect them significantly in the development of their personality.


16      Bernhardt, U., Ruhrman, I., Schuler, K. and Weichert, T., ‘Evaluation der Europäischen Datenschutz-Grundverordnung’, version of 18 July 2019, Netzwerk Datenschutzexpertise, p. 7, consider that the algorithms used in profiling have considerable potential for discrimination and may cause damage, which is why the authors take the view that it must be specified that all forms of comprehensive and complex profiling are covered by the prohibition mentioned in Article 22(1) of the GDPR.


17      See, to that effect, Sydow, G. and Marsch, N., DS-GVO/BDSG, 3rd edition, Baden-Baden, 2022, Paragraph 31 of the BDSG, point 5, who consider that scoring is capable of affecting data subjects significantly and similarly to a decision which produces legal effects.


18      See the Spanish (‘únicamente’), Danish (‘alene’), German (‘ausschließlich’), Estonian (‘üksnes’), English (‘solely’), French (‘exclusivement’) and Polish (‘wyłącznie’) versions.


19      Such an approach seems all the more necessary because at the hearing neither SCHUFA nor the HBDI were able to give a clear answer to the question whether scores tend to predetermine the decisions of financial institutions. However, SCHUFA’s representative stated that financial institutions benefit from the experience and expertise of credit information agencies in establishing the creditworthiness of a natural person, which could, in principle, be seen to indicate a significant influence on the decision-making process.


20      Blasek, K., ‘Auskunfteiwesen und Kredit-Scoring in unruhigem Fahrwasser – Ein Spagat zwischen Individualschutz und Rechtssicherheit’, Zeitschrift für Datenschutz, 8/2022, pp. 436 and 438, considers that an application of Article 22(1) of the GDPR cannot be ruled out where bank employees do not question automated evaluations (profiling, scores) carried out by credit information agencies. According to the author, banks should not rely solely on that external information, but should instead carry out appropriate verifications themselves.


21      See, to that effect, Horstmann, J. and Dalmer, S., ‘Automatisierte Kreditwürdigkeitsprüfung – Externes Kreditscoring im Lichte des Verbots automatisierter Einzelfallentscheidungen’, Zeitschrift für Datenschutz, 5/2022, p. 263.


22      My emphasis.


23      My emphasis.


24      See Zanfir-Fortuna, G., ‘Article 15. Right of access by the data subject’, The EU General Data Protection Regulation (GDPR), Kuner, C., Bygrave, L.A. and Docksey, C. (eds.), Oxford, 2020, p. 463.


25      See, to that effect, ‘Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation (EU) 2016/679’, adopted on 3 October 2017 by the Article 29 Data Protection Working Party, pp. 28 and 30.


26      See ‘Entwurf eines Gesetzes zur Anpassung des Datenschutzrechts an die Verordnung (EU) 2016/679 und zur Umsetzung der Richtlinie (EU) 2016/680 (Datenschutz-Anpassungs- und Umsetzungsgesetz EU – DSAnpUG-EU)’, Bundesrat – Drucksache 110/17, 2.2.2017, pp. 101 and 102; Abel, R., ‘Einmeldung und Auskunfteitätigkeit nach DS-GVO und § 31 BDSG – Frage der Rechtssicherheit im neuen Recht’, Zeitschrift für Datenschutz, 3/2018, p. 105, criticises the draft law, which does not indicate the opening clause on which Paragraph 31 of the BDSG is based, and expresses doubts whether that provision complies with EU law.


27      See, to that effect, Horstmann, J. and Dalmer, S., ‘Automatisierte Kreditwürdigkeitsprüfung – Externes Kreditscoring im Lichte des Verbots automatisierter Einzelfallentscheidungen’, Zeitschrift für Datenschutz, 5/2022, p. 265.


28      See point 44 of this Opinion.


29      Judgment of 22 June 2021, Latvijas Republikas Saeima (Penalty points) (C‑439/19, EU:C:2021:504, paragraph 99).


30      Judgment of 22 June 2021, Latvijas Republikas Saeima (Penalty points) (C‑439/19, EU:C:2021:504, paragraph 81).


31      See, to that effect, von Lewinski, K. and Pohl, D., ‘Auskunfteien nach der europäischen Datenschutzreform – Brüche und Kontinuitäten der Rechtslage’, Zeitschrift für Datenschutz, 1/2018, p. 19.


32      See, to that effect, Abel, R., ‘Einmeldung und Auskunfteitätigkeit nach DS-GVO und § 31 BDSG – Frage der Rechtssicherheit im neuen Recht’, Zeitschrift für Datenschutz, 3/2018, p. 106.


33      See ‘Entwurf eines Gesetzes zur Anpassung des Datenschutzrechts an die Verordnung (EU) 2016/679 und zur Umsetzung der Richtlinie (EU) 2016/680 (Datenschutz-Anpassungs- und Umsetzungsgesetz EU – DSAnpUG-EU)’, Bundesrat – Drucksache 110/17, 2.2.2017, pp. 101 and 102. At the hearing, the German Government confirmed that this was indeed the legislative objective of Paragraph 31 of the BDSG.


34      See, in this regard, Guggenberger, N. and Sydow, G., Bundesdatenschutzgesetz, 1st edition, Baden-Baden, 2020, paragraph 31, points 2 and 5.


35      See judgment of 27 March 2014, LCL Le Crédit Lyonnais (C‑565/12, EU:C:2014:190, paragraphs 40 and 42), concerning the creditor’s obligation under Article 8(1) of Directive 2008/48 to assess the consumer’s creditworthiness before the conclusion of a credit agreement, including on the basis of a consultation of the relevant database. According to the Court, that obligation, prior to conclusion of the agreement, is intended to protect consumers against the risks of over-indebtedness and bankruptcy, ensuring a high level of protection of their interests and facilitating the emergence of a well-functioning internal market in consumer credit.


36      See judgment of 6 June 2019, Schyns (C‑58/18, EU:C:2019:467, paragraphs 45 and 46), in which the Court ruled that the obligation on the creditor under Article 18(5)(a) of Directive 2014/17 to check the consumer’s creditworthiness before making credit available to him is intended to prevent irresponsible behaviour by market participants which can undermine the foundations of the financial system.


37      See, to that effect, Sydow, G. and Marsch, N., DS-GVO/BDSG, 3rd edition, Baden-Baden, 2022, paragraph 31 of the BDSG, point 6; and Abel, R., ‘Einmeldung und Auskunfteitätigkeit nach DS-GVO und § 31 BDSG – Frage der Rechtssicherheit im neuen Recht’, Zeitschrift für Datenschutz, 3/2018, p. 105.


38      See, in this regard, Guggenberger, N. and Sydow, G., Bundesdatenschutzgesetz, 1st edition, Baden-Baden, 2020, paragraph 31, point 5.


39      See point 66 of this Opinion.


40      Judgment of 17 June 2021, M.I.C.M. (C‑597/19, EU:C:2021:492, paragraph 106).


41      See, in that regard, Opinion of Advocate General Rantos in Meta Platforms and Others (General terms of use of a social network) (C‑252/21, EU:C:2022:704, point 60).


42      See point 74 of this Opinion.


43      See judgments of 4 May 2017, Rīgas satiksme (C‑13/16, EU:C:2017:336, paragraph 30), and of 17 June 2021, M.I.C.M. (C‑597/19, EU:C:2021:492, paragraph 110).


44      Directive of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data (OJ 1995 L 281, p. 31).


45      See, to that effect, Heberlein, H., DS-GVO – Kommentar, Munich, 2017, Article 6, paragraphs 28 and 32.


46      See, to that effect, Heberlein, H., op. cit., paragraph 32; Roßnagel, A., Datenschutzrecht,  Simitis, S., Hornung, G. and Spiecker, I. (eds.), Munich, 2019, Article 6, paragraph 23.


47      See judgment of 22 June 2022, Leistritz (C‑534/20, EU:C:2022:495, paragraph 26).


48      See judgment of 19 October 2016, Breyer (C‑582/14, EU:C:2016:779, paragraph 62).


49      See judgment of 1 August 2022, Vyriausioji tarnybinės etikos komisija (C‑184/20, EU:C:2022:601, paragraph 66), in which the Court gave a uniform interpretation to certain provisions of Directive 95/46 and of the GDPR.


50      Guggenberger, N. and Sydow, G., Bundesdatenschutzgesetz, 1st edition, Baden-Baden, 2020, Paragraph 31, point 6, confirm the referring court’s assessment that those provisions are irrelevant in the choice of a legal basis for the adoption of Paragraph 31 of the BDSG.


51      Judgment of 21 June 2022, Ligue des droits humains (C‑817/19, EU:C:2022:491, paragraph 293).