“The Court considers that it is contrary to the GDPR for private agencies to keep such data for longer than the public insolvency register,” it wrote in a press release on case C-634/21 (plus joined cases C-26/22 and C-64/22). “The discharge from remaining debts is intended to allow the data subject to re-enter economic life and is therefore of existential importance to that person. That information is still used as a negative factor when assessing the solvency of the data subject. In this case, the German legislature has provided for data to be stored for six months. It therefore considers that, at the end of the six months, the rights and interests of the data subject take precedence over those of the public to have access to that information.”

“In so far as the retention of data is unlawful, as is the case beyond six months, the data subject has the right to have the data deleted and the agency is obliged to delete the data as soon as possible,” the court added.

The CJEU also ruled on a second complaint that looks rather existential for credit scoring companies, as it questions whether Schufa can automatically issue credit scores at all, given the GDPR provides protections for individuals subject to solely automated decisions with legal or similarly significant impacts on them. So, essentially, such companies may need to obtain people’s explicit consent to being credit scored.

The Court held that Schufa’s credit scoring must be regarded as an “automated individual decision”, which its press release notes is “prohibited in principle by the GDPR, in so far as Schufa’s clients, such as banks, attribute to it a determining role in the granting of credit”.

If this kind of credit scoring is the basis for a decision by a bank, for instance, to deny an individual credit, the practice risks falling foul of EU data protection rules.

Though in this specific case it will be up to the Administrative Court of Wiesbaden to appraise whether the German Federal Law on data protection contains a valid exception to the prohibition in accordance with the GDPR. And, if that’s so, to check whether the general conditions laid down by the GDPR for data processing have been met — such as ensuring individuals are aware of their right to object and to ask for (and get) human intervention, as well as being able to furnish meaningful information about the logic of the credit scoring on request.

‘Judicial review’ of DPA decisions

In another significant ruling, the CJEU also made it clear national courts must be able to exercise what its PR calls “full review” over any legally binding decision of a data protection authority.

Privacy rights group noyb, which has had multiple run-ins with DPAs over their failure to act on (let alone enforce) complaints, seized on this as especially significant — dubbing it “full judicial review” of DPAs.

“The CJEU ruling massively increased the pressure on DPAs. In some EU member states, including Germany, they have so far assumed that a GDPR complaint from data subjects is merely a kind of ‘petition’. In practice, this has meant that despite an annual budget of €100M the German DPAs have rejected many complaints with bizarre justifications and GDPR violations have not been pursued. In countries such as Ireland, more than 99% of complaints were not processed and in France any right of those affected to take part in the procedure concerning their own rights was denied. Some DPAs, such as the Hessian authority in the present case, have also argued that the courts are prohibited from reviewing their decisions in detail,” it wrote in a press release responding to the ruling.

“The CJEU has now put an end to this approach. It has ruled that Article 77 of the GDPR is designed as a mechanism to effectively safeguard the rights and interests of data subjects. In addition, the court has ruled that the Article 78 of the GDPR allows national courts to carry out a full review of DPA decisions. This includes the assessment whether the authorities have acted within the limits of their discretion.”

Higher GDPR fines on the way too?

The pair of significant rulings follow another handed down by the CJEU yesterday (also stemming, in part, from another German case referral), which legal experts suggest could result in significantly higher penalties for breaches of the GDPR, as it lowers the requirements for imposing fines on legal entities.

So while, in this case (C-807/21), the Court held that wrongful conduct is necessary for a fine to be imposed — i.e. that a breach of the GDPR must have been committed “intentionally or negligently” — judges also said that, where a controller is a legal person, it is not necessary for the infringement to have been committed by its management body; nor is it necessary for that body to have had knowledge of that infringement.

They further stipulated that the calculation of any fine requires the supervisory authority to take as its basis the concept of “an ‘undertaking’ under competition law”. (Aka, per the Court PR, that “the maximum amount of the fine must be calculated on the basis of a percentage of the total worldwide annual turnover of the undertaking concerned, taken as a whole, in the preceding business year” — or, basically, that the revenue of an entire group of companies may be used to compute a GDPR penalty for an infringement committed by a single unit of that group.)

Jan Spittka, partner at law firm Clyde & Co, predicted beefier GDPR fines could result. “The overall context of the decision will make it way easier for the data protection supervisory authorities of the EU member states to sanction legal entities and is also likely to result in significantly higher fines on average,” he suggested in a statement.

“Against the background of this standard only a detailed and strictly monitored data protection compliance system may put a legal entity in a position to argue that it was unaware of the unlawfulness of its conduct with regard to GDPR infringements committed by an employee,” he also said. “Furthermore, a legal entity may exculpate itself if representatives or employees act totally out of the scope of their job description, e. g. when misusing personal data for private purposes.”
