Data Protection Insider, Issue 56

AG Bobek Reads the Judicial Capacity Exception Broadly

On 6th October, AG Bobek delivered his Opinion in the case of X, Z v Autoriteit Persoonsgegevens. As to the facts of the case: as a standard practice in the Netherlands, journalists are granted access to certain judicial documents before the hearing of a court case in order to be able to report on it better. The applicants in the present case, whose case was heard before the Dutch courts and the relevant materials disclosed to journalists, submitted a complaint to the Dutch Supervisory Authority (SA), claiming, amongst other things, that they had not consented to the disclosure. The SA did not consider itself competent to examine the complaint, as it concluded that the processing at issue was carried out in the national courts’ ‘judicial capacity’, pursuant to Article 55(3) of the GDPR. The national courts, before which the decision was appealed, referred a question to the CJEU asking whether this is indeed the case. Based on the CJEU’s case law, AG Bobek noted that the disclosure in question does indeed constitute a personal data processing operation, which is in principle subject to the rules of the GDPR. However, he argued that it could be exempt from supervision by the SAs set up under the GDPR, pursuant to Article 55(3), because the disclosure should be considered to be performed in the courts’ judicial capacity. AG Bobek gave a very broad reading of the scope of ‘judicial capacity’, arguing that a data processing operation does not need to prejudice the independence of the courts for it to fall within this exception, but that one ‘should employ a broad interpretation of the concept of “judicial capacity” that goes beyond mere judicial decision-making in an individual case. It must also cover all activities that may indirectly impact upon the judicial independence of the courts.’ As such, courts should, by default, be considered to be acting in a ‘judicial capacity’ unless it is established, as regards a specific type of activity, that it is of an administrative nature only.
In addition, he argued that it is enough that there is a general legal basis which allows the disclosure, and that it is not necessary for the proportionality of each disclosure to be examined in advance. We find that the Opinion makes for very interesting reading for all those trying to define the scope and limits of EU data protection law and its role within the broader system of other laws and social norms.


ECtHR Rules on Secret Surveillance in Gladkiy and Others v. Russia

On 30th September, the European Court of Human Rights ruled in the case of Gladkiy and Others v. Russia. The case was decided by a Committee. It concerned the secret surveillance of the applicants by law enforcement agencies, which had suspected the applicants of engaging in smuggling activities. The applicants complained on the basis of Article 8 – the right to respect for private and family life – and Article 13 – the right to an effective remedy – of the ECHR that the secret surveillance constituted a violation of their rights. With regard to Article 8, the Court found a violation. The Court highlighted that: ‘measures aimed at interception of telephone communications amounted to an interference with the exercise of the rights set out in Article 8…Such interference will give rise to a breach of the Convention unless it can be shown that it was “in accordance with law”, pursued one or more legitimate aim or aims…and was “necessary in a democratic society” to achieve those aims’. After recalling comparable cases, the Court found: ‘There is no evidence that any information or document confirming the suspicion against the applicants was submitted to the courts which authorised interception of the applicants’ telephone conversations. Nor is there any indication that those courts applied the test of “necessity in a democratic society”, and in particular assessed whether the surveillance measures carried out against the applicants were proportionate to any legitimate aim pursued. These complaints are therefore admissible and disclose a breach of Article 8 of the Convention.’ The case is short and easy to read but – as is to be expected from a Committee ruling – predominantly reiterates previous considerations concerning secret surveillance measures and comes to a predictable conclusion.


EP Resolution to Ban Mass (Biometric) Surveillance

On 6th October, the European Parliament adopted a resolution in which MEPs called for strong safeguards in relation to AI technologies – in particular against discrimination – especially in the law enforcement and border control contexts. In addition, ‘[t]o ensure that fundamental rights are upheld when using these technologies, algorithms should be transparent, traceable and sufficiently documented, MEPs ask. Where possible, public authorities should use open-source software in order to be more transparent.’ Most importantly, the Resolution goes as far as to call for an explicit ban on ‘the automated recognition of individuals in public spaces’, ‘the use of private facial recognition databases (like the Clearview AI system, which is already in use) and predictive policing based on behavioural data’, and ‘social scoring systems, which try to rate the trustworthiness of citizens based on their behaviour or personality.’ Finally, the Resolution calls for border control projects like iBorderCtrl, and technologies for the remote identification of travellers, to be discontinued.


Council Agrees Position on Data Governance Act

On 1st October, the Council announced that the Member States had agreed a position on the Data Governance Act. In principle, the Act aims to ‘set up solid mechanisms to facilitate the reuse of certain categories of protected public-sector data, increase trust in data intermediation services and promote data altruism across the EU.’ The Council position includes a number of deviations from the initial proposal adopted by the Commission. The Council position, for example, suggests: ‘introduc[ing] more flexibility in the text and tak[ing] account of national specificities that already exist in some member states [in relation to the reuse of public data]’, ‘clarif[ying] the scope of [data intermediation] provisions, in particular to indicate more clearly which types of companies can be data intermediaries’ and ‘add[ing] compliance with a code of conduct as a requirement for registration as a recognised data altruism organisation.’ The adoption of the position will now allow the Council to negotiate with the European Parliament towards a final text.


EDPB Adopts Opinion on Adequacy in relation to the Republic of Korea

On 24th September 2021, the EDPB adopted ‘Opinion 32/2021 regarding the European Commission Draft Implementing Decision pursuant to Regulation (EU) 2016/679 on the adequate protection of personal data in the Republic of Korea Version 1.0’. The Opinion highlights positive aspects. For example, the EDPB observes: ‘With regards to the content, the EDPB notes key areas of alignment between the GDPR framework and the Korean data protection framework with regard to certain core provisions such as, for example, concepts (e.g., “personal information”, “processing”, “data subject”); grounds for lawful and fair processing for legitimate purposes; purpose limitation; data quality and proportionality; data retention, security and confidentiality; transparency; and special categories of data.’ The EDPB, however, also highlights a number of challenges related to the Draft Decision. These include, for instance: the legal status of Korean administrative rules bridging the gap between the GDPR and the Korean framework; the scope and function of the concept of pseudonymised data in the Korean framework; the scope of the right to withdraw consent under the Korean framework; and the scope of the obligation to inform data subjects in relation to telecommunications companies’ disclosures of personal data to national security authorities.


CNIL Issues Injunctions against French Fingerprint Database

On 30th September, the CNIL concluded that the French computerised fingerprint and palmprint database (FAED), maintained by the French Interior Ministry, suffers from ‘apparent illegal storage of data, poor file management, and a lack of information provided to persons whose data is kept on the system’. More precisely, the database contained more personal data than envisaged in law and included data of persons who are no longer suspects. More than 2 million records appear to have been kept beyond the legally allowed storage limits. In addition, a large number of records were stored in paper form and had not yet been digitised. Finally, the security features of the system were considered to be very poor, and individuals were not informed that their data were stored on the FAED, in breach of French law. The Interior Ministry must now bring the FAED into compliance with data protection law. However, it will not be fined, as the CNIL may not fine State institutions; it can only issue injunctions.


DPI Editorial Team

Dara Hallinan, Editor: Legal academic working at FIZ Karlsruhe. His specific focus is on the interaction between law, new technologies – particularly ICT and biotech – and society. He studied law in the UK and Germany, completed a Master’s in Human Rights and Democracy in Italy and Estonia and wrote his PhD at the Vrije Universiteit Brussel on the better regulation of genetic privacy in biobanks and genomic research through data protection law. He is also programme director for the annual Computers, Privacy and Data Protection conference.

Diana Dimitrova, Editor: Researcher at FIZ Karlsruhe. Focus on privacy and data protection, especially on rights of data subjects in the Area of Freedom, Security and Justice. Completed her PhD at the VUB on the topic of ‘Data Subject Rights: The rights of access and rectification in the AFSJ’. Previously, legal researcher at KU Leuven and trainee at EDPS. Holds LL.M. in European Law from Leiden University.
