– EDPB Guidelines on Calculation of Administrative Fines –
On 12th May 2022, the EDPB published its ‘Guidelines 04/2022 on the calculation of administrative fines under the GDPR’. With the Guidelines, the EDPB aims ‘to harmonise the methodology supervisory authorities use when calculating the amount of the fine’. The EDPB has thus ‘devised [a] methodology, consisting of five steps, for calculating administrative fines for infringements of the GDPR’, which it elaborates in the Guidelines. In terms of substance, the Guidelines cover the following issues: ‘Methodology for Calculating the Amount of the Fine’; ‘Concurrent Infringements and the Application of Article 83(3) GDPR’; ‘Starting Point for Calculation’; ‘Aggravating and Mitigating Circumstances’; ‘Legal Maximum and Corporate Liability’; ‘Effectiveness, Proportionality and Dissuasiveness’; and ‘Flexibility and Regular Evaluation’. The Guidelines provide an extensive elaboration of one of the most significant aspects of the GDPR. They take on particular significance in light of the variation in the levels of fines between Member States thus far. It will be interesting to see the extent to which the Guidelines are followed and what impact this has on fining practices. The Guidelines are now open for comments, which ‘should be sent by 27th June 2022 at the latest’.
– EDPB Adopts Guidelines on the Use of FRT in the Law Enforcement Sector –
On 12th May, the EDPB adopted ‘Guidelines 05/2022 on the use of facial recognition technology in the area of law enforcement’. The Guidelines are structured as follows. First, the EDPB provides a short background on what FRT is and what its possible uses and applications are. Second, the EDPB recalls that the CFREU is applicable when FRT is in operation in the law enforcement sector and that such use is likely to entail an interference with several of its provisions – especially with Articles 7 and 8, the fundamental rights to privacy and data protection. On that point, the EDPB also provides a short analysis of how and when such technology may or may not satisfy the conditions for interference with fundamental rights as set out in Article 52(1) CFREU. The EDPB also recalls that the provisions of the ECHR need to be respected. Third, the EDPB discusses the applicability of the LED provisions to FRT. It focuses on the following points: the legal basis for the processing of sensitive data; automated decision-making, including profiling; the distinctions between the different categories of data subjects; the rights of the data subjects; DPIAs; prior consultation of the supervisory authority; data protection by design and by default; and logging. Finally, the Guidelines contain three useful Annexes. Annex I contains a ‘Template for description of scenarios’. Annex II provides ‘Practical guidance for managing FRT projects in LEAs’. Annex III discusses several use cases as practical examples of FRT. The Guidelines are open for public consultation until 27th June 2022.
– AG Opinion on the Re-Purposing of Personal Data Collected for Law Enforcement Purposes –
On 19th May, Advocate General (AG) Campos Sánchez-Bordona delivered an Opinion on the compatibility with the LED and the GDPR of the re-purposing of personal data collected for law enforcement purposes. As to the facts of the case, the personal data of the applicant, VS, were initially collected because he was believed to be a victim of a crime. Later, the prosecution decided to qualify the applicant as a suspect in the same crime and to continue processing the collected data in order to investigate him. The prosecution also wished to process VS’s personal data in order to defend itself in civil proceedings brought by VS, who complained that his data had been stored for a disproportionately long period by the prosecution in relation to other investigations against him. In the course of the administrative and judicial proceedings at national level, questions on the legality of the re-purposing of the data processing under the LED, in particular Articles 4 and 9 LED, and in relation to the GDPR, were raised and referred to the CJEU for a preliminary ruling. The AG proposed the following interpretation of the LED. First, on the question of the re-processing of the applicant’s data for the purposes of the investigation against him, the AG argued that the re-qualification of the applicant as a suspect did not constitute a new purpose for the processing of his personal data in the sense of Article 4(2) LED. He noted, though, that, should the Court nevertheless interpret the re-qualification as a new purpose of the processing, it would easily fulfil the requirements of Article 4(2) LED, because: i) the processing of data for the purposes of the investigation against the applicant was carried out by the prosecution, which was empowered to process personal data for prosecution purposes – one of the legitimate purposes under Article 1(1) LED; and ii) the ‘new’ processing was necessary and proportionate – especially given the logical continuity between the two processing operations. Second, on the question of the processing of the data for the purposes of the civil dispute, the AG observed that this is not a law enforcement purpose and that the GDPR is therefore applicable. For the processing to be lawful under the GDPR, the AG suggested that Article 6(1)(e) GDPR could in principle be relied on as a legal basis and that the national courts should examine whether the requirements of Article 6(1), and potentially also Articles 6(3) and 6(4) GDPR, are fulfilled. It remains to be seen whether the Court will follow the Opinion.
– EDPS Publishes Opinion on Cybercrime Negotiation –
On 18th May, the EDPS published ‘Opinion 9/2022 on the Recommendation for a Council Decision authorising the negotiations for a comprehensive international convention on countering the use of information and communications technologies for criminal purposes’. In principle, the EDPS ‘supports the recommendation for the Commission to be authorised to negotiate on behalf of the EU, as it would contribute to better preserving the level of protection guaranteed by the EU data protection framework’. The EDPS also, however, sees potential issues with such a negotiation and, in this regard, offers the current Opinion ‘to provide constructive and objective advice to the EU institutions with a view to ensuring that the level of data protection as guaranteed by EU law is not undermined.’ In terms of substantive content, the Opinion includes sections and comments on: i) ‘General remarks’ – including highlighting that ‘the EU should not seek to become a party to such a Convention if it could undermine the level of data protection of natural persons guaranteed by EU law’; ii) ‘Relationship with other instruments’ – including a recommendation to include ‘in the Annex that the Union should aim to achieve that future agreements with third countries should apply in lieu of the Convention should these future agreements ensure higher standards with regard to the protection of fundamental rights, in particular the right to privacy and data protection’; iii) ‘Scope of the Convention’ – including the observation that ‘cross-border direct access to data by law enforcement authorities of third countries [is] a particularly intrusive measure and consequently [has] a bigger impact on the fundamental rights to privacy and data protection…[and] such provisions should not be introduced in the Convention and a directive to this effect should be introduced in the Annex’; iv) ‘Need for appropriate safeguards regarding international data transfers and the respect of fundamental rights’ – including an expression of ‘strong doubts whether the future Convention could provide a stand-alone basis for transfers of personal data in accordance with EU law and therefore [recommending] making sure that Member States are able to invoke and impose additional safeguards so that transfers taking place in the context of this Convention would only be those in line with EU law, including EU secondary law’; and v) ‘Suspension, review of the Convention and establishment of relations pursuant to the Convention’ – including a recommendation to ‘further specify that the Union should aim to achieve that a Contracting State may at the time of signature, ratification or accession, declare that it will not execute a request for the transfers of personal data to another party, should there be indications that an essential level of protection of the data is no longer ensured in the requesting State.’
– EDPB Publishes Annual Report for 2021 –
On 12th May, the EDPB published its ‘Annual Report 2021’. The report covers a wide range of EDPB activities in 2021. In terms of content, the report includes sections on: ‘Highlights’; ‘the EDPB Secretariat’; ‘Activities in 2021’; ‘Supervisory Authority – Activities in 2021’; and ‘Coordinated Supervision Committee of the Large EU Information Systems and of EU Bodies, Offices and Agencies’. The report is long, at 94 pages, and much of the content will be familiar to the data protection community – for whom the EDPB is a central point of reference. Despite this familiarity, the report is nevertheless interesting and worth reading. This is true for several reasons. In the first instance, the report allows the reader, at a glance, to gain an overview of the range of EDPB activities in 2021. This offers a holistic impression of the impressive scope of the board’s work and activity – a perspective which is not available from the usual interfaces with this work, for example consulting Opinions and Guidelines. Further, there are aspects of the report dealing with issues which do not always receive much attention – for example, the work of the secretariat.
– ICO Takes Measures against Clearview AI –
On 23rd May, the ICO fined Clearview AI £7.5 million for breaching several UK data protection provisions – by collecting the facial images of individuals for facial recognition purposes – and ordered it to delete the data it had collected on UK residents and to stop collecting their facial images in the future. The ICO noted that, even though British institutions no longer use Clearview AI’s services to perform facial recognition identification, the data of British residents are still stored on Clearview AI’s servers and may be accessed by institutions in other countries. More precisely, the ICO established the following data protection breaches:
- ‘failing to use the information of people in the UK in a way that is fair and transparent, given that individuals are not made aware or would not reasonably expect their personal data to be used in this way;
- failing to have a lawful reason for collecting people’s information;
- failing to have a process in place to stop the data being retained indefinitely;
- failing to meet the higher data protection standards required for biometric data (classed as ‘special category data’ under the GDPR and UK GDPR);
- asking for additional personal information, including photos, when asked by members of the public if they are on their database. This may have acted as a disincentive to individuals who wish to object to their data being collected and used.’
The enforcement action is the result of a joint investigation between the ICO and the Office of the Australian Information Commissioner. We note that, as we have previously reported, this is not the first enforcement action on data protection grounds against Clearview AI.