– EDPB Adopts Eleven New Documents –
During its July 2019 Plenary Session, the EDPB adopted eleven new documents. These include:
• a set of Guidelines on video surveillance
• an Opinion on standard contractual clauses (pursuant to Art. 64 GDPR)
• an Opinion on codes of conduct (pursuant to Art. 64 GDPR)
• an Opinion on the competence of supervisory authorities when the main establishment of a controller changes (pursuant to Art. 64 GDPR)
• four Opinions on DPIAs (pursuant to Art. 64 GDPR)
• an Opinion concerning the EDPS’ list of processing operations subject to a DPIA pursuant to Regulation 2018/1725
• a joint EDPB–EDPS Opinion on the processing of patients’ data in the eHealth Digital Service Infrastructure (eHDSI) – the first joint Opinion by the EDPB and the EDPS in response to a request from the European Commission pursuant to Regulation 2018/1725 on data protection for EU institutions and bodies
• a joint EDPB–EDPS reply to the LIBE Committee on the implications of the US CLOUD Act.
Interestingly, the joint reply on the CLOUD Act observes that ‘any order under the CLOUD Act for transfer of personal data from the EU could only be lawful if there is a legal basis under Article 6 and Article 49 of the GDPR’ and calls explicitly for the conclusion of an international agreement containing strong procedural and substantive fundamental rights safeguards.
– Hearings on Bulk Communication Surveillance at ECtHR –
On 10th July 2019 the ECtHR held two hearings on bulk communications surveillance for intelligence purposes. Both hearings concerned cases referred to the Grand Chamber following existing complaints. The first set of complaints, brought by Big Brother Watch and others (including journalists), concerned a Chamber judgement which ruled that bulk surveillance does not, in principle, necessarily violate Article 8 – although a violation of Article 8 was found in relation to the standard of the national safeguards applicable to bulk surveillance. The second set of complaints, brought by Centrum för rättvisa in Sweden, concerned the possibility for individuals’ electronic communications to be subject to signals intelligence operations. The Chamber found no violation of Articles 8 and 13, despite having identified shortcomings in the applicable Swedish regime. It is now up to the Grand Chamber to re-examine, in substance, the human rights considerations brought to its attention.
– Marriott International Fined £100 Million –
The ICO has handed down another colossal fine, this time to the Marriott hotel group in relation to a data breach which led to the exposure of 339 million guest records globally – of which 30 million related to residents of the European Union. The fine is significant for two reasons. The first is its scale: the fine is not only enormous but closely follows the fine of more than £180 million handed down to British Airways earlier this month and, accordingly, begins to normalize such fines. The second is the substance of the decision. The breach resulted from a vulnerability in the Starwood hotel group’s personal data processing systems dating back to 2014. Marriott acquired Starwood in 2016 and had since failed to conduct an adequate audit of Starwood’s systems. The fine thus confirms that firms inherit liability in relation to acquired systems. The issue of mergers and acquisitions, whilst highly significant in substance, remains little problematized in data protection discussions.
– Geographical Disparity in GDPR Fines –
Following on from the above, a study on GDPR fines in Central and Eastern Europe has just been published by Deloitte. By far the highest fine recorded in the region was one of 240,000 EUR handed down by the Polish DPA. The fine itself is not remarkable. What is remarkable is its minuscule size in comparison with the top-level sanctions being handed down by DPAs in Western Europe – particularly the CNIL and the ICO, whose fines can extend into the hundreds of millions of euros. There is still too little empirical evidence to draw concrete conclusions. It would seem, however, that a gap is opening up between the scale of fines in Eastern Europe and those in Western Europe. Such a disparity would be a far cry from the geographical harmonization envisaged by the GDPR’s sanctions mechanisms. The development of this disparity, the crystallization of its causative factors and its consequences for European data protection will be fascinating subjects of observation moving forward.
– German State Prohibits Microsoft Office in Schools –
The DPA in the German State of Hesse has declared that the use of Microsoft Office 365 in schools is prohibited. The decision follows the publication last year of a DPIA of Microsoft Office, carried out by the Privacy Company, which identified significant problems. The DPA cited two specific reasons for the prohibition. The first was the potential for US security services to access personal data stored in Microsoft’s European cloud. The second was the limited ability to restrict the transmission of user telemetry data to Microsoft. The DPA concedes that telemetry data could be collected and processed with student consent, but highlights that neither schools nor parents can provide this consent on behalf of students. The DPA clarified that the decision does not apply only to Office 365, but potentially affects all cloud and SaaS services operating under the same conditions of use – Apple and Google services were specifically mentioned. The DPA did not, however, go further and draw conclusions for the use of Office 365 and similar products in other public and private contexts. It seems highly likely that comparable decisions will emerge concerning Office 365 and similar services. More interesting, perhaps, will be the real-world consequences of such decisions: given its market position, are bans on Office 365 realistic?
– AI and Law Enforcement: The Finnish Presidency –
At an informal Justice and Home Affairs Meeting on 18-19th July 2019, the Finnish Presidency opened a discussion on the use of AI in law enforcement. While acknowledging the benefits which AI brings, the Presidency also highlights potential dangers, warning that ‘(t)he mass surveillance that is already a reality in some parts of the world represents a grave privacy issue.’ “Deep fakes” and cyberattacks are amongst the key threats identified. From a data protection perspective, it is notable that the Finnish Presidency has put the issue of AI and law enforcement – specifically the transparency of AI, including control and scrutiny over algorithms, their fairness and accuracy, and the reliability of the results produced – on the agenda. This is a starting point for discussion between Member States. It remains to be seen what political priority the question of legally and socially acceptable AI in law enforcement will eventually obtain, and whether any legislative output will result from these discussions.