– EDPB Issues Five Documents –
During its 34th-36th Plenary Sessions, the EDPB adopted five new documents:
- Statement on the Court of Justice of the European Union Judgment in Case C-311/18 – Data Protection Commissioner v Facebook Ireland and Maximillian Schrems.
- Guidelines 06/2020 on the interplay of the Second Payment Services Directive and the GDPR – version for public consultation.
- Letter to MEP Ďuriš Nicholsonová on contact tracing, interoperability of apps and DPIAs. In this document, the EDPB repeats the main GDPR provisions which apply to the operation of contact tracing apps and reiterates that the pandemic does not automatically “waive” the need for compliance with these unless a national law implementing restrictions under Article 23 GDPR and Article 15 e-Privacy Directive applies.
- Information note on Binding Corporate Rules for Groups of undertakings / enterprises which have the ICO as Lead Authority. In this document the EDPB seeks to help corporations become compliant before the end of the Brexit transition period.
- FAQ on CJEU judgment C-311/18 (Schrems II). See below for further analysis.
The documents are available on the EDPB’s website.
– EDPB Issues FAQs Following Schrems II –
On 23rd July, the EDPB released the document ‘Frequently Asked Questions on the judgment of the Court of Justice of the European Union in Case C-311/18 – Data Protection Commissioner v Facebook Ireland Ltd and Maximillian Schrems’. As the title suggests, the document provides guidance on the consequences for the interpretation of the GDPR’s international transfer mechanisms following the landmark CJEU decision in the Schrems II case. The guidance deals with the specific contents of the decision – i.e. transfers from the EU to the US under Privacy Shield and under SCCs. The guidance also, however, goes further and elaborates on some of the broader consequences of the decision: the consequences for other forms of transfer mechanism – including other adequacy agreements, other Article 46 options such as BCRs, and Article 49 mechanisms – and the consequences for transfers to third countries other than the US. Unsurprisingly, given the recent nature of the decision, the guidance remains relatively superficial and follows the logic of previous guidance concerning international transfer mechanisms. Unfortunately, this means that the guidance does not touch on some of the more interesting and involved questions arising on the back of Schrems II – for example, what forms of safeguard might be implemented in relation to SCC-based transfers to the US to ensure adequate protection?
– German Constitutional Court on Access to Telecommunication Subscriber Data –
On 17th July, the German Constitutional Court published a judgment concerning access to the data of telecommunication subscribers by law enforcement and intelligence authorities. The case primarily concerned the compliance of Article 113 of the German Telecommunications Law (GTL) with several constitutional provisions, including privacy. Article 113 GTL provides a legal basis for telecommunications providers to transfer to law enforcement and intelligence agencies subscriber data necessary for the conclusion of a contract between the provider and the subscriber – such as name, telephone number, date of birth and possibly address, bank account information, the types of services for which a user is subscribed, and log-in details. Law enforcement and intelligence authorities may, however, only request access to subscribers’ data to investigate and prosecute criminal and administrative offences, to prevent public threats, and to fulfil the tasks of the intelligence services. The Constitutional Court ruled that the contested provisions on (1) the transfer of subscribers’ data to, and (2) its retrieval by, law enforcement and security authorities violate the constitutional right to privacy in conjunction with the right to personal development and dignity. The Court argued that the provisions on the transfer of subscribers’ data are not proportionate because the purposes for which the data may be processed are phrased too broadly: ‘Information that solely serves to facilitate the performance of the authority’s tasks in general may not be provided without specific grounds.’ Instead, there must be a specific danger in an individual case. In addition, with respect to dynamic IP addresses, the Court noted that accessing these results in greater interference with the right to privacy and that transfer for administrative offences would thus be disproportionate.
As to the retrieval of the data by the law enforcement authorities, the Court ruled that with the exception of the provisions on retrieval of log-in details, these are not adequately restricted either. We observe that this is one of the few cases concerning the processing of telecommunication data for law enforcement purposes which discusses the fundamental rights compliance of the processing of subscriber data as opposed to traffic data.
– Bundesgerichtshof on the Right to be Forgotten –
On 27th July, the German Bundesgerichtshof ruled in two cases dealing with the Right to be Forgotten under the GDPR. In the first case, the claimant, who had been a manager at a charity, complained that Google’s ongoing listing of search results concerning him constituted an infringement of his rights. The results in question linked to content which suggested that, just prior to the emergence of information that his charity was running at a financial deficit, the claimant had called in sick. The Court ruled against the applicant. The Court highlighted that each consideration of whether a link should be delisted requires a weighing of the specific interests involved. The Court decided in this case that the interests of Google, the public and the press in keeping the information publicly available outweighed those of the claimant. In the second case, the claimants, who run several businesses in the finance sector, complained that Google’s ongoing listing of search results concerning them constituted an infringement of their rights. The results in question concerned the fairness of their business practices. Significantly, the factual accuracy of this content was disputed by the claimants. The Court stayed proceedings and referred two questions to the CJEU. The first question concerns the degree to which the accuracy of information to which a link connects should be taken into account in delisting evaluations. The second question concerns the degree to which, in delisting evaluations concerning photos, the context of the website from which the photo was sourced should be considered.
– ICO Produces Framework for AI Audits –
On 30th July, the ICO released new guidance on AI and data protection. In terms of content, the guidance comprises: i) ‘auditing tools and procedures [to be used] in audits and investigations’; ii) ‘detailed guidance on AI and data protection’; and iii) ‘A toolkit designed to provide further practical support to organisations auditing the compliance of their own AI systems (forthcoming).’ In terms of audience, the guidance is aimed at: i) those with a focus on data protection compliance, such as DPOs; and ii) those specialising in technology, such as software developers. In terms of structure, the guidance is broken down into four parts, each of which covers data protection principles relevant to AI systems: i) ‘the accountability principle’; ii) ‘lawfulness, fairness, and transparency of processing personal data in AI’; iii) ‘security and data minimisation in AI’; and iv) ‘compliance with individual rights, including rights relating to solely automated decisions’. The guidance is comprehensive. However, further work is necessary to consider how the framework compares to the numerous other guidance documents and assessment frameworks dealing with AI and data protection.
– Drivers Challenge Uber over Algorithmic Transparency –
UK Uber drivers have filed suit with the District Court of Amsterdam, requesting full access to all their personal data. Their request also covers information on the algorithmic decision-making processes operated by Uber. More specifically, they want access to the profiles used by Uber and to the automated decisions made by Uber. The drivers wish to know how Uber has profiled their behaviour – e.g. in terms of ride cancellations, late arrival at work, attitude and “inappropriate behaviour.” In this way, they want to understand whether their performance has been monitored by Uber in a manner commensurate with an employer-employee relationship. The drivers also request that their data be transferred to a trade union data trust. Trade unions and the NGO Worker Info Exchange support the case, arguing that the information could increase the bargaining power of Uber drivers. We note that requests for algorithmic transparency under the right of access seem to be increasing, and that jurisprudence on this topic is much awaited to shed light on the interpretation and application of the GDPR in relation to algorithmic transparency. The case also demonstrates the importance the exercise of data subjects’ rights could have beyond the mere protection of personal data – e.g. in the regulation of professional relationships.