– ECtHR Rules on the Balance between Privacy and Freedom of Expression –
On 14th May the ECtHR issued its ruling in Rodina v Latvia on an Article 8 ECHR complaint. The case concerned the media attention given to a familial conflict. The applicant’s mother and sister claimed that the applicant had sold her mother’s apartment after the mother was admitted to a psychiatric clinic. Reportedly, following her release from the clinic, the applicant’s mother had no place to live and the applicant did not wish to take care of her. A newspaper published an article about the situation which contained enough information to identify the applicant – including a photograph, published without consent or forewarning. The applicant complained that the newspaper article, the subsequent coverage on television and the domestic courts’ failure to balance her right to privacy with her relatives’ and the media’s rights to freedom of expression constituted breaches of her right to privacy. The ECtHR ruled that the domestic courts had failed to strike a balance between the right to privacy and the right to freedom of expression and that there had been a violation of Article 8 ECHR. The Court found a violation because: the applicant could not be considered a public figure; the story did not contribute to a debate of public interest; the applicant had clearly expressed her disagreement with the publication; and the journalists working on the story did not appear to have followed the necessary professional standards in reporting – for example, they focused on presenting only one side of the conflict. The Court further held that these factors were not adequately considered by the domestic courts. The case demonstrates once again the fine balance between the rights to privacy and freedom of expression. It is not particularly noteworthy in terms of content or argumentation. It is noteworthy, however, for the fact that it needed to be heard before the ECtHR at all.
Given the comprehensive ECtHR jurisprudence on the issue, it remains surprising that domestic courts regularly fail to perform proper balancing exercises between the two rights.
– Constitutional Protection of Telecommunications Privacy for Intelligence Abroad –
On 19th May the German Constitutional Court issued a ruling on the constitutional compatibility of three main aspects of the German Federal Intelligence Service (BND) Act. The challenged aspects concern the BND’s powers to collect and analyse the telecommunications data of non-German citizens living abroad, to transmit those data to other German and foreign authorities and to cooperate with foreign intelligence services. In its ruling, the Court set out the following three significant considerations. First, the Court established that the BND is bound by the German Constitution even when its actions concern non-Germans living abroad. Second, the Court ruled that, whereas surveillance of telecommunications data abroad is not, per se, unconstitutional, the current provisions of the BND Act breach the fundamental rights to privacy of telecommunications and to freedom of the press – Articles 10(1) and 5(1) of the Constitution. The Court put forward five reasons supporting this position: i) by not treating the contested powers as subject to the Constitution, the German government did not comply with the procedural requirement, under Article 10(1) of the Constitution, to specify in a law the fundamental rights affected; ii) with regard to the collection and analysis of telecommunications data, the contested Act does not satisfy relevant procedural requirements – such as having a clearly stated purpose, breaking down surveillance measures into clear-cut categories to enable oversight and respecting proportionate data storage principles; iii) the law cannot be regarded as proportionate, because the Constitution does not allow “global and general” surveillance even for the purpose of foreign intelligence; iv) concerning the transfer of data to other German authorities and to foreign intelligence services, the Court ruled that these transfers were not sufficiently restricted by law to be proportionate – for example, the law did not set out requirements obliging recipient authorities to process data in accordance with the rule of law; and v) concerning cooperation with foreign intelligence agencies, the Court set out the requirement that such cooperation must not permit German authorities to circumvent the requirements of the Constitution. Finally, the Court set out detailed requirements for the independent oversight of foreign surveillance activities. Despite the constitutional incompatibility of the contested Act, the Court granted the government a grace period until 31st December 2021 to amend it. The ruling is particularly welcome for the fact that the Court set out clear (constitutional) standards for the operation of intelligence services. Whilst intelligence activities fall outside the scope of EU law, it is notable that the standards adopted by the Court resemble those set out by the CJEU in telecommunications surveillance cases concerning law enforcement – for example, Digital Rights Ireland.
– EDPB Holds 28th Plenary Session –
On 19th May, the EDPB held its 28th Plenary Session. During the session, the EDPB adopted one document and made one significant decision:
- The EDPB adopted an ‘opinion on the draft Standard Contractual Clauses (SCCs) for controller-processor contracts submitted to the Board by the Slovenian Supervisory Authority’.
- The EDPB decided to ‘publish a register containing decisions taken by national supervisory authorities following the One-Stop-Shop cooperation procedure (Art. 60 GDPR) on its website’.
The outcomes of this Plenary Session differ from those of the Plenary Sessions held over the past two months. This is the first session which has not focused overwhelmingly on issues related to the COVID-19 emergency. Whether this change in focus represents a conscious shift in EDPB practice remains to be seen. Documents which are not yet available on the EDPB’s website should be made available shortly, following internal checks.
– EDPB Adopts 2019 Annual Report –
In May 2020, the EDPB adopted its 2019 Annual Report. The Report is useful reading for all those involved in EU data protection discussions. This is true for two reasons in particular. First, the Report offers the expected, yet highly valuable, summary of substantive EDPB activities over the course of 2019 – including of all published documents as well as of other significant activities. Whilst most of the work discussed will be familiar to the data protection community, other aspects of EDPB work outlined in the Report have not been so publicised. For example, in the Report, the EDPB discusses its Survey on SA Budget and Staff. Interestingly – and echoing observations from other organisations covered in the previous issue of Data Protection Insider – the EDPB highlights that: ‘Most of SAs stated that resources made available to them are insufficient’. Second, the Report offers an overview of the structure and procedures of the EDPB. Whilst information on these topics is neither hidden nor secret, it is often overlooked as individuals and organisations focus on the EDPB’s more substantive output. Yet, this information provides vital background to the functioning of the Board and to the constraints under which it operates. For example, the Report offers an overview of: the EDPB’s perspective on its mission; the EDPB’s guiding principles; the EDPB’s rules of procedure – including changes thereto; the EDPB’s technical infrastructure; and the EDPB’s expert subgroups and their mandates.
– ICO Issues Guidelines on Explainability in AI –
The ICO and the Alan Turing Institute have issued practical guidelines to help organisations explain, to the individuals concerned, decisions, processes and services delivered with the help of AI technologies. The guidelines are split into three parts: i) ‘The Basics of Explaining AI’ (aimed mostly at DPOs and Compliance Teams); ii) ‘Explaining AI in Practice’ (aimed mostly at technical staff); and iii) ‘What Explaining AI Means for Your Organisation’ (aimed mostly at senior management). The guidelines are noteworthy for three reasons in particular. First, the guidelines explicitly highlight that an explanation of a fully automated decision, after it has been taken, should be given to individuals. This position is derived from several articles in the GDPR, including Articles 15 and 22, as interpreted in light of Recital 71 – the latter of which explicitly states that individual decisions should be explained to affected individuals. This clarification is particularly welcome given the protracted academic arguments concerning the issue. Second, the guidelines highlight that explanation obligations could also arise for AI-assisted decisions which are not fully automated. This obligation is derived from the data protection principles of fairness, transparency and accountability. Finally, the guidelines state that the advice contained therein could be relevant to other legal instruments alongside the GDPR – for example, the e-Privacy Directive and the Law Enforcement Directive. Explainability of AI decisions is barely discussed in these instruments and the guidelines could thus provide a useful reference point for interpreting relevant provisions.
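To make the idea of a per-decision explanation concrete, the following toy sketch pairs a fully automated decision with a ranked list of the factors behind it. The model, its features, weights and threshold are all invented for illustration; nothing here is prescribed by the ICO/Turing guidance itself.

```python
# Toy sketch of a per-decision explanation for a fully automated decision.
# WEIGHTS, THRESHOLD and the feature names are hypothetical examples only.

WEIGHTS = {"income": 0.5, "debt": -0.8, "years_employed": 0.3}
THRESHOLD = 1.0

def decide_and_explain(applicant):
    """Return the automated decision together with the per-feature
    contributions, ranked by how strongly each pushed the outcome."""
    contributions = {f: WEIGHTS[f] * applicant[f] for f in WEIGHTS}
    score = sum(contributions.values())
    decision = "approved" if score >= THRESHOLD else "refused"
    # Rank features by absolute contribution so the affected individual
    # can see the main reasons for the decision.
    reasons = sorted(contributions.items(), key=lambda kv: abs(kv[1]), reverse=True)
    return decision, reasons

decision, reasons = decide_and_explain(
    {"income": 3.0, "debt": 1.0, "years_employed": 2.0}
)
print(decision)       # approved (score = 1.5 - 0.8 + 0.6 = 1.3)
print(reasons[0][0])  # income -- the most influential feature
```

For a simple additive model like this, the contribution ranking is exact; for more complex models, comparable rankings are typically approximated with post-hoc explanation techniques.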
– CNIL Issues Guidance on Anonymisation –
On 19th May, the CNIL issued guidance on anonymisation. The guidance is short and provides a cursory overview of the CNIL’s perspective on the following issues: the concept of anonymisation and its difference from pseudonymisation; the reasons one might anonymise personal data; how to anonymise personal data whilst preserving the utility of a dataset; how to verify the effectiveness of an anonymisation procedure; and how to guard against the risks of re-identification. Given its length, the guidance is relatively superficial and contains little of substantive novelty. The guidance also fails to address several open questions concerning anonymisation. In particular, it avoids the question of the degree to which, as suggested in the CJEU Breyer case, legal limitations on re-identification can render personal data anonymous. The guidance is, however, noteworthy in one key respect: it reveals key normative assumptions underpinning the CNIL’s perspective on anonymity. This facet of the guidance gains further significance as these assumptions do not seem to chime with those of other DPAs. In particular, the guidance suggests the CNIL takes a strict, ‘old school’, approach to anonymisation. The guidance suggests, for example, that personal data are only anonymous if re-identification is practically impossible and highlights, as one of its key reference points, the 2014 Article 29 Working Party Opinion on Anonymisation Techniques – in which a strict approach was also taken. This approach differs from that of the ICO, for example – in its document ‘Anonymisation: managing data protection risk’ – which suggests that, in certain instances in which re-identification is technically possible, data may still be regarded as anonymous.
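As a loose illustration of the distinction the guidance draws between pseudonymisation and anonymisation (the example is ours, not the CNIL’s): replacing a name with a keyed hash leaves the data re-linkable by whoever holds the key, and so still personal data, whereas aggregation removes individual-level rows altogether and comes closer to anonymisation in the strict sense.

```python
import hashlib

# Hypothetical toy records -- all names and fields are invented for illustration.
records = [
    {"name": "Alice Example", "age": 34, "city": "Riga"},
    {"name": "Bob Example", "age": 36, "city": "Riga"},
]

SECRET_SALT = b"keep-me-somewhere-safe"  # hypothetical key

def pseudonymise(record):
    """Replace the direct identifier with a keyed hash.
    Whoever holds SECRET_SALT can re-create the token and re-link
    the record, so the result remains personal data under the GDPR."""
    token = hashlib.sha256(SECRET_SALT + record["name"].encode()).hexdigest()[:12]
    return {"token": token, "age": record["age"], "city": record["city"]}

def aggregate(records):
    """Generalise to a count per city: no individual-level rows remain,
    though very small counts can still leak information."""
    counts = {}
    for r in records:
        counts[r["city"]] = counts.get(r["city"], 0) + 1
    return counts

pseudo = [pseudonymise(r) for r in records]
print(aggregate(records))  # {'Riga': 2}
```

Under the strict approach the guidance describes, only something like the aggregated output – where re-identification is practically impossible – would count as anonymous; the pseudonymised rows would not.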