An overview of interesting Data & IT Law articles and news from January 2017!
Is there a Right to Explanation of Automated Decision-Making in GDPR?
The authors Sandra Wachter, Brent Mittelstadt and Luciano Floridi published an article “Why a Right to Explanation of Automated Decision-Making Does Not Exist in the General Data Protection Regulation”.
They challenge the widespread assumption that the GDPR contains a right to explanation of automated decision-making that would enhance accountability and transparency. “(T)he GDPR only mandates that data subjects receive limited information (Articles 13-15) about the logic involved, as well as the significance and the envisaged consequences of automated decision-making systems, what we term a ‘right to be informed’. Further, the ambiguity and limited scope of the ‘right not to be subject to automated decision-making’ contained in Article 22 (from which the alleged ‘right to explanation’ stems) raises questions over the protection actually afforded to data subjects. These problems show that the GDPR lacks precise language as well as explicit and well-defined rights and safeguards against automated decision-making, and therefore runs the risk of being toothless. We propose a number of legislative steps that, if taken, may improve the transparency and accountability of automated decision-making when the GDPR comes into force in 2018.”
Several problems of the draft of EU ePrivacy Regulation
On January 10, 2017, the European Commission published a proposal for a Regulation on Privacy and Electronic Communications. This legislation would repeal the ePrivacy Directive. The purpose of the Regulation is to align the ePrivacy rules with the General Data Protection Regulation.
The article at DataGuidance deals with possible risks of the new Regulation. The first is “the obligation to offer software such as browsers with the option to prevent third parties from storing information on the terminal equipment of an end-user.” The Regulation also expands the material scope of the ePrivacy rules: “to include internet-based voice and internet-messaging services, i.e. OTT communications services, the Internet of Things and machine-to-machine communications, as well as to include both the content of communications and metadata (…) The rules on cookies or other technologies apply to ‘information,’ not only to the processing of electronic communications metadata or content. Therefore, the scope of Article 8 is broader than the other obligations of the Draft Regulation.”
Finally, in relation to GDPR, the authors argue that “(s)ome references to obligations under the GDPR seem to be inconsistent. In particular, in recital 17, [with regards] to ‘electronic communications metadata,’ the Draft Regulation refers to the obligations under Articles 35 and 36 of the GDPR to carry out impact assessments and consult the supervisory authority. [However,] in Article 6(2) of the Draft Regulation, where the conditions for processing such metadata are set out, no reference is made to the [abovementioned articles].”
Text and data mining and data protection
The BYTE project, funded by the European Commission, researched the positive and negative societal impacts of big data. The article at futuretdm.eu analyzes its conclusions about the compliance of text and data mining with data protection legislation.
The author argues that the main problem is the purpose limitation principle. Based on an analysis of Article 5(1)(b) of the GDPR, the author argues that “(t)he GDPR does not impose a requirement of the same or a similar purpose, but rather prohibits incompatible purposes. This limits further processing of personal data, but also provides space for processing with altered and even new purposes. The compatibility of the original purpose and the purpose of the further processing has to be assessed.”
Moreover, the author argues that in the long term the rules should change. The legislation needs “a shift from an individual control model, where the individual decides about each interaction, to a liability model, in which decision-making during operations happens more collectively or aggregated and the active role of the individual gets reduced to receiving compensation or obtaining a claim towards the data controller.”
The author also refers to the current consent regime and to the privacy-by-design principle. “This can concern technical measures, like data mining on encrypted data, and build further on statistical disclosure control techniques. But it can also concern organizational (e.g. clear role descriptions with access rights) and legal (e.g. confidentiality and use limitation clauses) measures, limiting access, transfer and use.”
First guidelines on Privacy Impact Assessments
The GDPR requires a wide range of companies to carry out privacy impact assessments by May 2018. The article at lukaszolejnik.com deals with one of the first sets of guidelines on how to implement them, published by the Belgian Data Protection Authority. The author notes that many more guidelines will be published in the near future.