An overview of interesting Data & IT Law articles and news from February 2017!
Consent for data mining: new developments
The authors Luke Hutton and Tristan Henderson published an article on arxiv.org, “Beyond the EULA: Improving consent for data mining”.
It deals with the lawfulness of personal data processing using data mining methods. Their work is based on an analysis of current commercial and academic practice, as well as their own case studies. They distinguish between secured and sustained consent, where “sustained consent involves ongoing reacquisition of consent over the period that the data are collected or used”.
As a proposal for organizations, the authors argue that “organisations involved in data mining should provide legible consent information such that people can understand what they are agreeing to, support people’s agency by allowing them to selectively consent to different processing activities, and to support negotiability by allowing people to review or revoke their consent as the context of the data mining changes.”
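The authors' three recommendations (legible consent, selective consent per processing activity, and the ability to review or revoke consent) can be pictured as a small data structure. The sketch below is purely illustrative; the class and activity names are my own, not from the paper.

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    """Tracks one person's consent per processing activity (hypothetical model)."""
    granted: dict = field(default_factory=dict)  # activity name -> bool

    def grant(self, activity: str) -> None:
        # Selective consent: each processing activity is agreed to separately.
        self.granted[activity] = True

    def revoke(self, activity: str) -> None:
        # Negotiability: consent can be withdrawn as the context changes.
        self.granted[activity] = False

    def allows(self, activity: str) -> bool:
        # Default deny: no recorded consent means no processing.
        return self.granted.get(activity, False)

record = ConsentRecord()
record.grant("profiling")             # consents to one activity only
print(record.allows("profiling"))     # True
print(record.allows("ad_targeting"))  # False: never consented
record.revoke("profiling")
print(record.allows("profiling"))     # False after revocation
```

The key design point mirrors the paper's argument: consent is per-activity and reversible, rather than a single all-or-nothing agreement given once at sign-up.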
AI in law: examples of applications supported by courts
In February, several courts around the world supported the use of artificial intelligence in law.
In Australia, the Supreme Court of Victoria endorsed the use of predictive coding in discovery. The authors at idm.net.eu described the implications of the case. “The case in question is McConnell Dowell Constructors v Santam Ltd & Ors, an insurance coverage dispute in connection with a large claim that arose from the design and construction of a natural gas pipeline. In an underlying arbitration concerning a dispute about the construction contract, approximately 4 million documents were discovered (…) (T)he Court estimated it would take a junior solicitor approximately 583 working weeks to review the remaining documents.” The Court held that “In larger cases, technology assisted review will ordinarily be an accepted method of conducting a reasonable search in accordance with the Rules of Court. It will often be an effective method of conducting discovery where there are a large number of electronic documents to be searched and the costs of manually searching the documents may not be reasonable and proportionate. In such cases, the Court may order discovery by technology assisted review, whether or not it is consented to by the parties.”
In New Jersey, US, the state is trying a new algorithm to set bail. The idea is to have an algorithm that mathematically assesses the risk of defendants fleeing or committing a crime before their trial date. However, several articles deal with the potentially discriminatory effects of such algorithms. In one of them, the author refers to the experience with similar programs with in-built racial biases – “there is still a concern about over-policing of black neighborhoods in general, particularly in the case of nonviolent offenses.” According to the article, the authors of the algorithm found that racial factors were not useful for prediction. “The strongest predictor of pretrial failure largely has to do with someone’s prior conduct.”
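A pretrial risk assessment of this kind typically reduces to a weighted score over a defendant's record. The toy sketch below shows the general shape of such a score; the feature names and weights are invented for illustration and are not taken from New Jersey's actual tool.

```python
# Invented weights: prior conduct dominates, echoing the quoted finding that
# it is "the strongest predictor of pretrial failure".
WEIGHTS = {
    "prior_failures_to_appear": 2.5,
    "prior_violent_convictions": 2.0,
    "pending_charge_at_arrest": 1.0,
}

def risk_score(defendant: dict) -> float:
    """Weighted sum of prior-conduct features; higher means higher assessed risk.

    Note what is absent: race and neighbourhood are deliberately not inputs,
    mirroring the designers' claim that racial factors added no predictive value.
    """
    return sum(w * defendant.get(feature, 0) for feature, w in WEIGHTS.items())

first_offender = {"prior_failures_to_appear": 0, "prior_violent_convictions": 0}
repeat_absconder = {"prior_failures_to_appear": 3, "pending_charge_at_arrest": 1}
print(risk_score(first_offender))    # 0.0
print(risk_score(repeat_absconder))  # 8.5
```

The bias debate in the articles is visible even in this toy: "prior conduct" features are facially neutral, but if some neighbourhoods are over-policed, arrest-based inputs can still encode that disparity.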
GDPR enabling over-removal of online information?
The author Daphne Keller published an article about Europe’s Intermediary Liability Laws.
The General Data Protection Regulation introduces a new regime of notice-and-takedown rules for erasure requests. The author argues that “the new rules make deliberate or accidental over-removal of online information far too likely. They give private Internet platforms powerful incentives to erase or de-list user-generated content – whether or not that content, or the intermediaries’ processing of the content, actually violates the law. They also create new data disclosure obligations that undermine privacy and Data Protection rights for people who post content online.”
The author proposes several solutions for these problems:
Rules from the eCommerce Directive Should Govern Notice-and-Takedown under the GDPR
If GDPR Rules Apply to Notice-and-Takedown, They Should Be Interpreted to Maximize Procedural Fairness
Hosts Should Not Be Subject to RTBF Obligations
DPAs Should Not Assess Financial Penalties Against OSPs That Reject RTBF Requests in Good Faith
EU Member State Law and Regulatory Guidance Should Robustly Protect Freedom of Expression in RTBF Cases
National Legal Differences Should Be Respected
Databases unprotected by copyright – large margin of uncertainty
Amsterdam’s Court of Appeal decided an interesting case about the use of unprotected databases. The authors at the Kluwer Copyright Blog analyzed the decision.
One party (Party A) used data extracted from several psychological tests of the other party (Party B). Party B had not given permission to use the data. Both the court of first instance and the Court of Appeal held that Party B’s data were not protected by copyright.
The most interesting part of the case concerns a contractual clause prohibiting the copying or duplication of any part of the databases without permission. Here, the authors refer to the EU Court of Justice’s Ryanair case.
“Articles 6(1) and 8 of the Database Directive give users of a database several rights that cannot be restricted by contract. According to article 15, ‘any contractual provision contrary to Articles 6(1) and 8 shall be null and void.’ However, in the Ryanair case, the CJEU ruled that the Directive does not apply to databases that are not protected by copyright or the sui generis right. In such a case, the owner of a database can limit the rights of users without taking the aforementioned articles into account. (…) The Ryanair judgment leaves the owner of an unprotected database a large margin of uncertainty in restricting users’ rights: whether a provision is allowed is a matter of national contract law. The Court of Appeal adds to this by indicating that certain provisions can only be understood as referring to copyright. Therefore, they are not valid where the database is not protected by copyright. The court thereby narrows the margin for the database owner.”