Data & IT Law September 2017: smart contracts & ECHR case law

An overview of interesting Data & IT Law articles and news from September 2017!


September edition of ECHR case law in data protection

A new version of the overview of European Court of Human Rights (ECHR) case law has been published.

It covers the data protection cases of the ECHR. Compared with the previous version, it adds these decisions:

  • Mustafa Sezgin Tanrıkulu v. Turkey, about the interception of communications in Turkey
  • Bărbulescu v. Romania, about a private company’s decision to dismiss an employee – the applicant – after monitoring his electronic communications and accessing their contents
  • Aycaguer v. France, about a breach of the applicant’s right to respect for his private life on account of an order to provide a biological sample for inclusion in the national computerised DNA database (FNAEG), and the fact that his refusal to comply with that order had resulted in a criminal conviction
  • Satakunnan Markkinapörssi Oy and Satamedia Oy v. Finland, about the publication of the personal tax information of 1.2 million people


A commercial application of a self-executing smart contract

The global insurance company AXA has launched a smart contract technology aimed at the consumer insurance market.

The service, named fizzy, is based on the Ethereum blockchain and covers flight delays. At present, it is available for flights between the US and Paris.

“‘If your plane is more than two hours late, fizzy will reimburse you immediately.’ It does this by creating a smart contract with your flight details, which you fill out online when you buy the AXA/fizzy policy. The legally sound contract is placed on the blockchain and is designed to respond to official airline information, such as flight delays. Once a ‘real world’, verified input is received that the flight has been delayed, the contract is triggered immediately and self-executes and the consumer receives their compensation.”
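
The quoted mechanism is simple enough to sketch. Below is a minimal, illustrative Python model of a policy that self-executes once a verified delay report arrives; it is not AXA’s actual Ethereum contract, and names such as FlightDelayPolicy and report_delay are hypothetical:

```python
from dataclasses import dataclass

DELAY_THRESHOLD_MINUTES = 120  # "more than two hours late"

@dataclass
class FlightDelayPolicy:
    """Illustrative stand-in for an on-chain flight-delay policy.

    In the real product this logic would live in an Ethereum smart
    contract, and the delay report would come from a trusted source of
    official airline data; here it is plain Python for readability.
    """
    policyholder: str
    flight_number: str
    payout: float          # compensation owed if the flight is late
    paid_out: bool = False

    def report_delay(self, flight_number: str, delay_minutes: int) -> None:
        """Called with verified 'real world' flight data; self-executes."""
        if self.paid_out or flight_number != self.flight_number:
            return
        if delay_minutes > DELAY_THRESHOLD_MINUTES:
            self._transfer(self.policyholder, self.payout)
            self.paid_out = True

    def _transfer(self, recipient: str, amount: float) -> None:
        # Placeholder for the on-chain value transfer to the consumer.
        print(f"Paying {amount} to {recipient}")

# Usage: a verified report of a three-hour delay triggers the payout.
policy = FlightDelayPolicy("alice", "AF022", payout=100.0)
policy.report_delay("AF022", delay_minutes=180)
```

In production the same control flow would be written in a Solidity contract, with the delay report supplied by an oracle feeding official airline information, but the essence is identical: verified input in, automatic payout out.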


Tracker-blocking tools & Smart Consumer Goods

A group of scholars published the article “Block Me If You Can: A Large-Scale Study of Tracker-Blocking Tools”.

“Our analysis quantifies the protection offered against trackers present on more than 100,000 popular websites and 10,000 popular Android applications. (…) Among others, we discover that rule-based browser extensions outperform learning-based ones, trackers with smaller footprints are more successful at avoiding being blocked, and CDNs pose a major threat towards the future of tracker-blocking tools. Overall, the contributions of this paper advance the field of web privacy by providing not only the largest study to date on the effectiveness of tracker-blocking tools, but also by highlighting the most pressing challenges and privacy issues of third-party tracking.”
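
The quoted finding that rule-based extensions outperform learning-based ones is easier to parse with the core mechanism in view: a rule-based blocker simply matches every request URL against a list of filter rules. The Python sketch below illustrates that idea; the rules are hypothetical stand-ins, and real filter lists such as EasyList use a much richer syntax:

```python
import re

# Toy filter rules in the spirit of EasyList-style blocklists.
# These patterns and domains are invented purely for illustration.
BLOCK_RULES = [
    re.compile(r"://([a-z0-9-]+\.)?tracker\.example\.com/"),
    re.compile(r"/analytics\.js(\?|$)"),
    re.compile(r"[?&]utm_source="),
]

def is_blocked(request_url: str) -> bool:
    """Rule-based decision: block iff any filter rule matches the URL."""
    return any(rule.search(request_url) for rule in BLOCK_RULES)

print(is_blocked("https://cdn.tracker.example.com/pixel.gif"))  # True
print(is_blocked("https://news.example.org/story.html"))        # False
```

A learning-based tool would instead train a classifier on features of the request; the paper’s point is that, at scale, the simple rule-matching approach blocked trackers more effectively.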

There is also a new issue of the JIPITEC journal. Among other things, it includes articles about:

  • What Rules Should Apply to Smart Consumer Goods? Goods with Embedded Digital Content in the Borderland Between the Digital Content Directive and “Normal” Contract Law
  • Standards for Duty of Care? Debating Intermediary Liability from a Sectoral Perspective
  • EU Copyright Liability for Internet Linking
  • Internet Intermediary Liability Reloaded – The New German Act on Responsibility of Social Networks and its (In-) Compatibility with European Law


A study addressing the ethics of artificial intelligence

Stanford Graduate School of Business researchers Kosinski and Wang published an article in which they claimed to have built artificial intelligence that could tell whether a person is gay or straight based on a few images.

“Given a single facial image, a classifier could correctly distinguish between gay and heterosexual men in 81% of cases, and in 74% of cases for women. Human judges achieved much lower accuracy: 61% for men and 54% for women. The accuracy of the algorithm increased to 91% and 83%, respectively, given five facial images per person.”
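
One plausible reading of why accuracy improves with five images instead of one is variance reduction: averaging several noisy per-image scores yields a more reliable signal. The toy simulation below makes that effect visible; it is not the study’s classifier, and the score distribution is invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulated_accuracy(images_per_person: int, trials: int = 100_000) -> float:
    """Toy model: each image yields a noisy score centred on 0.6 for a
    'positive' person, thresholded at 0.5. Not the study's method."""
    scores = 0.6 + 0.35 * rng.standard_normal((trials, images_per_person))
    # Average the per-image scores for each person, then threshold.
    return float((scores.mean(axis=1) >= 0.5).mean())

print(simulated_accuracy(1))  # accuracy from a single image
print(simulated_accuracy(5))  # averaging five images does better
```

With these invented parameters, single-image accuracy comes out around 61% and five-image accuracy around 74%; the exact numbers depend entirely on the assumed noise, not on the study’s data.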

In a follow-up article, one of the study’s authors said: “This is the lamest algorithm you can use, trained on a small sample with small resolution with off-the-shelf tools that are actually not made for what we are asking them to do.”

The authors also argue that “our findings expose a threat to the privacy and safety of gay men and women.”
