Discrimination in the Digital Age – Discrimination by Data

The legal systems of the majority of countries around the world include anti-discrimination laws. People should be treated equally and not be discriminated against on any ground. According to Article 26 of the International Covenant on Civil and Political Rights, people may not be discriminated against on the basis of “race, colour, sex, language, religion, political or other opinion, national or social origin, property, birth or other status.”

However, the purpose of this article is to open a discussion about a new type of discrimination that arises from our present digital age: discrimination based on data. The extent of the issue might surprise the reader.

Data discrimination to date

This is not the first article to discuss the issue of data discrimination. The term has its own definition on Wikipedia, which, however, focuses on a different topic: if a service provider selectively filters information, that is considered data discrimination. In principle, this amounts to censorship of certain online content, a practice that might be perceived as an unfair trade practice or another form of anticompetitive conduct.

Another use of discrimination by data was described by Latanya Sweeney. She pointed out the differences between the ads appearing on the search results page when the search query included black-sounding or white-sounding names. For queries with black-sounding names, ads suggesting that the person might have a criminal record appeared at a higher rate.

However, the advertiser is the author of the ad text for any given search query. Accordingly, this is not a case of discrimination by data, but it may be a case of discrimination by race. It is nevertheless interesting to note the author's conclusion that online advertising is an environment well suited for discrimination.

Price discrimination and data

The closest example of discrimination by data is first-degree price discrimination, an issue covered in an article by Forbes. First-degree price discrimination involves charging every individual customer a price based on his or her individual willingness to pay.

In the past, it was hard to find enough data about a customer to determine his or her willingness to pay. However, the recent increase in the amount of data collected about consumers enables firms to overcome this obstacle.

The article provides an example of a firm offering a certain service. The probability of winning a customer without any further information about him or her is 16 percent. However, the percentage increases significantly once the customer's demographic data or web-browsing behavior are taken into account.
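As a rough illustration of how such individualized pricing might be implemented (this sketch is not taken from the Forbes article; the features, multipliers and prices are purely hypothetical):

```python
# Illustrative sketch only: feature names and numbers are hypothetical assumptions.
def predicted_willingness_to_pay(customer: dict) -> float:
    """Estimate how much a customer is willing to pay from collected data."""
    price = 20.0                                   # price quoted to an unknown visitor
    if customer.get("income_bracket") == "high":   # demographic signal
        price *= 1.5
    if customer.get("visited_competitor"):         # web-browsing signal
        price *= 0.8                               # likely to compare offers, so lower the price
    return round(price, 2)

def personalised_price(customer: dict) -> float:
    # First-degree price discrimination: every customer receives an individual price.
    return predicted_willingness_to_pay(customer)

print(personalised_price({"income_bracket": "high", "visited_competitor": False}))  # 30.0
print(personalised_price({}))                                                       # 20.0
```

The point of the sketch is that two customers buying the same service can be quoted different prices purely because of the data collected about them.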

The results might lead to different prices for different customers based on their willingness to pay, as calculated from data. Is this an act of discrimination, or a private-law relationship with no need for regulatory interference?

Recruitment and data

The issue of decision-making based on data might also lead to discrimination in the area of recruitment. Imagine a situation: two candidates apply for a specific position and may be considered equal in qualifications and experience. However, based on previous big-data research, the company knows that people with a certain characteristic have a 41 percent higher chance of leaving the company within six months (an illustrative example).

This characteristic appears in one of the candidates, and therefore the other one gets the job. The company believes it is more effective to hire the other candidate, because there is a lower chance that he or she will leave. On the other hand, the rejected candidate might have had no intention of leaving the company within six months; the decision about him is based on data that have nothing to do with him personally. The question is similar: is it an act of discrimination, or a private-law relationship?
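A minimal sketch of how such a data-driven screen might work, assuming a hypothetical baseline attrition rate and the illustrative 41 percent figure from above:

```python
# Illustrative sketch only: the baseline rate and the flagged characteristic are hypothetical.
BASE_ATTRITION = 0.20          # assumed baseline chance of leaving within six months
RISK_MULTIPLIER = 1.41         # "41 percent higher chance" for the flagged characteristic

def predicted_attrition(candidate: dict) -> float:
    risk = BASE_ATTRITION
    if candidate.get("has_characteristic"):
        risk *= RISK_MULTIPLIER
    return risk

def pick_candidate(a: dict, b: dict) -> dict:
    # Equally qualified candidates: the one with the lower predicted attrition is hired.
    return a if predicted_attrition(a) <= predicted_attrition(b) else b

winner = pick_candidate({"name": "A", "has_characteristic": True},
                        {"name": "B", "has_characteristic": False})
print(winner["name"])  # B
```

Candidate A is rejected solely because of a statistical prediction about a group, not because of anything known about A individually.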

Technological methods to deal with the issue

Data scientists are aware of this issue. In their analysis, the authors noted that data-mining techniques might lead to discrimination against certain groups. The advantage of automated processes is that they are not influenced by personal preferences; however, the authors admit that the training data used in these processes may be biased.

Therefore, they describe several techniques for preventing this situation: pre-processing, in-processing and post-processing. At a certain point, the discriminatory data are removed, either from the original dataset or within the data-mining algorithm, as sketched below. However, implementing these procedures is not always possible.
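A minimal pre-processing sketch, assuming hypothetical column names (it is not taken from the cited analysis): the protected attributes are dropped from the dataset before any model is trained on it.

```python
# Pre-processing sketch: remove protected attributes so the model cannot use them directly.
import pandas as pd

PROTECTED = ["race", "sex", "religion"]  # hypothetical protected columns

def preprocess(df: pd.DataFrame) -> pd.DataFrame:
    """Return a copy of the data without the protected attributes."""
    return df.drop(columns=[c for c in PROTECTED if c in df.columns])

raw = pd.DataFrame({
    "age": [34, 29],
    "sex": ["F", "M"],
    "web_visits": [12, 3],
})
print(preprocess(raw))  # only 'age' and 'web_visits' remain for training
```

Note that simply dropping protected columns does not remove correlated proxy variables, which is one reason the in-processing and post-processing techniques exist as well.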

The legal regulation

From a legal point of view, there is no explicit prohibition of discrimination based on data. Article 26 of the International Covenant was already cited above; discrimination by data might fall under “other opinion” or “other status”. Similar categories are usually part of most national anti-discrimination legislation. Accordingly, it should be possible to rely on these “other” categories or on a general prohibition of discrimination.

However, there is no case law supporting this opinion (at least none available to the author of this post – contact us if you know of such a case).

Another way to defend against discrimination by data is to analyze the process of data collection. Did the discriminated user give the data collector consent to use the data for this particular kind of analysis? In the majority of cases, users consent to the processing of their data in order to improve the services the data collector is providing. Does that include using the user's data “against” him in other situations?

Conclusion

The purpose of this article was to raise questions in this area. There is a clash between the public aspect of anti-discrimination law and the private aspect of doing business effectively. Finding the balance is not an easy task.

However, lawyers should look for a solution, since the importance of the issue will only grow as more and more information is collected. As the above-mentioned Forbes article notes, data are collected not only in the online environment but also offline, with the emergence of wearable computing such as Google Glass. The first case law or new regulations might shed more light on the issue.


Note: This article is intended as a summary of issues. Its purpose is not to provide legal advice or create an attorney-client relationship between you and the author of this article (see Terms and Conditions).
