Top 5 high-profile scandals around Big Data and Machine Learning
Digitalization does not always bring only positive results such as increased profits, reduced costs, and other business optimization benefits. Big Data is a big responsibility that not everyone can handle. In this article, we have collected five of the most striking events in the IT world over the past couple of years related to Big Data and Machine Learning, each of which provoked a mixed reaction or even public condemnation.
Harm from artificial intelligence, or when Machine Learning is not to blame
In December 2019, the automaker Mazda recalled 35,390 vehicles from the 2019 and 2020 model years due to a defect in their intelligent braking system. Because of software errors, the fourth-generation Mazda 3 can detect a non-existent object in its path and automatically initiate emergency braking while driving. This behavior disorients the driver and can lead to a collision with a vehicle traveling behind. Installing a new version of the software is expected to eliminate the defect.
This inadvertent Machine Learning error has not yet led to fatal accidents, unlike the intentional misuse of machine learning that occurred in the USA in 2016-2019. There, as part of a criminal conspiracy between the developers of a software module for doctors and a manufacturer of opioid painkillers, an electronic clinical decision support system issued erroneous recommendations. This led to a significant increase in mortality from overdoses of narcotic and opioid drugs. Moreover, we are not talking about experienced drug addicts, but about people who received the drugs as prescribed treatment and had never taken opioids before. This case once again shows that digitalization, Big Data, and Machine Learning are just high-tech tools that can also be used for unseemly purposes.
3 major Big Data and Machine Learning scandals in security
The most critical failures in the field of Big Data over the last couple of years have been related to information security. And here we are talking not so much about data leaks as about data misuse. Despite legislative measures such as the introduction of the GDPR in 2018, even the largest data-driven companies illegally process the personal data of their users.
For example, the social network Facebook is considered complicit in the Cambridge Analytica scandal, in which the personal data of 50 million users was harvested to build ML models predicting the results of the US presidential election. For this, Facebook was fined $5 billion by the US Federal Trade Commission and lost another $40 billion in market value due to a fall in its stock. Facebook has also been implicated in other incidents involving its users' personal data: in October 2018 it was fined 500 thousand pounds (more than 620 thousand dollars) in the UK, and in 2019 it was fined 1 million euros in Italy for a leak of user data.
Another major scandal involving Machine Learning, Big Data, and big money also happened in 2019. Using deepfake technology, attackers successfully imitated the voice of a company's chief executive, persuading a deputy director to transfer about 220 thousand euros to their account. Such unlawful use of neural networks and open data about public figures makes it possible to create fake audio and video recordings of them, which can lead to social, financial, and political risks. In addition, deepfakes are dangerous for the biometric systems being actively deployed in various government services.
In conclusion, we note a high-profile incident involving the widespread rollout of a Big Data face recognition project on the streets of Moscow and in its metro. Alongside the positive results (searching for missing persons and criminals, identifying high-crime areas and gathering points of illegal migrants), such a system can be used to surveil citizens, both by the authorities and by offenders themselves. In particular, there are real cases where recordings from the city's street surveillance cameras were sold on the black market, and access to the cameras themselves was poorly protected from a technical point of view.
There are also cases of innocent citizens whose photos mistakenly ended up in the database of criminals. In particular, in 2018 a young man was detained in Moscow while returning from a concert, even though he had not committed any illegal actions. Several lawsuits have already been filed against the introduction of facial recognition on city streets, but so far all of them have been rejected.