As we showed in the article on racist soap
dispensers, technologies can reproduce existing social inequalities. A
Twitter user commented on the soap dispenser video, pointing out that the same has
happened with facial recognition software.
In 2015, Google’s new Photos application, which
was supposed to automatically label people, labeled Black people as
‘gorillas.’ This caused an outcry across the internet, notably
Jacky Alciné’s tweet showing how Google Photos had tagged him and a friend as
gorillas. In response, Google apologized for the biased software.
Amazon’s biased tech
In another case, Amazon’s facial recognition
software “Rekognition” – which is used by police and Immigration and Customs
Enforcement – performs well on white men’s faces, but has
difficulty recognizing people with darker skin.
Further, Amazon’s technology has also proven to
be gender-biased. Research from MIT and the University of Toronto showed that
darker-skinned women were wrongly labeled as men 31% of the time.
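To make the 31% figure concrete: audits like the MIT study quantify bias by computing a classifier’s error rate separately for each demographic subgroup rather than overall. The sketch below illustrates that bookkeeping with entirely made-up example data; the group names, labels, and numbers are hypothetical and do not come from the study.

```python
# Illustrative sketch: per-subgroup error rates for a gender classifier.
# All data below is invented for demonstration, not from the MIT audit.
from collections import defaultdict

def error_rate_by_group(records):
    """records: iterable of (group, true_label, predicted_label) tuples."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, truth, pred in records:
        totals[group] += 1
        if pred != truth:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical audit records: the classifier errs on one of the two
# darker-skinned women, and on neither of the lighter-skinned men.
sample = [
    ("lighter-skinned men", "man", "man"),
    ("lighter-skinned men", "man", "man"),
    ("darker-skinned women", "woman", "man"),    # misclassified
    ("darker-skinned women", "woman", "woman"),
]

rates = error_rate_by_group(sample)
print(rates)
```

An overall accuracy number would hide the disparity; splitting the error rate by group (here 50% vs. 0%) is what exposes it.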
Besides the bias in facial recognition
software, the technology can also be used more directly to exercise (political)
power. In China, the fear that the government might use technologies like facial
recognition to deepen its influence seems to have become reality.
The Chinese government is using AI-backed facial
recognition software to track and control the Uighurs, a Muslim
minority. The software is integrated into a network of surveillance cameras,
allowing the government to track the Uighurs constantly.