IBM will no longer develop facial recognition technology following George Floyd protests
'We believe now is the time to begin a national dialogue on whether and how facial recognition technology should be employed by domestic law enforcement agencies,' CEO writes in letter to Congress
IBM will no longer develop technology for facial recognition following protests against racial inequality in the US and UK.
In a letter to Congress, IBM CEO Arvind Krishna said that the company “no longer offers general purpose IBM facial recognition or analysis software.”
“IBM firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms, or any purpose which is not consistent with our values and Principles of Trust and Transparency.
“We believe now is the time to begin a national dialogue on whether and how facial recognition technology should be employed by domestic law enforcement agencies,” the letter states.
Mr Krishna also recommended that police be held more accountable for misconduct, and that opportunities be expanded in communities of colour for “new collar” jobs – those which require specialised skills but not a traditional university degree, such as roles in cybersecurity and cloud computing, both fields in which IBM operates.
IBM itself has also been criticised in the past for how it has developed its facial recognition systems. Last year, the company released a collection of nearly one million photos taken from Flickr which were used to train its algorithm.
However, IBM had not informed those photographed that their images were being “annotated with details including facial geometry and skin tone”. Meredith Whittaker, co-director of the AI Now Institute, said at the time that such an action infringed people’s privacy in order to “[train] systems that could potentially be used in oppressive ways against their communities.”