Microsoft calls for regulation of face recognition technology after admitting it could discriminate against women and people of colour

'The facial recognition genie is just emerging from the bottle'

Andrew Buncombe
Friday 07 December 2018 19:17 GMT
The president of Microsoft has called for greater government regulation of AI facial recognition technology, because of the risk of it discriminating against women and people of colour.

In a rare instance of a tech giant calling for greater government scrutiny, Brad Smith said such regulation would help avoid “a commercial race to the bottom, with tech companies forced to choose between social responsibility and market success”.

The comments of Mr Smith, 59, which were released at the same time as a report by a research group consisting of both Microsoft and Google employees also calling for more regulation, are especially noteworthy because of the controversy the company triggered earlier this year over its AI work.

In June, the company’s general manager Tom Keane wrote about how proud Microsoft was to be working with the US Immigration and Customs Enforcement agency (ICE) to use facial recognition technology to help identify immigrants and process applications. In a blog post about Azure Government, a programme designed to allow government agencies to upload information to the computing cloud, he said: “The agency is currently implementing transformative technologies for homeland security and public safety, and we’re proud to support this work with our mission-critical cloud.”

The comments were made as the Trump administration and ICE were facing intense criticism from human rights advocates and others for the way migrant families were being broken up and separated at the US-Mexico border.

At the time, more than 100 employees posted an open letter to the company’s internal message board, protesting about the work and asking for it to be stopped. “We believe that Microsoft must take an ethical stand, and put children and families above profits,” said the letter, which was addressed to chief executive Satya Nadella. Mr Nadella was among the technology executives who met Donald Trump at the White House this week.

In a blog on the company website, which was similar to comments he later made during a speech at the Brookings Institution in Washington DC, Mr Smith said: “We believe it’s important for governments in 2019 to start adopting laws to regulate this technology. The facial recognition genie, so to speak, is just emerging from the bottle. Unless we act, we risk waking up five years from now to find that facial recognition services have spread in ways that exacerbate societal issues.”

He added: “In particular, we don’t believe that the world will be best served by a commercial race to the bottom, with tech companies forced to choose between social responsibility and market success. We believe that the only way to protect against this race to the bottom is to build a floor of responsibility that supports healthy market competition.”

Police are trialling controversial facial recognition technology in Stratford

Mr Smith, who joined Microsoft in 1993, said he was concerned that at the current state of development, “certain uses of facial recognition technology increase the risk of decisions, outcomes and experiences that are biased and even in violation of discrimination laws”.

He added: “Recent research has demonstrated, for example, that some facial recognition technologies have encountered higher error rates when seeking to determine the gender of women and people of colour.”

He said the risk of misidentification increased when the technology was “used in those communities”.

Meanwhile, AI Now, an institute at New York University founded by Kate Crawford and Meredith Whittaker, issued a report similarly calling for more regulation. Among the concerns raised in the report was alarm over AI applications that claim to read people’s emotions and mental well-being – something called affect recognition.

“These tools are very suspect and based on faulty science,” said Ms Crawford, who works for Microsoft Research. “You cannot have black box systems in core social services.”

In addition to the use of facial recognition technology by ICE, The Verge reported recently that the Secret Service had revealed plans for a test of facial recognition surveillance around the White House, with the goal of identifying “subjects of interest” who might pose a threat to the president.

A document published by the Department of Homeland Security last month said the Secret Service would run a facial recognition pilot programme “in order to biometrically confirm the identity of volunteer Secret Service employees in public spaces around the complex”.

The American Civil Liberties Union, which publicised the plan, said at the time: “Face recognition is one of the most dangerous biometrics from a privacy standpoint because it can so easily be expanded and abused — including by being deployed on a mass scale without people’s knowledge or permission.”

In Britain, South Wales Police, the Metropolitan Police in London and Leicestershire Police all use the technology, according to the Daily Telegraph, which said doubts had been raised about its reliability. It said a recent study found the systems, created by Japanese company NEC, had a difficult time identifying suspects wearing hats or glasses.

A year ago, security officials in Germany extended a six-month trial of facial recognition technology at Berlin’s Suedkreuz railway station, after the initial tests, involving more than 200 volunteers, delivered a good success rate.
