If the face fits, you're nicked

As the fight against crime continues, facial-recognition software could soon be coming to the high street. But the benefits of the technology are far from proven, says Nick Huber

Monday 01 April 2002 00:00 BST

A police force tests a powerful and controversial technology at a sporting showcase, scanning crowds for known criminals. A public outcry ensues, with civil-liberties groups saying the facial-recognition software gives the police a worrying amount of power to monitor individuals at a whim. Security experts raise concerns that the technology, known as biometrics, might not be as reliable as the suppliers want us to believe.

The country was America, the sporting event the Superbowl. The concerns were real. Yet right now similar technology is being trialled by UK airports and police forces. Heathrow is testing iris-recognition technology for people going through passport control; Essex police are testing facial-recognition software to match still images of suspects picked up on closed-circuit TV (CCTV) cameras to images of criminals on its database.

The 11 September terror attacks on the United States have boosted the profile of biometric technology, which measures people's physical characteristics – eyes, fingerprints, faces – and uses these to identify them. Suppliers tout biometric technology as an important weapon in the fight against terrorism, even though it is accepted that basic security procedures – not biometric technology – would have had a better chance of preventing the attacks.

"Biometrics has a pretty much limited use in the war against terrorism," says Neil Garner, the head of development at Consult Hyperion, an IT consultancy specialising in e-commerce. "Facial-recognition technology seems to be a complete disaster when used [covertly] by the police to spot criminals' faces in crowds. It effectively just gives a two-dimensional map of the face. Facial hair also mixes it up."

This raises the scenario of being marched off for questioning by the police because a computer system has wrongly matched your face with that of a criminal on their database. But UK authorities using biometric technology insist that it can help to reduce crime and will gain public support. In east London, Newham council has been openly using facial-recognition software for four years.

The system, from Visionics, can be used with only eight CCTV cameras at a time. As the cameras scan crowds, the software creates a template of each face and compares it with images on a linked police database. A close match sounds a buzzer to alert council staff in a control room, who then compare the suspect's image on CCTV with the one held in the police database and decide whether to alert the police. They do not see the details of the suspect's criminal record, which remains encrypted.

Council staff can set a threshold for triggering a match. "The certainty level could be 70 per cent," says John Page, the head of community safety and emergency services at Newham. "But if you are looking for a missing child you might set it at 10 per cent so that any child who is similar looking can be checked against a picture."
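To make the threshold idea concrete, here is a minimal sketch of how such an operator-set alert threshold might work. It is purely illustrative: the names (WatchlistEntry, match_score, check_face) and the byte-overlap similarity measure are assumptions made for the sketch, not Visionics' actual software, which compares proprietary face templates.

    from dataclasses import dataclass

    @dataclass
    class WatchlistEntry:
        record_id: str   # reference to the encrypted criminal record (not visible to operators)
        template: bytes  # pre-computed face template from the police database

    def match_score(template_a: bytes, template_b: bytes) -> float:
        """Stand-in similarity measure (fraction of matching bytes, 0.0-1.0).
        A real system would use a proprietary face-template comparison."""
        if not template_a or not template_b:
            return 0.0
        matches = sum(a == b for a, b in zip(template_a, template_b))
        return matches / max(len(template_a), len(template_b))

    def check_face(live_template: bytes, watchlist: list[WatchlistEntry],
                   threshold: float = 0.70) -> list[WatchlistEntry]:
        """Return watchlist entries whose similarity meets the alert threshold.

        A routine criminal search might use a high threshold (e.g. 0.70);
        a missing-child search could lower it (e.g. 0.10) so that any
        plausible lookalike is flagged for a human operator to review.
        """
        return [entry for entry in watchlist
                if match_score(live_template, entry.template) >= threshold]

    # Example: a close enough match sounds the alert for control-room staff to review.
    if check_face(live_template=b"\x12\x34\x56\x78",
                  watchlist=[WatchlistEntry("record-001", b"\x12\x34\x56\x00")],
                  threshold=0.70):
        print("Possible match - alert control-room staff for manual comparison")

The point of the sketch is simply that the threshold is a policy setting chosen by the operator, not a property of the matching algorithm itself, which is why Newham staff can run routine searches at a high certainty level but drop it sharply when looking for a missing child.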

Newham is often cited as a facial-recognition success story: the council claims that the software, used with its CCTV cameras, helped to cut crime by 15 per cent in the town centres where it was in operation. But the claim is contentious; other factors, such as police detection work, could have reduced the rate, while the CCTV signs throughout Newham may also have served as a deterrent.

So how many arrests, if any, has the technology been responsible for? No one really knows. Newham says it does not know how many of the matched images it passes on to the police result in a conviction, or what proportion of suspects identified by the facial-recognition software are criminals. Newham police were unable to provide figures showing how effective the technology has been, although they said they were working on this. The police claim some success in using the software to identify football hooligan ringleaders in crowds at West Ham games.

But is there a public appetite for these new forms of surveillance? There could be a public backlash if biometric technology falsely identifies people as criminals. Mr Page cites a Newham council survey last year in which 93 per cent of residents favoured the use of more CCTV cameras and facial-recognition software. There are strict guidelines on what the council can use the facial-recognition systems for, he says.

The Essex police force is also conducting a trial of facial-recognition technology. The software matches suspects' images, such as stills from CCTV cameras, to a police database containing more than 160,000 images. So is it an effective crime-buster? The Essex force says the system – developed by Securicor Information Systems and Visionics – has made important identifications, helping to solve cases. However, its effectiveness has been undermined by the poor quality of many CCTV images provided by shopkeepers.

"We have found the biggest headache to be the quality of images from shops' CCTV," says Detective Sergeant Steve Jones of Southend CID. "You get tapes that haven't been changed enough and have been through the video head thousands of times." That makes the images grainy and unreliable. DS Jones has yet to write his report, but he says the system would only need to detect one murderer to pay for itself in reduced trauma for the victim's family and saved police time.

But biometric technology does raise serious ethical issues, such as a citizen's right to privacy and the potential abuse of the technology by governments. Mark Littlewood, the director of campaigns for the civil-rights group Liberty, argues that although biometric technology has a role, it needs to be regulated to prevent the government or security forces from abusing its power.

"How can you really be sure that the awesome power [of the technology] is being used for the purpose it should be? To what extent could it be used to monitor people who are politically unsavoury? Post-11 September, for example, a leading radical Muslim could easily be the target." One way to regulate the use of the technology would be under the Data Protection Act, which should include a new section to cover increasingly sophisticated CCTV, Mr Littlewood adds.

Tim Pidgeon, the director of business development at Visionics, admits the public can find biometric technology a "bit spooky" and says there must be legislative checks.

There are also signs that employers will use biometrics to monitor employees at work. The civil construction company O'Rourke Group uses biometric technology to monitor employees' attendance and hours worked. Workers clock in at a computer, which verifies their identities through facial-recognition software developed by Aurora, another biometric supplier.

O'Rourke hopes the technology will give it a definitive picture of which workers are on a site at any one time, which should help to locate staff in an emergency. It should also end the practice of "buddy punching", where one employee clocks in a colleague's card for them.

Elsewhere, other forms of biometric technology are inching their way into everyday life. Heathrow airport's trial involves identifying passengers enrolled in its iris-recognition scheme as they walk through passport control. The identification takes a matter of seconds. However, the volunteer passengers still have to carry their passports.

A spokeswoman for BAA, the airport's operator, says the machine for recording the image of the iris is like a normal camera; it does not involve shining red beams into passengers' eyes. And few people would argue against technology that helps to improve the crime-detection rate or tighten airport security.

Significant doubts remain, though, about the reliability of biometric technology when used in public places, with police forces only able to provide anecdotal evidence of its value. Public authorities using the technology also need to explain how it will be regulated in order to protect the individual's privacy. "There is an awful lot of hype surrounding biometric technology, which may be clouding its practical value," Mr Garner warns. "The public need more assurances."

Nick Huber is deputy news editor of 'Computer Weekly' magazine
