The nuances of eye movement, the stock-in-trade of romantic novelists, are providing fresh insights into how we see. Eye-tracking recorders - portable versions can look like virtual-reality monocles - can be used to chart exactly where the eyes move, how long they linger to take in information and how a visual trigger leads us to act in certain ways. When eye movement is linked to a specific measurable task - driving, reading, selecting the week's groceries - eye tracking can chart how it's done, yielding an on-line measure of activity. The recorders are providing valuable data for experts from psychologists to packaging designers, and from neurobiologists to the Department of Transport.
Geoffrey Underwood, Professor of Cognitive Psychology at Nottingham University, says eye tracking gives researchers a "window on a person's mind, by providing a record of what they see as it's happening." He is using it to compare the eye movements of new and experienced drivers. The study, commissioned by the Department of Transport, is aimed at helping accelerate the learning curve for new drivers and reducing the accident toll among the freshly qualified.
During a 45-minute ride, drivers wear an eye-tracker over one eye, which tracks the reflection of an infra-red spot off the cornea. In addition, a small video camera points forward in front of the eye, capturing the driver's field of vision. Humans pick up most of their visual information during momentary "fixations" when the eye focuses. Between these points, the eye moves too fast for the brain to register the information. "The image is smeared across the retina," says Professor Underwood. "It's suppressed."
A zig-zagging line from the driver's eye movements, plotting fixations and the sweeps of the eye which link them, is superimposed on the video of the road scene. A "map" of exactly where the eye moves is then analysed frame by frame, yielding information on driver behaviour which could help boost road safety. New drivers could be given hints on how to search the road for hazards before getting their licence.
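The trace of fixations and connecting sweeps described above is typically recovered from raw gaze samples by a velocity threshold: slow-moving samples are grouped into fixations, fast ones are treated as saccades. A minimal sketch of that idea - the 50 Hz sample rate, 30 deg/s cutoff and gaze data are illustrative assumptions, not values from the Nottingham study:

```python
# Velocity-threshold fixation detection on raw gaze samples.
# Gaze positions are (x, y) in degrees of visual angle.

def detect_fixations(samples, rate_hz=50, vel_threshold=30.0):
    """Split gaze samples into fixations (slow) separated by saccades (fast).

    vel_threshold is in degrees/second; 30 deg/s is a common
    illustrative cutoff, not a figure from the study.
    """
    dt = 1.0 / rate_hz
    fixations, current = [], []
    for i in range(1, len(samples)):
        (x0, y0), (x1, y1) = samples[i - 1], samples[i]
        velocity = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt
        if velocity < vel_threshold:
            current.append(samples[i])      # eye is still fixating
        elif current:
            fixations.append(current)       # a fast sweep ends the fixation
            current = []
    if current:
        fixations.append(current)
    return fixations

# Two steady fixations separated by one fast sweep of the eye.
trace = [(0.0, 0.0), (0.1, 0.0), (0.1, 0.1),   # fixation 1
         (5.0, 5.0),                            # saccade sample
         (5.1, 5.0), (5.1, 5.1)]                # fixation 2
print(len(detect_fixations(trace)))             # 2
```

Each detected fixation can then be stamped onto the matching video frame, which is what the frame-by-frame "map" of the road scene amounts to.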
Back in the university lab, the same drivers watch videos of on-the-road scenarios and more information is gathered on how novices and seasoned drivers search the environment for potential hazards.
Michael Land, Professor of Neurobiology at Sussex University, built his own eye camera to examine the relationship between where drivers look and how they steer. It consists of a split-screen video camera worn over one eye when driving - the top two-thirds of the frame has a mirror that looks forward, and the bottom third images the eye. A trace of how the eye moves can be mapped frame by frame onto the video record of the journey. "Most of the work in this area has been done on American freeways, where the job is to stay awake," says Professor Land. "On winding roads, drivers look at a very particular place, the tangent point, where the line of sight just grazes the inside of the kerb." He has found it takes 0.8 of a second for this advance information on the curvature of the road to translate into steering.
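One way to quantify a gaze-to-steering gap like the 0.8 seconds Professor Land reports is to cross-correlate the gaze-direction signal with the steering signal and find the time shift at which they agree best. A sketch under illustrative assumptions - the synthetic signals and 25 Hz sample rate are invented for the example, not data from his study:

```python
# Estimate the lag between where the eye points and how the wheel turns
# by finding the shift that maximises correlation between the two signals.
import math

RATE_HZ = 25  # assumed sample rate of the combined eye/road video

def best_lag(gaze, steer, max_lag):
    """Return the lag (in samples) at which steer best follows gaze."""
    def corr(lag):
        pairs = [(gaze[i], steer[i + lag]) for i in range(len(gaze) - lag)]
        n = len(pairs)
        mg = sum(g for g, _ in pairs) / n
        ms = sum(s for _, s in pairs) / n
        num = sum((g - mg) * (s - ms) for g, s in pairs)
        den = math.sqrt(sum((g - mg) ** 2 for g, _ in pairs) *
                        sum((s - ms) ** 2 for _, s in pairs))
        return num / den if den else 0.0
    return max(range(max_lag + 1), key=corr)

# Synthetic winding-road data: steering copies the gaze angle
# 20 samples (0.8 s at 25 Hz) later.
gaze = [math.sin(i / 10.0) for i in range(200)]
steer = [0.0] * 20 + gaze[:-20]
lag = best_lag(gaze, steer, max_lag=40)
print(lag / RATE_HZ)  # 0.8
```

The same lag-finding approach would apply to the table-tennis data mentioned below, just with a shorter window.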
Sophie Furneaux, a graduate student working with Professor Land, is using eye-tracking devices to see how pianists translate sheet music into music. Pianists' eyes, she found, are a second in front of their fingers; however experienced the player, the one-second lead time remains. Another student has been charting the eye movements of table-tennis players, who have to stay one jump ahead of the ball - or 0.2 of a second, according to data collected from the eye camera.
Professor Land believes eye-tracking devices worn as people carry out specific tasks have opened up new vistas. "In every skilled action, we tend to concentrate on motor activity - steering or playing table tennis. But there's a whole other strategy about which we know nothing. There is an intermediate level of getting information - the interaction between eye and brain - which has a world of its own. That, to me, is the most exciting part of this work."
Eye tracking is also being used to see how shoppers make decisions when browsing in the supermarket. The design package company, Siebert Head, and retail research specialists ID Magasin work together on PackTrack, using a mix of covert surveillance, face-to-face interviews and eye tracking to gather information on how brands perform on the shelf. Once randomly selected shoppers have been interviewed, they are asked to come back when the supermarket shuts, put on the eye-tracking device and have their eye movements monitored. The shopping trip is recorded, slowed down 100 times and analysed, providing data to improve package design and store layout.
The idea, according to Siemon Scamell-Katz from ID Magasin, is to identify the "prompts and barriers to making selections on the shelf". In the highly competitive world of shopping, such information can give companies quantifiable evidence of what catches the consumers' eye, says Saktar Gupta, from Siebert Head. "If someone spends a long time looking at a packet, it could mean the brand is trying to send too many messages, and the consumer ends up being confused by the product."
Work is being done on enhancing the sensory systems of the disabled. If gaze can be maintained for a critical period, it is possible to "type" on screen by looking at letters on a keyboard one by one.
But there are scarier applications for leading-edge eye-movement research. The automatic activation of military weapons by gaze control sounds like the stuff of science fiction movies. However, this kind of research is being carried out by the US army.
Stare at a target for long enough and there's no need to aim or press the button. "If looks could kill", that cliché of torrid tales of romance, could take on a whole new meaning.