The Millennium Brain

The psychiatrist Edward Bullmore explains why he agrees with the late Ted Hughes that the brain is the best symbol of the end of the millennium

TED HUGHES wrote an article nearly a year ago, suggesting that one way of making the Millennium Dome "the most astonishing building on earth" would be to build it as "a giant model of the human brain". I don't suppose this seemed like a very good idea to those on the ground at Greenwich, labouring to engineer a vast smooth dome without collapsing the Blackwall Tunnel or creating a vaulted micro-climate of tropical humidity. The architectural problems entailed in building an enormous structure that represented the immense complexity of the human brain must currently be beyond us. But even if it could be done, why would we want to do it? When was it that the brain became a suitable millennial symbol?

For most of the past 2000 years, the brain has enjoyed a popular status not much loftier than the liver or the heart or any other kind of offal. Even 50 years ago, it might have seemed a bit peculiar for the Poet Laureate of the day to play with the idea of building a brain rather than, say, a Festival Hall. But, quite suddenly, people have started to market the brain. Now it's selling better than ever before: there are more books about the brain selling more copies to general readers; more television and radio programmes about the brain are being made. There are brain images on T-shirts and coffee cups and advertising hoardings. No fashionable home is without a phrenology head on the mantelpiece. This mass market success could be dismissed as a fad, but in fact it is built on a revolution in the scientific understanding of the brain which began about 30 years ago and shows every sign of gathering momentum as we roll into the next century.

The revolutionary force is cognitive neuroscience, which means simply the science of neural or nervous systems applied to the phenomena of cognition or thought. The revolutionary programme, in a nutshell, is to discover how the brain thinks. Like most scientific revolutions, cognitive neuroscience involves looking at the world in a new way. The Copernican revolution in astronomy, which put the sun at the centre of what we've since called the solar system, became unavoidable when Galileo pointed his new-fangled telescope at the sky. The equivalent has happened in cognitive neuroscience with the development of new machines for looking at the human brain. These machines, such as the magnetic resonance scanner we use for neuroimaging research at the Institute of Psychiatry in London, can reveal the structure of a living brain to the nearest millimetre. Even more remarkably, this can be done without causing any pain or danger to the person under study.

To appreciate fully the impact of being able to see inside someone's head, one has to remember that the skull, assisted by the traditional taboo against dissection of the body, has historically done a great job of keeping the brain secret. We were already three-quarters of the way to the end of the second millennium when Leonardo da Vinci and Christopher Wren produced the first recognisable illustrations of brains (their models were those of executed convicts). These showed that the outer surface or cortex of the brain was symmetrical and deeply wrinkled, like a walnut, and that inside the brain there were fluid-filled cavities or ventricles. But it wasn't at all clear how this anatomy functioned in life. Leonardo guessed that thinking might have something to do with the circulation of fluid through the ventricles. Famously, Descartes pointed to the pineal gland as the body's conduit to the mind, on the grounds that it was the only bit of the brain that was not duplicated across the two cerebral hemispheres.

By the middle of the 19th century, some of the first neurologists, like Paul Broca, had begun to make more enduring progress by dissecting the brains of patients who had died after a stroke or another illness which caused paralysis or loss of speech. They reasoned that the part of the brain that normally controlled whatever function had been lost would be obviously damaged at post-mortem. Thus it was discovered that the thinking part of the brain was not the ventricles, nor the pineal gland, but the cortex; and that different parts of the cortex were responsible for different mental functions.

This did not alter the fact that the subjects of human brain research had to be dead. Even the discovery of X-rays at the start of the 20th century did not really unveil the living human brain. X-rays shoot through grey and white matter like a laser through cloud. So it is only in the past 20 years that we have been able to look directly at the structure and function of the human brain, and only in the past 10 years that we have been able to do so without exposing people to the risks of radiation.

The images produced by the new machines are, like all digital images, infinitely mutable by computers. The bland natural palette of grey and white matter can be replaced by vivid pseudocolour; the brain can be zoomed, warped or rotated in 3D. It may seem hazardous that a set of scientific observations can take so many forms - what has become of the facts? - yet it opens up entirely new perspectives. For example, something approximating an "average brain" can now be created by mapping a number of living brains and morphing them into a single image. The new technology also makes the brain look interesting, as never before, to many people outside neuroscience. Data on the cortical location of a certain function, which might seem forbidding in the form of a set of numbers or graphs, becomes immediately meaningful when rendered as a coloured focus of activity on a finely detailed anatomical background.
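
For readers curious about the arithmetic behind that "average brain", here is a minimal sketch in Python, assuming a handful of brain volumes that have already been warped into a common space. The arrays, grid size and threshold are synthetic stand-ins for illustration, not the machinery actually used in the scanner suite.

    # A toy sketch of the "average brain" idea: once several brain volumes have
    # been registered to a common space, a template is simply the voxelwise mean.
    # The arrays below are synthetic stand-ins for real scans (assumption).
    import numpy as np

    rng = np.random.default_rng(0)
    shape = (91, 109, 91)              # an illustrative voxel grid, nothing more

    # Pretend these are five co-registered structural scans.
    registered_scans = [rng.normal(loc=100.0, scale=10.0, size=shape)
                        for _ in range(5)]

    # The "average brain": mean intensity at every voxel across subjects.
    template = np.mean(registered_scans, axis=0)

    # A functional result can then be overlaid by thresholding a statistic map
    # and painting the surviving voxels onto the anatomical template.
    stat_map = rng.normal(size=shape)                      # stand-in activation map
    overlay = np.where(stat_map > 3.0, stat_map, np.nan)   # keep only strong peaks

    print(template.shape, int(np.count_nonzero(~np.isnan(overlay))))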

These brain maps show us that the cortex can indeed be subdivided into areas specialising in different kinds of information processing. There are, for example, specialist areas for seeing colour, movement, faces. There is also intriguing evidence for specialisation in more refined aspects of visual processing - such as recognising facial expressions of fear or disgust or discerning a walking human figure in a pattern of moving dots - which might have been advantageous earlier in the course of our evolution. Maps of basic visual functions like these have confirmed the results of pioneering research conducted by recording single nerve cells in the brains of animals. But brain mapping can also be used to examine higher functions (such as language or will or memory) which are difficult to investigate in any animal other than a human. One generalisation to be drawn is that the higher the function in question, the less likely it is to be located in a single specialised area of cortex. A complex function like language, for example, is mapped not to one area of the brain but to at least half a dozen interconnected areas. Different functions can overlap in a single area: short-term memory, for example, can be mapped to a network which includes some regions of cortex that are also part of the language network.
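
The point about overlapping networks is easy to make concrete. In the toy Python sketch below, each function is recorded simply as the set of cortical regions it recruits, and shared regions fall out as a set intersection; the region names and memberships are placeholders for illustration, not real mapping results.

    # Illustrative only: mental functions represented as sets of cortical regions.
    # The names and assignments below are placeholders, not actual findings.
    networks = {
        "language": {"inferior frontal", "superior temporal", "angular gyrus",
                     "supramarginal gyrus", "premotor"},
        "short-term memory": {"dorsolateral prefrontal", "posterior parietal",
                              "supramarginal gyrus", "premotor"},
        "colour vision": {"fusiform"},
    }

    # Higher functions map to several regions; some regions serve more than one.
    shared = networks["language"] & networks["short-term memory"]
    print(sorted(shared))      # regions belonging to both networks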

One of the big questions for cognitive neuroscience is how best to comprehend the networked organisation of the brain. I have already described some mental functions as higher than others, casually implying that regions within a network might be organised hierarchically. By analogy with some computers, we could imagine that a given cortical region is specialised to process some input from a subordinate region, before passing it on to another region higher up in the hierarchy for further processing, and so on until ... what, exactly? At what point in this chain of processing do we become aware of it? Is there a single brain region at the very pinnacle of the hierarchy which says to itself, "Oh, I see" as bytes of processed visual data are delivered to it? The short answer is no. Much more likely is that brain networks are organised for parallel rather than serial processing, and that the "highest" functions emerge as a corollary of the integrated activity of an entire network.
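
The contrast between the two architectures can be caricatured in a few lines of Python: a serial hierarchy hands its output up a fixed chain towards a notional pinnacle, whereas in a parallel network the answer is a property of all the "regions" acting at once. The stages here are arithmetic stand-ins, nothing more.

    # A caricature of serial versus parallel processing; the "regions" are
    # arithmetic stand-ins, not models of real cortex.
    from concurrent.futures import ThreadPoolExecutor

    signal = [0.2, 0.9, 0.4, 0.7]          # stand-in for raw sensory input

    # Serial hierarchy: each stage passes its result up to the next,
    # ending at a single "pinnacle" region that delivers the verdict.
    def edge_stage(x):  return [v * 2 for v in x]
    def shape_stage(x): return [v + 1 for v in x]
    def pinnacle(x):    return sum(x)

    serial_result = pinnacle(shape_stage(edge_stage(signal)))

    # Parallel network: every region works on the input at the same time,
    # and the "highest" result is an integration over all of them.
    regions = [edge_stage, shape_stage, lambda x: [v ** 2 for v in x]]
    with ThreadPoolExecutor() as pool:
        outputs = list(pool.map(lambda region: region(signal), regions))
    network_result = sum(sum(out) for out in outputs)

    print(serial_result, network_result)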

Figuring out how the mind mysteriously emerges from cortical networks for parallel processing will demand much more than a few brain-imaging machines. It will require the combined expertise of scientists trained in a wide variety of disciplines that are involved in cognitive neuroscience. And this isn't the only big question that cognitive neuroscience seeks to answer. We need to know how the neural apparatus for thinking evolved - like any other bodily structure - by natural selection; and how naturally selected genes control the extraordinary process by which a cluster of primitive cells develops into a uniquely complicated adult brain.

Some 19th-century ideas which had been all but abandoned in the first half of this century have been recognised anew as visionary; much of what was considered avant-garde before the Second World War now seems reactionary or simplistic. A surprising winner in the new order is Franz Joseph Gall, the inventor of phrenology. He is now hailed as the first prophet of the basic principle of cortical specialisation - the idea that different areas of the surface of the brain are specialised for different mental functions. Gall may have been preoccupied with functions such as high-mindedness and religious scrupulousness, which are all but forgotten by contemporary psychologists, and his experimental method of locating areas of function - by feeling for lumps on the head - is considered as ridiculous as it was 200 years ago. His search for the site of sexual desire, for example, involved feeling for hot spots on the heads of widowed (ergo frustrated) young women. But Gall has been vindicated in principle, and the porcelain phrenology head, quaintly dotted about with antiquated states of mind, is honoured as the prototype human brain map.

Carl Wernicke, a 19th-century neurologist, was one of the first to argue that "higher" functions such as language were determined by networks of cortical regions. In retrospect, this seems like a major breakthrough, but Wernicke's idea was quickly damned for lack of evidence. Wernicke had actually examined dozens of brains from deceased patients who in life had suffered language problems; but his case histories were cursory, his post-mortem examinations crude and biased by his expectations. Above all, there was virtually no corroborating evidence from any other area of brain science.

Sigmund Freud's last major work as a neurologist attacked Wernicke's ideas about cortical networks, although in fact Freud had already begun to have his first thoughts about the unconscious in terms of energy flowing through a network of connected nerve cells. After the First World War, antipathy towards Wernicke and his mainly German colleagues intensified, especially in the English-speaking world, and the concept of cortical networks was thoroughly disparaged. All this, the history writers can now claim, was an indication that the long night before the dawn of cognitive neuroscience had begun.

For the next 40 years, from 1920 to the end of the Fifties, the dominant schools of thought in psychology were psychoanalysis and behaviourism. These had nothing in common apart from a desire to make sense of the mind without worrying too much about the brain.

The language of psychoanalysis was still peppered with words such as libido, instinct, neurosis, which had originally signified something about the body or the brain. But as the ageing Freud relinquished his hope of founding a scientific psychology, rooted in what he knew of the brain, these words were used ever more metaphorically. Followers of Freud adopted the master's language, and invented much more, to construct a model of the mind that was entire in itself. Any sceptical questions about where or how the superego or the death instinct might actually be located in the brain could be turned against the questioner as proof of her resistance or his Oedipal hostility.

Behaviourists like Ivan Pavlov insisted that all we could know of the mind could be seen in the form of behaviour. We can't see that a dog is hungry, but we can see that it eats when presented with a bowl of food. We might wish to believe that it eats the food to satisfy an appetitive instinct; but this is simply jargon. Psychoanalysts might suppose that even the mind of a newborn child was already densely inhabited by instincts and archetypes. The mind of the Pavlovian baby was empty. It knew nothing at birth, and had to learn by applying a few simple rules to the maelstrom of its early experience. It was obviously necessary that there should be a brain in order for learning to take place, and the brain must at least know innately the rules for learning. But the brain of the Pavlovian baby was otherwise as unorganised as its mind was blank; and the development of adult skills like language was not critically dependent on a few key areas of cortex but rather on the "mass action" of the entire brain.

Neither of these two contradictory schools of thought has survived the advent of cognitive neuroscience with much vigour, but perhaps the more obvious loser is behaviourism. It now seems incredible that the brain of a newborn baby could be as unorganised as its mind was supposed to be empty; or that the connected apparatus for thinking, so consistently visualised in one brain after another, could leave room for the belief that an individual learns everything from the accidents of his or her upbringing. Accordingly, "instinct" has been retrieved from the grasp of the psychoanalysts and restored to something like its original meaning: that of an innate (and at least partly genetic) predisposition.

But what effect does the revolution in cognitive neuroscience have on our vision of the future? Once the revolution is over, we may imagine that there will be no further call for disembodied old soldiers like "self" or "soul". The circuit diagrams for talking, laughing, dreaming and lusting will have been worked out in detail. Impulses and ideas will be seen merely as changes in the neurophysiological weather. Criminality, addiction and mental illness will be diagnosed and treated more incisively. It could be a brave neuro-world indeed.

Versions of this have been common currency in science fiction for some decades. But science fiction is limited by the science of its times - witness the rivets on Buck Rogers rockets. Past fictional attempts to imagine a future world where the human mind is understood, and even controlled, in terms of the brain, have been correspondingly overpopulated by humans reduced to robots, reprogrammable for good or evil by dextrous application of electrical probes. To my mind (if it still makes sense to use that phrase, and I think it does), recent progress in cognitive neuroscience points towards a rather less totalitarian future.

It seems reasonable to hope that there will be major medical benefits, particularly in the shedding of light on poorly understood psychiatric and neurological diseases. An emphasis on evolutionary and genetic causes of brain organisation is likely to dominate our sense of how we became what we are. I expect us to have a much better grasp of the neural mechanics of emotion, attention, language and memory. I think there will be some astonishingly realistic computer models of human intelligence. But I also expect that there will be a limit to what neuroscience can tell us about the experience of cognition as we know it most intimately. Nobody has demonstrated a mind-reading machine that can infer what somebody is thinking about from the pattern of their brain waves. Indeed, it is an unresolved question whether we can assign a subjective content to some objectively observed brain data. And, given that we already know that the dynamics of brain networks are inherently unpredictable, the idea that we could make people think certain thoughts, or reprogramme their mental trajectories, by any non-destructive intervention seems even more far-fetched ...

So perhaps Ted Hughes was being more serious than I thought when I first read his article. The brain as we now know it would make a great millennial symbol. It is universally relevant and fascinating. It is the focus of an international scientific revolution which resonates far beyond science, and in which this country has a leading role. The badges and posters would sell like hot cakes. Maybe by the next millennium we'll be smart enough to take Hughes literally, to make a completely realistic, all-encompassing model of the growing, thinking, human brain, or as he put it: "the palace of the greatest Genie in the Universe, the Human Spirit!" Or maybe we'll just be smart enough to realise that even the most beautiful and intricate map of the palace can never tell us all we might want to know about the Genie.

Dr Edward Bullmore is a research fellow at the Institute of Psychiatry in London.
