To say that we are irrational is not just to say we are emotional. The affliction goes deeper. Emotions can lead to irrational actions, but in themselves they are neither rational nor irrational. It's our reasoning processes (in the largest sense) that Sutherland is targeting, and most of his book is concerned with errors that we would make even if our emotions never misled us.
To act rationally, he supposes, is to act in the way most likely to achieve one's ends. This is what we are so bad at. We have a perverse genius for defending our existing beliefs at all costs rather than putting them to the test of the facts. When we do set about collecting evidence we are very bad at it, because we tend to look only for evidence that will confirm what we already believe, ignoring or discounting evidence that contradicts it (illustrating the point, Sutherland gives an alarming account of Montgomery's decision to go ahead with the Arnhem operation in 1944). And even when we've got a good range of evidence, we are very bad at using it to make the best decision. One of our biggest problems, surprisingly, is our passion for making sense of things and fitting them to what we already believe. A human loves an explanation, and if a good one is not available a bad one will be confabulated. We see patterns where none exist, and organise our actions around them.
Sutherland supports his claims in detail. He shows, for example, that the irrationality of financial consultants is matched only by that of their clients. Many studies have shown that advisers on equities, and managers of pension funds, unit trusts and the portfolios of insurance companies, 'consistently do worse than the market they are in, even making allowance for any fees that they charge'. Their 'wretched clients would do better to stick a pin in Stock Exchange listings'. The point is well known. No one takes much notice.
Psychologists have names for our failings. We make the availability error (a striking or salient fact hijacks our thought, and we ignore other relevant facts), we succumb to the halo effect (one salient good quality makes us overrate a thing's other qualities). We are wide open to the sunk-cost error and the fundamental attribution error. We strut our glassy essences, we place too much trust in Which?, we are desperately unwilling to admit we are wrong, we make elementary errors in calculating probabilities. We ignore the simplest consequences of the 'law of large numbers', and of the phenomenon of 'regression to the mean'. We are confused by anchoring effects and the alarming boomerang effect, which works as follows: once one has publicly committed oneself to some position, genuinely good arguments against it tend to increase one's commitment to it rather than decrease it, as obstinacy, the fear of losing face and other weird mechanisms swing into action.
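Regression to the mean, at least, is easy to see for oneself. Here is a minimal simulation in Python (the setup and parameters are mine, not Sutherland's): performance is modelled as fixed skill plus independent luck, so selecting the top performers on one occasion selects for good luck as well as skill, and on the next occasion the luck averages out.

```python
import random

random.seed(0)
N = 100_000

# Hypothetical model: each performance = true skill + independent luck.
skill = [random.gauss(0, 1) for _ in range(N)]
round1 = [s + random.gauss(0, 1) for s in skill]  # skill + luck, first occasion
round2 = [s + random.gauss(0, 1) for s in skill]  # skill + fresh luck, second occasion

# Take the top 10% of performers from round 1 and follow the same
# individuals into round 2.
top = sorted(zip(round1, round2), reverse=True)[: N // 10]
mean_round1 = sum(r1 for r1, _ in top) / len(top)
mean_round2 = sum(r2 for _, r2 in top) / len(top)

print(f"top decile, round 1: {mean_round1:.2f}")   # well above the population mean of 0
print(f"same people, round 2: {mean_round2:.2f}")  # still above 0, but pulled back toward it
```

The striking thing is that nothing about the people changed between rounds; only the luck was re-rolled. Anyone who explains the second-round decline by invoking complacency or pressure is confabulating exactly the kind of pattern Sutherland describes.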
In judging others, we tend to believe that they are much more like ourselves than they are. This is a favourite error. A human loves to be knowing and this usually means generalising from his or her own case. It is a very strange experience to talk to someone who takes it for granted that one has motives that are quite foreign to one - especially when they are unattractive. Sex can provide good examples. Some - especially the cynical - are apodeictically confident in their generalisations about sexual experience, inexplicably convinced that it must be the same for others (of the same sex) as it is for themselves.
This is only the beginning. Sutherland lists nearly a hundred distinct and characteristic errors in his highly informative surveys, and ends each chapter with a useful and humorous moral. He concludes, though, with a doubt about how desirable it would be if everyone were rational. And would it be fully rational to try to become fully rational? There is a philosopher in California writing a book on The Rationality of Irrationality.