The reasons for caution go to the heart of scientific advice on complex issues. On the face of it, the rise in global temperature of around 0.5C since the mid-19th century is incontrovertible. There are, however, worrying discrepancies between different measurements of temperatures around the world. Furthermore, knowledge of how the climate can shift of its own accord is still sketchy. Finally, our understanding of how human activities are altering the climate, how this will vary from place to place, and how it differs from natural variability, is incomplete.
The IPCC conclusion is the product of a combined assault on these uncertainties. It relies on matching the predictions of computer models of the global climate with observed changes. This approach requires knowledge of the three-dimensional variations in temperature - not only how the temperature varies around the world, but also how it changes from ground level up through the various layers of the atmosphere. Other data required include changes in precipitation (rain and snowfall) and the incidence of extreme weather events. If the approach, often termed "fingerprinting", is to succeed, the computer modellers also need to understand how the impact of human activities differs from natural variations in the climate.
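The core of the fingerprinting idea can be illustrated with a toy calculation. Everything below is hypothetical - real studies project full three-dimensional observed fields onto model-predicted patterns and test the result against estimates of natural variability - but the arithmetic is the same: ask how much of the model's pattern is present in the observations, and how similar the two shapes are.

```python
import numpy as np

# Hypothetical temperature-change patterns on a coarse four-box "world"
# (northern surface, southern surface, troposphere, stratosphere).
# The numbers are illustrative only, not drawn from any real data set.
fingerprint = np.array([0.6, 0.4, 0.3, -0.5])  # model-predicted pattern (C)
observed    = np.array([0.5, 0.3, 0.2, -0.4])  # observed changes (C)

# Amplitude of the fingerprint present in the observations:
# the least-squares projection of the observations onto the model pattern.
amplitude = observed @ fingerprint / (fingerprint @ fingerprint)

# Pattern correlation: how similar the spatial shapes are,
# regardless of overall amplitude.
correlation = np.corrcoef(observed, fingerprint)[0, 1]

print(f"fingerprint amplitude: {amplitude:.2f}")
print(f"pattern correlation:   {correlation:.2f}")
```

An amplitude near one with a high pattern correlation would mean the observations contain the model's predicted pattern at roughly the predicted strength; in practice the amplitude must also be shown to exceed what natural variability alone could produce.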
The scientific debate on the scale of global warming hinges on the limited geographical coverage of the early records, the non-standard nature of early measurements and discrepancies between recent observations using different technologies. There is no way round the spotty nature of the early records. Until the mid-19th century, most observations are limited to parts of the land masses in the northern hemisphere.
Sea surface temperatures (SSTs) have been available since around 1850, but there are huge gaps in the early observations. Worse still, there is a sudden jump around 1941 between the records of SSTs and air temperatures. Recent research has attributed this to an undocumented change in the measurement technique.
The temperature of the sea surface was measured in the simplest of fashions: a sailor would drop an empty bucket on a rope over the side of the ship and measure the temperature of the water recovered. During the Second World War, however, it became too dangerous to collect samples over the side at night, so ships switched to recording the temperature of the seawater drawn in to cool the engines. Extensive detective work by Chris Folland and David Parker at the UK Met Office Hadley Centre, Bracknell, shows that the raw temperature data have to be "adjusted" by factors ranging from 0.11C in 1856 to 0.42C in 1940 in order to make allowance for such changes of technique and ensure all the data are consistent. This work highlights the challenge of using early instrumental records.
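The kind of adjustment Folland and Parker describe can be sketched as follows. Only the two endpoint figures (0.11C in 1856, 0.42C in 1940) come from the text; the linear ramp between them, and the raw readings, are illustrative assumptions - the real corrections vary with bucket type, season and shipping route.

```python
# Illustrative application of bucket corrections to raw sea surface
# temperatures (SSTs). The linear ramp between the 0.11 C (1856) and
# 0.42 C (1940) figures is an assumption for demonstration only.

def bucket_correction(year, start=1856, end=1940, c_start=0.11, c_end=0.42):
    """Correction in C to add to a raw bucket-era SST reading."""
    if year < start:
        return c_start
    if year > end:
        return 0.0  # post-war engine-intake readings, left unadjusted here
    frac = (year - start) / (end - start)
    return c_start + frac * (c_end - c_start)

# Hypothetical raw annual-mean SSTs (C) for a single ocean region.
raw_ssts = {1856: 13.20, 1900: 13.05, 1940: 13.10, 1950: 13.55}
adjusted = {yr: round(t + bucket_correction(yr), 2) for yr, t in raw_ssts.items()}
print(adjusted)
```

The point of the exercise is that the apparent jump around 1941 largely disappears once the bucket-era readings are warmed by the appropriate correction, making the whole series comparable.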
Once the corrections have been made so that the early data are comparable with recent measurements, a rise in global temperatures over the period is still observed. But the correction factor for 1940 is disturbingly large. There is also a debate about recent temperature trends: measurements from space show a much smaller warming than surface-based observations.
The potential of the global climate to change of its own accord, however, is the real joker in the pack. The more we find out about past changes, the clearer the climate's capacity to spring surprises on us becomes. Not only have we discovered that before 10,000 years ago the climate was capable of sudden large shifts, but also that more recent, relatively orderly changes - such as the warm period around 1,000 years ago (the "Medieval Climatic Optimum") and the subsequent cool period (the "Little Ice Age") - are less clear-cut than previously assumed.
This changing view of natural variability makes it more difficult to test whether computer models are providing a realistic representation of the longer-term behaviour of the global climate. By the late 1980s the broad consensus was that an increase in greenhouse gases equivalent to a doubling of pre-industrial carbon dioxide levels would raise the global temperature by between about 1.5C and 4.5C, with the greatest warming occurring in polar regions in the winter half of the year. But these predictions did not tally well with observed changes this century.
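The quoted 1.5-4.5C range for a doubling of carbon dioxide can be reproduced with a simple back-of-the-envelope calculation. The logarithmic relation between carbon dioxide concentration and radiative forcing used below is a standard approximation, not something stated in the article, and the sensitivity parameters are illustrative values chosen to span the range.

```python
import math

# Sketch of how a climate sensitivity range arises. Assumptions:
# - radiative forcing for CO2 follows F = 5.35 * ln(C/C0) W/m2
#   (a standard approximation, not taken from the article);
# - the temperature response is lam * F, where lam (C per W/m2)
#   bundles up all the feedbacks and is the uncertain quantity.

def warming_for_doubling(lam):
    """Equilibrium warming (C) for doubled CO2, given sensitivity lam."""
    forcing = 5.35 * math.log(2.0)  # about 3.7 W/m2 for a doubling
    return lam * forcing

# Illustrative low, middle and high sensitivity parameters.
for lam in (0.4, 0.8, 1.2):
    print(f"lambda = {lam} C per W/m2 -> {warming_for_doubling(lam):.1f} C")
```

With these (assumed) sensitivity parameters the warming runs from roughly 1.5C to 4.5C - the spread in the consensus range reflects uncertainty in the feedbacks bundled into the sensitivity, not in the forcing itself.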
Recent models of the climate provide improved treatment of the atmosphere and the oceans and consider other human activities, notably the production of sulphate aerosols. The impact of aerosols is of particular interest, and the Hadley Centre model, developed by a team under John Mitchell, indicates more modest warming than earlier predictions. Moreover, the combination of greenhouse gases and sulphate aerosols matches observed global temperature trends rather well. In addition, the predictions of the regional effects of sulphate aerosols produce temperature trends around the world more in tune with recent observations.
The fact that the inclusion of aerosols produces a better match between model predictions and observed surface temperature trends is a significant step towards attributing global warming to human activities. Moreover, the models do a reasonable job of simulating both natural variability and other spatial patterns, including temperature differences between the hemispheres, between land and oceans, and between the troposphere and stratosphere. This suggests that recent changes are not solely the result of natural variability but are, in part, evidence of the fingerprint of human activities.
The IPCC view represents the painstaking compilation and analysis of a huge amount of work, but there has been no sea-change in our understanding. This is the only way forward with scientific advice on a subject as complex as the global climate. Policy makers and the general public will have to possess their souls in patience: to do otherwise would be foolhardy and almost inevitably would be regretted.