‘Multiple independent lines of evidence.’ Get used to hearing that phrase in reference to the past year’s record warmth across the contiguous United States and the near-record global temperatures too. The long-range forecast: a hotter planet by the end of the century.
The final station data are in, and 2012 was a scorcher for the United States and one of the top 10 hottest years on record for the world as a whole.
The contiguous U.S. smashed the prior record by nearly half a degree C (0.8 °F). These figures for the U.S. are quite robust, despite unfounded criticism from some quarters, and are validated by multiple independent datasets and methods.
The United States in 2012
Figure 1: Trailing 12-month average temperature anomalies for the contiguous United States from both the existing USHCN network and the new pristinely sited USCRN network.
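A trailing 12-month average like the one plotted above is simply a rolling mean over the monthly anomalies. A minimal sketch, using made-up values rather than the actual USHCN/USCRN data:

```python
# Sketch: trailing 12-month mean of monthly temperature anomalies.
# The anomaly values below are synthetic illustrations, not real data.

def trailing_12mo_mean(monthly_anomalies):
    """Return the trailing 12-month average for each month,
    starting once a full 12 months of data are available."""
    out = []
    for i in range(11, len(monthly_anomalies)):
        window = monthly_anomalies[i - 11 : i + 1]
        out.append(sum(window) / 12.0)
    return out

# 24 months of made-up anomalies (deg C relative to a baseline)
anoms = [0.1 * (i % 12) - 0.5 for i in range(24)]
smoothed = trailing_12mo_mean(anoms)
print(len(smoothed))  # 13 values: months 12 through 24
```

The smoothing removes the seasonal cycle and month-to-month noise, which is why the figures here plot 12-month averages rather than individual months.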
Temperature data in the United States are imperfect. Temperature is measured at weather stations that were never intended to create long-term climate records. Over the past century these stations have been relocated, for instance from building rooftops to airports and wastewater treatment plants in the 1940s. The instruments themselves have changed, going from liquid-in-glass thermometers to electronic sensors in the 1980s. The time of day at which temperatures were measured has changed, shifting from evening before 1970 to morning afterward. And the environment around the sensors has changed as cities have grown and more land has been developed.
To correct for these issues, the National Oceanic and Atmospheric Administration’s National Climatic Data Center (NOAA/NCDC) has developed methods to detect and remove so-called inhomogeneities — biases resulting from non-climate factors. NCDC uses an automated method that looks for step changes or spurious trends that occur at one station but are not seen at the same time in nearby surrounding stations. The idea is that climate operates on larger regional scales, so local changes not seen in the regional average are likely artifacts resulting from station moves, sensor changes, and the like. The approach has been extensively tested on both real and synthetic data and validated by external groups using different approaches.
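The core idea — flag a shift at one station that its neighbors do not share — can be sketched with a toy breakpoint test. Everything below (the data, the simple mean-shift statistic) is illustrative only; NCDC’s actual pairwise homogenization algorithm uses formal statistical tests and many neighbor comparisons:

```python
# Toy illustration of neighbor-based breakpoint detection.
# This just finds the split that maximizes the mean step in the
# target-minus-neighbors difference series; real homogenization
# algorithms are considerably more sophisticated.

def find_breakpoint(target, neighbor_mean, min_seg=3):
    """Return (index, shift) of the largest mean step in the
    difference series between a target station and its neighbors."""
    diff = [t - n for t, n in zip(target, neighbor_mean)]
    best_idx, best_shift = None, 0.0
    for i in range(min_seg, len(diff) - min_seg + 1):
        before = sum(diff[:i]) / i
        after = sum(diff[i:]) / (len(diff) - i)
        shift = after - before
        if abs(shift) > abs(best_shift):
            best_idx, best_shift = i, shift
    return best_idx, best_shift

# Synthetic example: a station move at index 6 introduces a cool bias
neighbors = [0.0, 0.1, 0.2, 0.1, 0.0, 0.2, 0.1, 0.0, 0.1, 0.2, 0.0, 0.1]
station = [v + 0.05 for v in neighbors[:6]] + [v - 1.0 for v in neighbors[6:]]
idx, shift = find_breakpoint(station, neighbors)
print(idx, round(shift, 2))  # breakpoint at index 6, shift of -1.05
```

Because the neighbors do not show the step, the shift is attributed to the station itself and can be removed from its record.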
Figure 2: Monthly anomalies from the contiguous United States from both the existing USHCN network and the new pristinely sited USCRN network.
Starting in 2001, NCDC began to install a network of pristinely sited modern measurement instruments distributed around the United States. This network, called the U.S. Climate Reference Network (USCRN), provides an important independent check on the standard U.S. Historical Climatology Network (USHCN) data. USCRN data are free from influences such as station moves, instrument changes, and urbanization. Figure 2, above, shows that for the period from 2004 (when USCRN first achieved representative coverage) to 2012 the two are nearly identical in both monthly measurements and trends. Both show 2012 as an anomalously warm year, with a similar magnitude of increase relative to the prior decade.
Figure 3: Trailing 12-month average temperature anomalies from the contiguous United States from both the existing USHCN network and the new pristinely sited USCRN network.
In addition to surface measurements, satellites also measure the temperature of the lower part of the atmosphere (the lower troposphere). These measurements are available both for the U.S. and for the world. For the U.S., they match the USHCN adjusted data quite well, at least in maximum temperatures (minimum temperatures differ a bit for technical reasons involving the nocturnal boundary layer). Notably, the adjusted data match the satellite readings much better than the raw data do. Satellite readings aren’t perfect either, however, as they too are subject to a large array of adjustments to account for factors like orbital drift.
Figure 4: Annual maximum temperature anomalies from UAH (satellite) and both USHCN raw and fully adjusted records. Additional variance adjustments have been applied to both raw and adjusted USHCN data to match the effect of El Niño events on the satellite record. Data provided by John Christy and Matthew Menne.
It is worth pointing out that even the raw data (with no adjustments) show 2012 as the warmest year on record for the contiguous United States.
The World in 2012
While the U.S. temperature topped the charts in 2012, the world as a whole was not quite so hot. Nonetheless, 2012 still ranks as the 9th, 10th, or 11th hottest year on record, depending on which record is used. Figure 5 shows surface temperatures from NASA, NOAA, and the United Kingdom’s Hadley Centre from 1880 to 2012, and satellite temperature records from the University of Alabama in Huntsville (UAH) and Remote Sensing Systems (RSS) from 1979 to 2012. While the satellite records run slightly lower than the surface temperature records, the trends are comparable in magnitude.
Figure 5: Annual temperature anomalies from major global (land/ocean) records.
Over the past decade or so, temperatures have somewhat stagnated. It is important to point out, however, that current temperatures are still consistent with (or even higher than) what would be predicted based on the trend from 1979 to 2000, as discussed here. Global temperatures are subject both to greenhouse gas-driven warming and to natural variability, and periods spanning multiple decades are needed to separate out the human-driven changes.
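The consistency check described here — do recent years fall on or above the pre-2000 trend line? — amounts to fitting a linear trend to the earlier period and extrapolating it forward. A minimal sketch, with made-up anomaly values rather than the actual NASA/NOAA/Hadley series:

```python
# Sketch of the trend-consistency check: fit a least-squares line
# to 1979-2000 anomalies and extrapolate to 2012.
# Anomaly values below are synthetic, not real observations.

def ols_trend(years, values):
    """Return (slope, intercept) of the ordinary least-squares fit."""
    n = len(years)
    mx = sum(years) / n
    my = sum(values) / n
    sxx = sum((x - mx) ** 2 for x in years)
    sxy = sum((x - mx) * (y - my) for x, y in zip(years, values))
    slope = sxy / sxx
    return slope, my - slope * mx

# Made-up anomalies rising ~0.017 C/yr with small alternating noise
years = list(range(1979, 2001))
anoms = [0.017 * (y - 1979) + (0.02 if y % 2 else -0.02) for y in years]
slope, intercept = ols_trend(years, anoms)
predicted_2012 = slope * 2012 + intercept
print(round(slope, 3), round(predicted_2012, 2))
```

If the observed 2012 anomaly sits at or above `predicted_2012`, recent years are consistent with the earlier trend despite the apparent plateau.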
Figure 6: Annual temperature anomalies from major global (land/ocean) records, 1979-2012.
While a single year’s temperature data may reflect natural variability rather than climate change, the long-term trend both for the U.S. and the world is unambiguously up. These records are robust and supported by multiple independent lines of evidence. The question of exactly how much warming there will be in the future is far from settled, but we can be sure that the world of 2100 will be significantly warmer than the world we live in today.