The so-called warming ‘hiatus’ over the past decade and a half is no reason for complacency on future warming. Mathematics teaches us that 15 years is simply too short a period from which to draw statistically valid conclusions.

Is global warming slowing down?

Are the past 10 to 15 years — which have seen little net change in the average surface temperature of the Earth despite ever-larger carbon dioxide emissions — an indication that climate change will not be as bad as previously projected? That the atmosphere is less sensitive to carbon dioxide than many scientists have concluded from their reading of the evidence? That the warnings from those in the know are overblown and the world can keep burning fossil fuels?

These questions, percolating for a few months in the blogosphere, came to a head with a recent article in The Economist questioning climate sensitivity — the amount of surface warming expected for a doubling of atmospheric carbon dioxide levels. “The climate may be heating up less in response to greenhouse-gas emissions than was once thought,” read the article’s tagline. “But that does not mean the problem is going away.”

The second half of that conclusion is certainly right. Even if climate sensitivity is somewhat less than the IPCC’s median value of about 3 degrees Celsius, atmospheric carbon dioxide levels are increasing exponentially, so a smaller value merely buys an extra decade or two until the same amount of warming is reached.
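
That "extra decade or two" can be sketched in a few lines. The numbers below (a forcing growth rate of about 0.03 Watts per square meter per year, and the standard 3.7 Watts per square meter of forcing from a CO2 doubling) are illustrative assumptions, not figures from the article:

```python
# Rough illustration, with assumed numbers, of how little time a lower
# climate sensitivity buys before a fixed warming level is reached,
# if CO2 radiative forcing keeps growing at roughly its recent rate.
FORCING_RATE = 0.03         # W/m^2 per year: assumed forcing growth rate
FORCING_PER_DOUBLING = 3.7  # W/m^2: standard forcing from a CO2 doubling
TARGET_WARMING = 2.0        # deg C above the reference level

def years_to_target(sensitivity_c_per_doubling: float) -> float:
    """Years until equilibrium warming reaches the target, using the
    simple proportionality warming = sensitivity * forcing / F_doubling."""
    needed_forcing = TARGET_WARMING * FORCING_PER_DOUBLING / sensitivity_c_per_doubling
    return needed_forcing / FORCING_RATE

delay = years_to_target(2.5) - years_to_target(3.0)
print(f"Extra time bought by a sensitivity of 2.5 C instead of 3 C: {delay:.0f} years")
```

Under these assumptions, trimming the sensitivity from 3 C to 2.5 C delays a given warming level by only about a decade and a half.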

But is the climate less sensitive to greenhouse gases like carbon dioxide and methane than has been forecast?

How Much of a Slowdown?

While there’s no doubt that the Earth’s recent surface temperature has not leaped ahead like it did in the 1980s and 1990s, it hasn’t exactly shown a flat trend over the last decade and a half.

Each of the three most-cited datasets estimating average surface temperature shows warming over the last 15 years (180 months): GISS: 0.11 C; HadCRUT4: 0.07 C; and NCDC: 0.07 C (0.20, 0.13, and 0.13 F, respectively).

Like all measured numbers, these data have uncertainties, as discussed in the sidebar at the end of this article.

These increases are certainly less than the warming rates of the 1980s and first half of the 1990s, of about 0.15 to 0.20 C (0.27 to 0.36 F) per decade. The earlier period may have provided an unrealistic view of the global warming signal, says Kevin Trenberth, climate scientist with the National Center for Atmospheric Research in Boulder, Colo.

“One of the things emerging from several lines [of evidence] is that the IPCC has not paid enough attention to natural variability, on several time scales,” he says, especially El Niños and La Niñas, the Pacific Ocean phenomena that are not well captured by climate models, and the longer-term Pacific Decadal Oscillation (PDO) and Atlantic Multidecadal Oscillation (AMO), which have cycle lengths of about 60 years.

From about 1975, when global warming resumed sharply, until the 1997-98 El Niño, the PDO was in its positive, warm phase, and heat did not penetrate as deeply into the ocean. The PDO has since changed to its negative, cooler phase.

“It was a time when natural variability and global warming were going in the same direction, so it was much easier to find global warming,” Trenberth says. “Now the PDO has gone in the other direction, so some counter-effects are masking some of the global warming manifestations right at the surface.”

In a 2011 analysis, researchers Grant Foster and Stefan Rahmstorf tried to mathematically separate out the influences of El Niños and La Niñas, volcanic eruptions that can lead to cooling, and a recent decrease in the radiance of the Sun. They found that the surface has an underlying average warming trend of about 0.15 C (0.27 F) per decade, presumably resulting from greenhouse gases.
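
Their approach is, at heart, a multiple regression. The sketch below applies the same idea to synthetic data, with made-up stand-ins for the ENSO and solar signals; it illustrates the technique, and is not a reproduction of Foster and Rahmstorf's actual analysis:

```python
import numpy as np

# Sketch of the Foster & Rahmstorf idea on synthetic data: regress
# temperature on time together with stand-ins for the natural factors,
# so the fitted time coefficient estimates the underlying trend with
# the natural wiggles removed. All inputs here are invented.
rng = np.random.default_rng(42)
n = 360                                  # 30 years of monthly data
t = np.arange(n)
enso = np.sin(2 * np.pi * t / 45)        # stand-in ENSO index
solar = np.sin(2 * np.pi * t / 132)      # stand-in ~11-year solar cycle

# Synthetic record: 0.15 C/decade trend plus natural factors plus noise
temps = (0.15 / 120) * t + 0.10 * enso + 0.05 * solar + rng.normal(0, 0.08, n)

# Multiple linear regression: intercept, trend, ENSO, solar
X = np.column_stack([np.ones(n), t, enso, solar])
beta, *_ = np.linalg.lstsq(X, temps, rcond=None)
trend_per_decade = beta[1] * 120
print(f"Recovered underlying trend: {trend_per_decade:.2f} C per decade")
```

The regression recovers the built-in 0.15 C-per-decade trend despite the superimposed oscillations, which is the essence of the published result.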

The Oceans are Heating Up

The slowdown of surface warming has led many scientists to ask: If ever-larger carbon dioxide emissions are trapping ever more heat, where is the heat going?

The suspicion is that it’s been heating up the oceans, and recent studies have provided a much clearer picture. Direct measurements show the oceans still gaining heat.

Using data collected over the last decade, researcher Virginie Guemas of Météo-France and colleagues have shown that the recent temperature pause could have been predicted retrospectively at least five years in advance. They attribute it to increased heat uptake by the oceans, especially the top 700 meters, with 65 percent of that uptake occurring in the tropical Pacific and Atlantic Oceans.

Giving support to their finding is a forthcoming “reanalysis” by Magdalena Balmaseda and Erland Källén of the European Centre for Medium-Range Weather Forecasts in the U.K., together with Trenberth. By combining several sources of data with climate models, their research finds a sharp increase in ocean heating over the past decade, beginning shortly after the 1997-98 El Niño: “In the last decade, about 30 percent of the warming has occurred below 700 m, contributing significantly to an acceleration of the warming trend.”

In fact, their reanalysis finds that the total of all oceans actually lost heat during the 1990s, at a rate of about -0.26 Watts per square meter of ocean surface area. By contrast, the ocean gained about 1.19 Watts per square meter in the first decade of the 21st century, most in the top 700 meters. That gain, Trenberth says, is associated “with changes in the winds and changes in the ocean currents that are associated with a particular PDO pattern that has dominated in the 2000s.”

So it’s not surprising that there was a significant warming of the surface during the 1990s, but not over the past decade. This recent, large increase in ocean heat content is the best sign that the Earth is still undergoing an energy imbalance caused by an enhanced greenhouse effect.

About 90 percent of this extra energy goes into the oceans. But meteorologist Roger Pielke Sr. of the University of Colorado in Boulder says he would like to understand why more heat is going into the deep ocean. “Until we understand how this fundamental shift in the climate system occurred,” says Pielke, “and if this change in vertical heat transfer really happened, and is not just due to the different areal coverage and data quality in the earlier years, we have a large gap in our understanding of the climate system.”

These large changes in ocean heat content reveal that the Earth’s surface is not a great place to look for a planetary energy imbalance. “This means this heat is not being sampled by the global average surface temperature trend,” Pielke says. “Since that metric is being used as the icon to report to policymakers on climate change, it illustrates a defect in using the two-dimensional field of surface temperature to diagnose global warming.”

A Shift in Aerosol Cooling?

Some have wondered if an increase in aerosols from China might be constraining the atmospheric warming.

Aerosols are small particles of smog, smoke, or dust that reflect sunlight before it reaches Earth’s surface. It’s long been known that sulphur dioxide aerosols from volcanic eruptions lead to a quick but temporary decrease in surface temperatures, as with the 1816 “Year Without a Summer” after the massive eruption of Indonesia’s Mt. Tambora a year earlier.

Indeed, an increase in aerosol emissions from the ramp-up of industrial production after World War II, at a time of few pollution controls, is thought to be at least partly responsible for the hiatus in the rise of world temperatures from about 1945 to 1975. And the current aerosol forcing is substantial — an estimated cooling of 1.6 Watts per square meter, compared with manmade greenhouse warming of about twice that (see Figure 1 in Hansen et al. 2011).

James Hansen, just retired from NASA, wrote recently:

The rapid growth of fossil fuel CO2 emissions in the past decade is mainly from increased coal use…mostly in China with little control of aerosol emissions. It is thus likely that there has been an increase in the negative (cooling) climate forcing by aerosols in the past decade, as suggested by regional aerosols measurements in the Far East, but until proper global aerosol monitoring is initiated, as discussed below, the aerosol portion of the amplified Faustian bargain remains largely unquantified.

However, a recent study by Daniel Murphy of NOAA’s Earth System Research Laboratory in Boulder, Colorado, found surprisingly little net change in aerosol forcing over the past decade. As air pollution shifted from the northern latitudes of the U.S. and Europe toward the equator in China and India, competing effects largely cancelled one another out: there is more sunlight nearer the equator, but the sun there is also more nearly overhead, so its light travels a shorter path through the atmosphere (giving it less opportunity to scatter off aerosol particles) and less of what does scatter is directed back upward.

Average trend in aerosol concentrations (optical depth) over the last decade. Blue areas have less polluting aerosols; red areas have more.*

Murphy actually found that in the past decade aerosol concentrations have increased the most in the Middle East at about 20 degrees North latitude, perhaps because of dust. Aerosol concentrations decreased around 40 degrees North and around 40 degrees South, with the latter probably brought about by winds that scatter sea salts.

He cautions that his result applies only to aerosols’ “direct effect” — their scattering of sunlight — and not to their many “indirect effects,” such as the role aerosols play as condensation sites for cloud formation. (The two effects are roughly comparable in magnitude.)

“The message is simple,” he says. “For the direct effect, it matters more how much total aerosols there are than where you put them around the Earth.”

No Expectation of a Steady Temperature Rise

“Our expectation has never been that each year would be inexorably warmer than the previous year,” says Ben Santer, a climate modeler at Lawrence Livermore National Laboratory.

It’s simply scientifically incorrect, he says, to attribute the divergence of climate model projections and observations to an overestimation of the climate sensitivity. Santer says he sees several explanations of why climate model projections of surface warming may be differing from actual observations in the past decade or so.

“It’s certainly the case that we got some of the forcings wrong,” he says of the factors that specify the influence of any particular component of the atmosphere. “It’s likely we underestimated the true volcanic aerosol forcing, and may have underestimated the cooling effect of stratospheric ozone depletion.”

And Santer prefers the temperature measurements of the lower troposphere (below about 10 kilometers in altitude) derived from microwave emissions measured by satellites. His reasoning: they have truly global coverage, and they are independent of complicating factors, like urban heat island effects, that influence thermometers at the surface.

Top: The probability that a modeled, unforced climate shows a trend larger than what is actually observed (as of 2011). This chance drops below 5 percent only for trend lengths of at least 17 years; in other words, only for time intervals approaching two decades can the observed trend be distinguished from natural noise with 95 percent confidence. Bottom: Signal-to-noise ratios for different timescales.**

“One would be foolish to rule out residual observational uncertainties given the history of the whole MSU [microwave sounding unit] saga,” he says, referring to the many years when scientists struggled to reconcile the surface and satellite measurements.

While the University of Alabama at Huntsville measurements were earlier biased low, the two satellite datasets have differed significantly in recent years, with the Remote Sensing Systems data showing essentially no warming in the lower troposphere over the past 15 years while UAH data show a statistically significant 0.12 C of warming in that time.

“There are very real and very large difficulties in generating coherent, homogeneous temperatures from two dozen drifting satellites,” Santer says.

Santer’s work in recent years has helped clarify how and when a climate signal can be found amid the noise of natural variability. Model studies found many 10-year periods that showed no surface warming, with even longer periods easily possible, much like there have been periods of several years where the Dow Jones Industrial Average has declined while moving higher over the decades. Santer’s group found that “temperature records of at least 17 years in length are required for identifying human effects on global-mean tropospheric temperature.”
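
The Dow Jones analogy can be made concrete with a small Monte Carlo experiment. Everything here (the trend size, the noise persistence) is an assumed toy parameter, not a value from Santer's study:

```python
import numpy as np

# Illustrative Monte Carlo: in a steadily warming series with persistent
# natural noise, 10-year windows quite often show no warming at all.
# Parameters are invented for illustration only.
rng = np.random.default_rng(7)
years = 60
trend = 0.015                 # deg C per year of underlying warming
trials = 500
flat, total = 0, 0
x10 = np.arange(10)
for _ in range(trials):
    noise = np.zeros(years)
    for k in range(1, years):  # AR(1) noise: "climate inertia"
        noise[k] = 0.6 * noise[k - 1] + rng.normal(0, 0.1)
    temps = trend * np.arange(years) + noise
    for start in range(years - 10):
        slope = np.polyfit(x10, temps[start:start + 10], 1)[0]
        flat += slope <= 0     # count decades with zero or negative trend
        total += 1
print(f"Fraction of 10-year windows showing no warming: {flat / total:.2f}")
```

Even with warming built in by construction, a substantial fraction of decade-long windows show flat or falling temperatures — which is why short records cannot rule the signal in or out.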

The bottom line, Santer says, is “there are multiple, not mutually exclusive interpretations of modeled versus observed differences, and claiming that there is only one explanation is not scientifically accurate.”

“We study the signal. If others want to study the noise, let them.”

So Is There Anything to Explain?

Why did the ocean start taking up so much heat around the turn of the century, and will it continue? Calling it a “surprising finding,” Judith Curry of the Georgia Institute of Technology asks “Is this real, or an artifact of the reanalysis process? We don’t know,” she says, expecting the debate to continue.

Nor is it clear that recent surface trends are particularly unusual. “The term ‘hiatus’ is premature,” says planetary climatologist Raymond Pierrehumbert of the University of Chicago. “Maybe with another 10 years of data you’d say that’s something that needs explanation here.”

Pierrehumbert notes that the increase in carbon dioxide’s radiative forcing over any one decade is about one-fourth of a Watt per square meter, so if climate sensitivity is 2 C per doubling of CO2, the expected warming is only about 0.13 C per decade (the forcing increase, multiplied by the sensitivity and divided by the roughly 3.7 Watts per square meter of forcing a doubling provides). That can easily be swamped by natural fluctuations of 0.2 to 0.3 C from an El Niño or La Niña, and by fluctuations from longer ocean cycles.
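
Pierrehumbert's back-of-envelope figure can be checked directly. The 3.7 Watts per square meter for a CO2 doubling is a standard value assumed here, not a number quoted in the article:

```python
# Back-of-envelope version of Pierrehumbert's decadal-warming estimate.
forcing_increase = 0.25     # W/m^2 of added CO2 forcing per decade
sensitivity = 2.0           # deg C of warming per CO2 doubling
forcing_per_doubling = 3.7  # W/m^2, standard value for a CO2 doubling

expected = forcing_increase * sensitivity / forcing_per_doubling
print(f"Expected warming per decade: {expected:.2f} C")
```

The result, roughly 0.13 to 0.14 C per decade, is indeed smaller than the 0.2 to 0.3 C swings an El Niño or La Niña can produce.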

And, he says, “There’s really nothing in this that changes our estimates of climate sensitivity.” Calculation of that all-important number from the 20th century record is not possible, because the aerosol forcing is not well known, nor are the data for ocean warming up to the task.

“Any estimate of sensitivity requires all of the record and not just the last 20 years of it,” Pierrehumbert says. “The smaller the piece of it you take, the less certainty you have in your result.”

Nonetheless, he agrees that earlier warming may have been deceiving.

“I think it’s true that some rather sloppy discussion of the rapid warming from the 20th century has given people unrealistic expectations about the future course of warming.”

All the same, the warming effect of carbon dioxide is far down his list of topics that need further examination.

“Why would anyone seriously question greenhouse gases?” he asks. “They absolutely have a radiative effect, and no serious scientist thinks climate sensitivity could be much lower than 2 degrees Celsius based on the balance of the evidence.”

As noted above, with carbon emissions increasing exponentially, even a somewhat lower climate sensitivity simply does not buy much time to avoid a set level of warming.


Global warming — about 0.8 C or 1.5 F on the surface since the Industrial Revolution, with about twice this already committed to appearing in the future — is just getting started. Scientists have largely succeeded in digging its signal out of the climate noise, but that signal is still not obvious to many.

There have been hiatus periods in the past — from about 1945 to 1975, and slowdowns in 15-year warming rates around 1994-1995 — and there will likely be more in the future. Such hiatus periods are entirely consistent with human-caused warming superimposed on natural variability, and they are no reason at all to be lulled into complacency.

*Reprinted by permission from Macmillan Publishers Ltd: D. M. Murphy, “Little net clear-sky radiative forcing from recent regional redistribution of aerosols,” Nature Geoscience 6, 258–262, copyright (2013).
**Source: B. D. Santer et al., “Separating signal and noise in atmospheric temperature changes: The importance of timescale,” Journal of Geophysical Research 116, D22105, doi:10.1029/2011JD016263, 2011. Used with permission.

*The headline of this piece was edited May 9 to bracket the letter “h” in “Wither,” clarifying the deliberate intent at a double meaning.

The Temperature Lately

What has been the actual warming over the past 15 years? The answer has several layers of complexity.

Of the three most cited datasets that estimate average surface temperature, all show warming over the last 15 years (180 months):

GISS: 0.11 C
HadCRUT4: 0.07 C
NCDC: 0.07 C

while the change in the temperature of the lower troposphere over the same period is

RSS: -0.03 C
UAH: 0.12 C

These numbers are estimates based on fitting a straight line through the data, and each comes with an uncertainty that indicates how far the estimate may lie from the “true” value. The simplest (but naive) estimates of these uncertainties are:

GISS: 0.11 ± 0.06 C
HadCRUT4: 0.07 ± 0.06 C
NCDC: 0.07 ± 0.06 C

RSS: -0.03 ± 0.08 C
UAH: 0.12 ± 0.08 C

Here the uncertainties give the bounds of the 95 percent confidence level — that is, we can be 95 percent sure that the “true” amount of warming as measured by GISS is between 0.05 C and 0.17 C.
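
For readers who want to see where such naive bounds come from, here is a sketch using an ordinary least-squares fit on synthetic monthly data (a stand-in series with an assumed noise level, not the actual GISS record):

```python
import numpy as np

# Naive 95% confidence interval for a 15-year trend in monthly anomalies,
# treating every month as independent. The data are a synthetic stand-in.
rng = np.random.default_rng(0)
n = 180                                       # 15 years of monthly values
months = np.arange(n)
temps = (0.11 / n) * months + rng.normal(0, 0.1, n)  # ~0.11 C over 15 yr

# Ordinary least-squares fit: intercept and slope
X = np.column_stack([np.ones(n), months])
beta, *_ = np.linalg.lstsq(X, temps, rcond=None)
resid = temps - X @ beta
se_slope = np.sqrt((resid @ resid) / (n - 2)
                   / np.sum((months - months.mean()) ** 2))

warming = beta[1] * n                         # total change over the period
ci95 = 1.96 * se_slope * n                    # naive 95 percent bound
print(f"warming = {warming:.2f} +/- {ci95:.2f} C")
```

With noise of roughly the size seen in monthly anomalies, the naive bound comes out near the ±0.06 C quoted above — and, as the sidebar goes on to explain, it is far too optimistic once autocorrelation is considered.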

But this method is “naive” because it assumes each month’s temperature is independent of those before and after it. In reality the climate system has a great deal of inertia, and any given month is more likely to be warm (or cold) if the previous month was warm (or cold). This “autocorrelation” increases the uncertainties significantly, because there are in effect many fewer independent data points.

The calculation of uncertainties in the presence of autocorrelation is mathematically involved. It was adapted for climate science about 13 years ago; the blog Skeptical Science has a useful calculating tool based on the method of Grant Foster and Stefan Rahmstorf. It shows that the uncertainty that accounts for autocorrelation is, for a relatively short interval like 15 years, several times higher than the naive uncertainty.

For example, including autocorrelation gives, for the GISS data, a 15-year trend of 0.07 ± 0.14 C per decade, so the warming over that period is 0.11 ± 0.21 C.

The large uncertainty simply shows that very little can be said about the “true” warming over such a short period, once climate inertia is properly considered. In fact, in this case the autocorrelation is so strong that the GISS data effectively have only about 11 independent degrees of freedom, instead of the naive 180 (= 15 × 12) present in 15 years’ worth of monthly data points.
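
The effective-degrees-of-freedom idea can be sketched with the simplest lag-1 correction; Foster and Rahmstorf's actual method is a more careful ARMA treatment, so this is an illustration of the concept, not their calculation:

```python
import numpy as np

# Simplest effective-sample-size correction for lag-1 autocorrelation:
# n_eff = n * (1 - rho) / (1 + rho), where rho is the lag-1 correlation.
def effective_n(series: np.ndarray) -> float:
    x = series - series.mean()
    rho = (x[:-1] @ x[1:]) / (x @ x)   # lag-1 autocorrelation estimate
    return len(series) * (1 - rho) / (1 + rho)

# A strongly persistent synthetic series: each month remembers the last
rng = np.random.default_rng(1)
x = np.zeros(180)
for k in range(1, 180):
    x[k] = 0.9 * x[k - 1] + rng.normal(0, 0.05)

print(f"Effective independent points: {effective_n(x):.0f} of 180")
```

For a series this persistent, 180 monthly values collapse to only a handful of effectively independent points, which is why the autocorrelation-aware uncertainty is several times the naive one.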

Uncertainties from the statistics, from climate inertia, and from climate noise all show that 15 years is simply too short a period on which to base judgments about the climate.

So the lesson of the mathematics is that 15 years is simply too short an interval from which to draw statistically valid conclusions: trends this short are heavily influenced by “end point” effects. This is why most climate scientists prefer to draw their conclusions from 30 years’ worth of data or more.


A regular contributor to Yale Climate Connections since 2012, David Appell, Ph.D., is a freelance writer living in Salem, Oregon, specializing in the physical sciences, technology, and the environment. His...