Is today a warmer-than-average day? That seemingly basic question has become fraught in the era of the Anthropocene, as greenhouse gases emitted during a century-plus of fossil fuel use have been warming most everything on our planet.

The most common U.S. yardstick for determining whether a given day is unusually warm or cool – NOAA’s 30-year database of “climate normals” – is now being updated. (Update: The new normals were released on May 4, 2021.) Over the past decade, the averages for the U.S. and many other nations have been based on data from 1981-2010. Now these averages are being revised across the world to reflect the just-ended period of 1991-2020.

Contiguous U.S. average temperatures, 1895-2020
As the 30-year normal period for the contiguous U.S. shifts from 1981-2010 (blue box) to 1991-2020 (red box), the 1980s will be dropped and the considerably warmer 2010s will be added. (Image credit: NOAA/NCEI)

As one would expect, the new norms will be warmer in most parts of the world, though not everywhere. And that raises big questions about what qualifies as “normal”. (Note that we’re using “normal” as a proxy for average; it’s actually “normal” for weather to vacillate rather than sticking to the statistical norm day after day.)

Change in climate normals
Based on this preliminary map, nearly all of the world was warmer in 1991-2020 than in 1981-2010. One of the few exceptions is the north central U.S. and south central Canada. (Image credit: Brian Brettschneider, University of Alaska Fairbanks, using data from the Global Historical Climatology Network)

For the contiguous U.S., the period 1991-2020 was roughly 0.44°F warmer and 0.34 inches wetter than 1981-2010, based on national-scale data published by NOAA’s National Centers for Environmental Information (NCEI). The final station-by-station data are scheduled to be released this spring.

The warming has been even more dramatic in Alaska. The coverage of “tundra” climate – one where the warmest month averages less than 50°F – decreased by more than 9% in the updated Alaska norms, according to climatologist Brian Brettschneider, of the University of Alaska Fairbanks.

The mechanism behind these and other changes is simply the lopping off of one decade, the 1980s, and the addition of another decade, the 2010s. Such a shift can have surprisingly large effects, especially if the decades added and dropped included any unusually warm or cold periods that were particularly sharp and/or prolonged.
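
To make the mechanics concrete, here is a minimal sketch (in Python, using made-up annual temperatures rather than NOAA data) of how swapping out the 1980s for the 2010s shifts a 30-year average:

```python
# Illustrative sketch (not NOAA's code): how swapping one decade shifts a
# 30-year "normal." The annual values below are fabricated for demonstration.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1981, 2021)
# Made-up annual mean temperatures (deg F) with a gentle warming trend plus noise
annual_temp = 52.0 + 0.05 * (years - 1981) + rng.normal(0, 0.8, years.size)

def normal_for(start, end):
    """Average the annual values over an inclusive 30-year window."""
    mask = (years >= start) & (years <= end)
    return annual_temp[mask].mean()

old_normal = normal_for(1981, 2010)
new_normal = normal_for(1991, 2020)
print(f"1981-2010 normal: {old_normal:.2f} F")
print(f"1991-2020 normal: {new_normal:.2f} F")
print(f"Shift from dropping the 1980s and adding the 2010s: {new_normal - old_normal:+.2f} F")
```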

For example, the coldest December on record for the contiguous U.S. was in 1983, and the warmest June was in 2016.

Difference in precipitation between the 1991-2020 and 1981-2010 periods
Most parts of the United States east of the Rockies were slightly wetter in 1991-2020 than in 1981-2010, while the Southwest was notably drier. (Image credit: Brian Brettschneider, based on data from the PRISM Climate Group/Oregon State University)

Some of the changes reflect natural decade-to-decade variability. That may be the case for the area from Montana and the Dakotas into southern Canada, which is coming in slightly cooler in the new norms (see map above) despite longer-term expectations of warming.

Other shifts, such as moistening across much of the northern tier of states and widespread warming in the Southwest, are more in line with long-term climate-change projections. Heating across the Southwest is leading to “hot droughts” that add to the wildfire threat.

In most places, the bulk of the warming has occurred at night, again in line with long-held expectations. In Minneapolis, for example, one early estimate found that daily highs were about 0.2°F warmer in the new climatological period, but daily lows were about 0.9°F warmer.

Likewise, University of Georgia climatologist Pam Knox described her state’s temperature changes in an email: “Generally getting warmer, with more increases in minimum temperature than maximum temperatures (related to higher humidity and urbanization).”

Globally, the new period was about 0.18°C (0.32°F) warmer in the NOAA database and about 0.19°C (0.34°F) warmer in the NASA database. Differences in how data-sparse areas are handled, particularly the Arctic, account for most of the tiny difference between the NOAA and NASA datasets.

If this pace of warming continues for another several decades, on top of the warming that had already occurred by 1981-2010, it would push global climate well past the Paris Climate Agreement benchmark of 1.5°C of warming over preindustrial times.
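
As a rough back-of-the-envelope check (the warming-to-date figure below is an assumed round number, not a value from the article or from NOAA/NASA), continuing at roughly 0.19°C per decade of baseline shift closes the remaining gap to 1.5°C within a few decades:

```python
# Back-of-the-envelope sketch of the point above; "already_warmed" is an
# assumed round number for illustration, not a figure from the article.
decadal_rate = 0.19      # deg C of warming per decade, from the NOAA/NASA baseline shift
already_warmed = 0.9     # assumed warming (deg C) above preindustrial for 1991-2020
paris_benchmark = 1.5    # deg C

decades_needed = (paris_benchmark - already_warmed) / decadal_rate
print(f"At ~{decadal_rate} C per decade, the 1.5 C benchmark is passed in about "
      f"{decades_needed:.1f} more decades.")
```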

Finding the best guide to the future

The practice of using 30-year periods to describe climatology began just after World War II, when the forerunner of the UN-based World Meteorological Organization (WMO) asked each nation to calculate climate averages for the period 1901-1930. Those were followed by 1931-1960 norms. More recently, many nations, including the U.S., have been updating their 30-year climate norms each decade.

The idea is that a 30-year period is long enough for most annual and multiyear climate variations to get canceled out.

The 30-year norms have a far-flung influence on the nation’s economy. While about half of U.S. states have decoupled energy demand from their rate-setting practices, others rely heavily on the NOAA 30-year averages as a guide to future heating and cooling demands when setting rates. So it’s crucial that the data be as accurate as possible.

Global heating, however, throws a wrench into what “normal” means. For one thing, do the past 30 years really represent what we can expect in the 2020s? Researchers at NOAA and elsewhere have been exploring alternatives. One is an “optimal climate normal”: It’s developed by finding the length of the averaging period that best predicts the following year at each location.
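
Here is a bare-bones illustration of that idea (not NOAA's actual algorithm, and using fabricated station data): for each candidate window length, the trailing average "predicts" the next year, and the window with the smallest average error wins.

```python
# Minimal sketch of the "optimal climate normal" concept, not NOAA's method.
# Annual temperatures are fabricated for demonstration.
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1961, 2021)
annual_temp = 50.0 + 0.04 * (years - 1961) + rng.normal(0, 1.0, years.size)

def hindcast_error(window):
    """Mean absolute error when the trailing `window`-year average predicts the next year."""
    errors = []
    for i in range(window, len(annual_temp)):
        prediction = annual_temp[i - window:i].mean()
        errors.append(abs(annual_temp[i] - prediction))
    return np.mean(errors)

candidate_windows = range(5, 31)
best = min(candidate_windows, key=hindcast_error)
print(f"Best-performing averaging period for this made-up station: {best} years")
```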

Along these lines, NOAA has posted supplemental monthly temperature normals for hundreds of U.S. stations. These include 5-, 10-, 15-, and 20-year averages through 2010, along with optimal climate normals tailored for each station.

Several studies have found that a 15-year average tends to perform best in the current climate regime. With this in mind, NOAA will soon replace its variety of supplemental norms with 15-year averages, this time not just for temperature but for a whole gamut of weather conditions. “A single alternative based on our recent research on optimal normals will provide a readily acceptable substitute to meet the needs of a variety of user communities,” said climate scientist Michael Palecki, who is managing the climate normals project at NOAA/NCEI.

‘Normalizing’ human-caused warming

Each update of climate normals brings up another question: Does this practice serve as an inadvertent smokescreen, one that keeps us from fully seeing the relentless march of human-caused warming?

Some observers think so. “By updating every decade, this version of ‘normal’ hides a lot of change, and like the allegorical frog it makes it harder to notice the hot water we are in,” said Robert Rohde, lead scientist for the Berkeley Earth project, on Twitter.

In Miami, for example, December 2020 had an average temperature of 69.4°F. That was 1.1°F below the 1981-2010 climate norm – but when compared to the climate way back in 1931-1960, it was 1.3°F above average. Although urban heat island effects have contributed to some of the temperature rise in cities like Miami, rural areas are warming too.
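
The arithmetic behind that comparison (the input values come from the paragraph above; the subtraction is ours) shows how much change a baseline update can absorb:

```python
# Quick arithmetic behind the Miami example; the three input values are from the article.
dec_2020_avg = 69.4                      # deg F, Miami, December 2020
norm_1981_2010 = dec_2020_avg + 1.1      # the month ran 1.1 F below this norm -> 70.5 F
norm_1931_1960 = dec_2020_avg - 1.3      # but 1.3 F above this older norm -> 68.1 F
print(f"1981-2010 December norm: {norm_1981_2010:.1f} F")
print(f"1931-1960 December norm: {norm_1931_1960:.1f} F")
print(f"Shift absorbed by updating the baseline: {norm_1981_2010 - norm_1931_1960:.1f} F")
```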


There’s clearly a need for updated climate norms. For some purposes, it’s most important and relevant to know where a particular day stands relative to the present-day climate rather than to the climate of decades past.

Updated norms are also crucial to how certain phenomena are diagnosed. For example, NOAA and other agencies define El Niño and La Niña based on sea surface temperatures over the central and eastern Pacific Ocean, and whether those are running warmer or cooler than the seasonally adjusted average. If a fixed climate baseline were used, then long-term warming would eventually (and erroneously) make it look as if a permanent El Niño event were in place.
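
A toy illustration of that pitfall (with fabricated sea surface temperatures, not NOAA's actual Oceanic Niño Index procedure): anomalies measured against a fixed mid-century baseline drift steadily upward, while anomalies against periodically updated 30-year baselines stay roughly centered.

```python
# Illustrative sketch only: with a warming trend, anomalies against a fixed
# baseline drift upward, mimicking a permanent El Nino, while anomalies against
# updated 30-year baselines stay centered. SST values are fabricated.
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(1951, 2021)
sst = 26.5 + 0.01 * (years - 1951) + rng.normal(0, 0.5, years.size)  # deg C, made up

fixed_base = sst[(years >= 1951) & (years <= 1980)].mean()

def rolling_base(year):
    """Most recent completed 30-year period ending in a year divisible by 10."""
    end = (year - 1) // 10 * 10
    start = end - 29
    return sst[(years >= start) & (years <= end)].mean()

recent = years >= 2011
fixed_anom = sst[recent] - fixed_base
rolling_anom = np.array([sst[years == y][0] - rolling_base(y) for y in years[recent]])
print(f"2011-2020 mean anomaly vs fixed 1951-1980 baseline:  {fixed_anom.mean():+.2f} C")
print(f"2011-2020 mean anomaly vs updated 30-year baselines: {rolling_anom.mean():+.2f} C")
```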

For such reasons, it’s best to see a fixed benchmark not as a replacement for the regular updates, but rather as an alternate measure that could be maintained and used to illustrate how far climate has strayed as a result of human activity.

What period would be best to use as a pre-climate-change benchmark? One option is 1951-1980. That’s the span used by NASA’s Goddard Institute for Space Studies (GISS) when calculating its monthly and yearly departures from average global temperature.

“The primary focus of the GISS analysis are long-term temperature changes over many decades and centuries, and a fixed base period yields anomalies that are consistent over time,” says GISS on its website. “However, organizations like the [National Weather Service], who are more focused on current weather conditions, work with a time frame of days, weeks, or at most a few years. In that situation it makes sense to move the base period occasionally, i.e., to pick a new ‘normal’ so that roughly half the data of interest are above normal and half below.”

According to New Jersey state climatologist David Robinson of Rutgers University, as cited in an online column in Forbes by Marshall Shepherd, of the University of Georgia, “the 1951-1980 period is a viable candidate [for a fixed benchmark], as it is an era with abundant observations and precedes recent decades where the signal of human-induced warming has emerged from a naturally noisy climate signal.”

Global temperatures from 1951 to 1980 were among the lowest of the century. Some natural variability was in the mix, but climate model reconstructions suggest that the main culprit was the industrial boom that came after World War II and the resulting sun-blocking aerosols that were being spewed at a furious pace by North America and Europe.

Later in the century, growing environmental awareness and regulation helped reduce the prevalence of those sun-blocking aerosols, but increases in invisible greenhouse-gas emissions continued apace, eventually putting the world on a sustained warming track.

Another option for a fixed benchmark is 1961-1990. That interval includes the relatively cool 1960s and 1970s and also the 1980s, when global warming began to manifest in earnest. Because the 1961-1990 period is also the last universal benchmark mandated by the WMO, those numbers are readily available across the globe.

In fact, the WMO now plans to preserve data for the period 1961-1990 as a reference period for climate change assessment.

“In a world in which the climate is changing rapidly, we need to update the climate normals more frequently than we did in the past to keep them useful,” said NOAA/NCEI principal scientist Thomas Peterson, president of the WMO Commission for Climatology, in a 2015 statement.

“At the same time, we need to keep the historical baseline for the sake of public and scientific understanding about the rate of climate change.”

Of course, the atmosphere doesn’t care how we measure its warming. The absolute temperature increase is the same no matter what we use to assess “above” and “below” normal. Still, there’s a case to be made for keeping two benchmarks: one fixed and one dynamic, each serving a different purpose.

Or, to invoke the venerable British phrase, horses for courses.


Bob Henson is a meteorologist and journalist based in Boulder, Colorado. He has written on weather and climate for the National Center for Atmospheric Research, Weather Underground, and many freelance...

21 replies on “Updated yardstick begs question: What’s ‘normal’ in a changing climate?”

  1. Since the ‘normal’ is no longer a constant, it is time to up the game. A linear regression over 30 years seems the appropriate standard for ‘normal’! This can be easily applied in any case with a 30-year history of numeric data. Deviations of 2 or 3 standard deviations then become the interesting events. Simple linear extrapolations to unsupportable or impossible conditions become straightforward. (prefaced with – ‘If things continue as they are…’)
    The assumption of a static world does not work to describe a world driven by overpopulation!

  2. Those who say global temperature continues to increase cannot answer why these records still hold. 25 states have record high temperatures from the period 1930-1937. I will even list them: Arkansas, Colorado, Delaware, Florida, Hawaii, Idaho, Indiana, Iowa, Kansas, Kentucky, Louisiana, Michigan, Mississippi, Montana, Nebraska, New Jersey, North Dakota, Ohio, Oklahoma, Pennsylvania, South Dakota, Tennessee, Texas, West Virginia & Wisconsin adds up to 25 states whose record HIGHS have all occurred between 1930 & 1936.
     
    These 13 states listed below had their record highs occur BEFORE the 1930-1936 period: Alaska, California, Connecticut, Maine, Maryland, Minnesota, Montana, New Hampshire, New York, Oregon, Vermont, Virginia & Washington.
     
    A total of fourteen states set record highs in 1936 that obviously still stand. They occurred from July 5th to August 10th, although July 10th alone has four of the high-temperature records, including Maryland, which tied the 109°F record set on July 3, 1898; the rest that set records that day are New Jersey, Penn. & Virginia.
    The above information came from this source: http://www.ncdc.noaa.gov/extremes/scec/records
     

    1. That was a time period when fossil fuel use effects were already visible, but when there were still fewer aerosols. And state record highs are still a very cherry-picked measure of heat. Now that aerosols are being reduced, why not look at overall extremes in maximum temperature? Same data source…looking at more data. (Though just looking at the US is still cherry picking.)

  3. It is good to look at 30-year increments, but a lot can be learned from looking at millennial increments. Yes, that cannot include ‘recorded’ data, but core samples are fairly accurate in estimating temperature and carbon. See Richard Wolfson’s lectures on “Earth’s Changing Climate”.

  4. I mostly agree with Texas Hill County. I think the 20th century average is best, and people can see how much higher temperatures have risen. However, recent trailing 30-year averages are also useful for people living in the current day, and necessary for farmers and others. I think the frog in the pot is a good analogy for how the general public perceives climate change.

    The frog in the boiling pot is an old chestnut.

    German physiologist Friedrich Goltz demonstrated that a frog that had its brain removed will remain in slowly heated water, but an intact frog attempts to escape the water when it reaches 25°C (from Wikipedia).

  5. For print/online weather reports, I would suggest a “historic” value, running from the beginning of regular measurements (which will vary by location) through, say, 1950; a “recent average” value based on a recent 30-year period as discussed here; and a “current average” of measurements since the end of the 30-year period, along with the record values as is commonly done now (as by Weather Underground). Measurements between 1950 and the “recent average” could be reported as a separate average, or just ignored. Providing the “historic” values (for those who care to look at them) makes it easier to see, in effect, what we have lost and, with the “recent average” values, how quickly it is happening.

  6. Different purposes need different measuring sticks, of course, but I’ve long wondered why “normal” is couched as a single number. Wouldn’t it be more accurate to define “normal” as, say, +/- one standard deviation? Then on any given day, the temperature could be described as being within the “normal” range, or outside it moderately, heavily, extremely, etc. based on how many standard deviations the variation represented. Just to be clear, I am not proposing this method for the daily TV weather report – the general public wouldn’t understand it. Also, I understand why, say, a single number “benchmark” is better for some kinds of longer-term analysis. But in terms of whether or not a temperature is “normal”, standard deviations seem like the way to go. I would definitely be interested in hearing “cons” from perspectives I haven’t thought about.

  7. Thank you, Mr. Henson, for an outstanding break-down of the differences between the time-regimes and potential regional issues. My personal regional “lay” test-bed for warming issues is the State of Florida, having lived in South Florida for 40 years and now in North Florida for the past 20 (and traveling back and forth between the two frequently). Florida is a unique warming test-bed unto itself because of how large/long the State is (a 400-mile or so differential from top to bottom), continental cold-front access in N Florida, and S/Central Florida being surrounded by water (Gulf and Atlantic) on both sides, which makes it subject to warmer flow off the waters as SSTs warm.

    My personal bottom line from personal observations, and particularly in the most recent 20-year stretch, is that S/Central Florida is clearly getting warmer (longer Summers with minimal Winter cold shots), North Florida is having much milder Winters since I moved here in 2001, with Spring temp shots sometimes as early as January and February in many instances, and sea levels in S Florida (having fished the entire coastline down there from the 1970’s to the present when I visit the coast now) are clearly rising (much more flooding during Spring Tides and even many regular high tides). Point being that the State of Florida, with these unique geographical issues, is on the front line of the global warming equation and most “lay” residents here can “feel it” and see it on any given day in any given season.

    1. WxManWannaBe maintains: “…and sea levels in S Florida (having fished the entire coastline down there from the 1970’s to the present when I visit the coast now) are clearly rising (much more flooding during Spring Tides and even many regular high tides). Point being that the State of Florida, with these unique geographical issues, is on the front line of the global warming equation and most “lay” residents here can “feel it” and see it on any given day in any given season.”
       
      Just a short way down the coast at Miami Beach we see this projection made by NOAA. Can most “lay” residents “feel it” and see it on any given day in any given season when the sea level rise is a change of 0.78 feet in 100 years?
      Mean Sea Level Trend
      8723170 Miami Beach, Florida
      The mean sea level trend is 2.39 millimeters/year with a 95% confidence interval of +/- 0.43 mm/yr based on monthly mean sea level data from 1931 to 1981, which is equivalent to a change of 0.78 feet in 100 years.
      http://tidesandcurrents.noaa.gov/sltrends/sltrends_station.shtml?stnid=8723170
       
      This is the report and projections for what should be one of the most sensitive areas for sea level rise, Key West, Florida.
      Mean Sea Level Trend
      8724580 Key West, Florida
      The mean sea level trend is 2.37 millimeters/year with a 95% confidence interval of +/- 0.15 mm/yr based on monthly mean sea level data from 1913 to 2015, which is equivalent to a change of 0.78 feet in 100 years.
      http://tidesandcurrents.noaa.gov/sltrends/sltrends_station.shtml?stnid=8724580 

  8. Thank you. This has been bothering me, as people’s memories appear to be shortening with the rise of the internet as well. [Few of them appear to take into account an age before television (1950s), let alone the internet (80s/90s), and the rise of mobile communication devices.] Raising the baseline distorts the comparison, so this is an important issue. Our access to more information is accompanied by more access to disinformation as well.

    1. Thanks for a thought-provoking article!

      I propose getting rid of the terminology of ‘normal’ altogether. If humans are significantly impacting something, can it really be said to be ‘normal’?

      For short-term weather reports and forecasts, as the article states, and in line with the National Weather Service, “it makes sense to move the base period occasionally, i.e., to pick a new ‘normal.’” But maybe let’s call them ‘expected’ temperatures instead of ‘normal’. Pick whatever moving base you prefer of 15-30 years, and geographic area of interest, large or small.

      But for purposes of really emphasizing climate change, why get stuck with moving 30-year intervals? Why not use the entire (and non-moving) century of 1900-2000? (I presume there are not enough reliable temperature records prior to the start of the 20th century to include earlier decades.) The first figure in this article makes that century look like a pretty good long-term average, at least for the U.S., and before significant warming occurred. Use the entire 20th century as your climate benchmark. Let’s call that ‘normal’ if you must, or alternatively the ‘pre-warming average’ or something similar. I’m a geologist, btw, not a climate scientist, so I may be way off in my thinking, which tends towards millions or even billions of years.

      1. Thanks sir Henson. The dang sun drove into a cluttered area of space it seems, it is sort of a cluster ,,, problem , we finally get to see, and, well, those that can, what a show. I would say that harping repetitive suggestion by homeland and fema, should be followed, they ask that we have extra food and water. And those that read your fine work should consider where they live in relation to your findings. Moving might be called for.
