How Fast is the World Warming? | Open Mind
Posted on May 28, 2025

The preprint of my latest (as yet unpublished) paper with Stefan Rahmstorf has attracted some attention, because in the final paragraphs we say:
“In conclusion, removing the best estimate of the influence of three natural variability factors on global temperature reduces the noise level of the data sufficiently to reveal a large and significant acceleration of global warming. The most important insight from these adjusted data is that there is no longer any doubt regarding a recent increase in the warming rate.
Although the world may not continue warming at such a fast pace, it could conceivably continue accelerating to even faster rates. But this much is clear: if the ending value of the smoothed version of adjusted data (either the lowess smooth or PLF10) is extrapolated into the future by the estimated rate over the last decade, it will exceed the 1.5°C limit in late 2026 for most data sets.”
We mentioned extrapolating the final rate estimate into the future (albeit only a few years), and that it is conceivable it might continue accelerating. Some think this was ill-advised, given that the central estimate for the most recent rate according to the given analyses is about 0.4°C/decade, quite a bit higher than expected and higher than most people believe. Our paper, it is said, will encourage some to indulge in doom & gloom, others to ridicule climate science.
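The extrapolation itself is elementary: with a smoothed ending value and a final rate in hand, the crossing time follows directly. A toy sketch, with hypothetical placeholder numbers rather than values from the paper:

```python
# Linear extrapolation of the smoothed, adjusted series to the 1.5 °C limit.
# All numbers below are hypothetical placeholders, not results from the paper.
y_end = 1.42    # smoothed adjusted anomaly at end of data (°C above pre-industrial)
t_end = 2025.0  # time of the last smoothed value (decimal year)
rate = 0.043    # estimated warming rate over the last decade (°C per year)

t_cross = t_end + (1.5 - y_end) / rate
print(f"1.5 °C crossed around {t_cross:.1f}")  # about 2026.9 with these numbers
```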
I’m not afraid of doomerism or of ridicule, but if the issue becomes the focus of the discussion it might distract from what I consider the truly important message: that there is no longer any doubt regarding a recent increase in the warming rate. I think we need to say it, in print, and attract attention to it.
I’m hardly the first to suggest this, but in my opinion, until now the scientific community has been dragging its feet accepting the truth of it. The only prominent climate scientist I know of who has promoted the idea (although there could well be others, I’m not as up-to-date as I’d like) is James Hansen, in part because he believes the reduction in sulfate pollution from shipping fuels has decreased its cooling effect in a noticeable way. But there has been resistance to Hansen’s ideas, and perhaps some think of him as “over the hill.” I can’t help but remember how many times he has made surprising claims which inspire skepticism, only to end up hitting the nail squarely on the head.
The extreme heat of the years 2023 and 2024, all by itself, pushed the idea of accelerated warming. Yes, there was an El Niño involved, so we saw some increase coming, but what we got was much more than expected. Everybody talked about it; some estimated how much El Niño contributed to the warming, only to conclude that it's not enough to account for how hot the last few years have been. Gavin Schmidt appeared on a podcast with Neil deGrasse Tyson to discuss how computer models didn't prepare us for this. And who can forget Zeke Hausfather's immortal words: "gobsmackingly bananas."
We analyzed what are probably the five best-known data sets for global mean surface temperature anomaly (GMST): from NASA, NOAA, HadCRU, Berkeley, and the ERA5 reanalysis. ERA5 shows the most rapid warming, HadCRU the least, but the results for all five are very similar. I'll use the Berkeley data to illustrate; here are yearly averages since 1946:

[Figure: Berkeley Earth yearly average GMST anomaly, 1946 to present]
How do we estimate the rate of change? We fit a statistical model to the data. We might, for instance, use a model consisting of a straight line plus noise (note: we actually need a model for the noise as well as the signal, but this post isn’t about details, just the general outline of the process).
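As a toy illustration of that simplest model, here's an ordinary least squares fit of a straight line to fake data (and the parenthetical caveat is real: naive OLS uncertainties assume white noise):

```python
import numpy as np

# The simplest model: y = a + b*t + noise, fit by ordinary least squares.
# t, y are hypothetical stand-ins for decimal years and temperature anomalies.
rng = np.random.default_rng(0)
t = np.arange(1946.0, 2025.0)
y = 0.02 * (t - 1946.0) + rng.normal(0.0, 0.1, t.size)  # fake data for illustration

X = np.column_stack([np.ones_like(t), t])
(a, b), *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"trend: {b:.4f} °C/yr")
# Caveat: OLS standard errors assume white noise; real temperature noise is
# autocorrelated, which inflates the true trend uncertainty.
```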
I usually proceed by fitting a statistical model which allows for a very general pattern. One is the lowess smooth, a popular method, although it doesn't usually include calculation of the rate of change of a time series, only an estimate of its value. I wrote my own program specifically so I could include calculation of the rate of change.
But there is a catch: a lowess smooth requires choosing a "scale" on which to do the analysis. You do this by specifying the fraction of data points included in each local regression. If you specify a small fraction, only very close neighbors are used when estimating the value at a given time; with a large fraction, each momentary estimate considers most (if not all) of the data. Since the global temperature time series are evenly sampled (the same amount of time between successive data values), choosing a fraction of the data is equivalent to choosing a time scale for the lowess smooth.
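As a rough sketch of the idea, here's an off-the-shelf lowess (from statsmodels) with the rate approximated by numerically differentiating the smooth; my own program computes the rate within the fit itself:

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

def lowess_rate(t, y, window_years=15):
    """Lowess smooth plus a numerically differentiated rate of change.

    For evenly sampled yearly data, the fraction of points used in each
    local regression is just the window length over the record length.
    """
    frac = min(1.0, window_years / (t[-1] - t[0]))
    smooth = lowess(y, t, frac=frac, return_sorted=False)
    rate = np.gradient(smooth, t)  # °C per year
    return smooth, rate
```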
Another method I often use is the continuous piecewise linear fit (PLF, a.k.a. linear spline) with "knots" (moments when the slope changes) regularly spaced throughout the time span. The model is a set of straight lines, one for each interval, which meet at their endpoints to make a continuous curve: the slope (the rate of increase) remains constant within each interval, but can change from one interval to the next.
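In code, the PLF is just linear regression on a "hinge" basis: one column for time itself, plus a column max(0, t − knot) for each knot. A minimal sketch of that construction (my own shorthand, not the paper's code):

```python
import numpy as np

def plf_fit(t, y, knots):
    """Continuous piecewise linear fit: regression on a 'hinge' basis,
    so the slope is constant within intervals, changing only at knots.
    t: decimal years (float array); y: anomalies."""
    cols = [np.ones_like(t), t] + [np.maximum(0.0, t - k) for k in knots]
    X = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    fitted = X @ coef
    # Slope in each interval = base slope plus the hinge coefficients
    # of all knots already passed.
    slopes = np.cumsum(np.concatenate([[coef[1]], coef[2:]]))
    return fitted, slopes

# e.g. plf_fit(t, y, knots=[1965, 1980, 1995, 2010]) gives the 15-year
# segmentation described below, with an extra-long first segment.
```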
Both models fit the data better than a single straight line, and both indicate that the warming rate has changed over time. I started with a time scale of 15 years, so that the last leg of the PLF fit would estimate the trend over the span from 2010 to the present, coinciding with the period Hansen indicated for accelerated warming. I'll show the rate according to both methods on the same graph, from lowess in red (with pink shading for the 2σ uncertainty range), from PLF in blue (light blue shading for its uncertainty range):

[Figure: estimated warming rate from lowess (red, pink 2σ band) and PLF (blue, light blue band)]
Note that the PLF doesn’t estimate the rate at single moments, but the average rate during each interval. Note also that not all segments are 15 years because the total time span is not a multiple of 15; I allowed the first to be extra-long so the rest would all be 15.
Clearly the warming rate has not been constant over time; in particular, the rate before about 1970 was significantly smaller than after, so there are at least two distinct episodes of different rate, and there may be even more change than just that. There is also evidence that the final leg of the journey (2010–2025) is faster still, but when the idea is tested rigorously it doesn't quite reach the 95% confidence of the usual standard for "statistical significance," though it's close.
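A standard way to run such a test is an F-test on nested linear models: does allowing the extra slope change reduce the residual variance by more than chance would? A minimal sketch, assuming white noise (autocorrelation makes significance harder to reach, not easier); I won't claim it's exactly the test in the paper, but the idea is the same:

```python
import numpy as np
from scipy import stats

def f_test_nested(y, fit_small, fit_big, p_small, p_big):
    """F-test for nested linear models: does adding a slope change help?
    fit_small/fit_big: fitted values; p_small/p_big: parameter counts."""
    n = len(y)
    rss_small = np.sum((y - fit_small) ** 2)
    rss_big = np.sum((y - fit_big) ** 2)
    df1, df2 = p_big - p_small, n - p_big
    F = ((rss_small - rss_big) / df1) / (rss_big / df2)
    return F, stats.f.sf(F, df1, df2)  # statistic and p-value
```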
If we allow only two distinct slopes, i.e. a PLF with only one knot, and choose its timing by changepoint analysis, not only does it confirm the statistical significance of the acceleration in the 1970s, it selects 1972 as the moment of slope change, when the warming rate changes from 0.0005 ± 0.0037 °C/year to 0.0206 ± 0.0016 °C/year.

[Figure: two-slope PLF fit with changepoint at 1972]
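In outline, the changepoint search can be done by brute force: fit the one-knot PLF at every candidate year and keep the knot that minimizes the residual sum of squares. A sketch re-using plf_fit from above (a full analysis must also account for this selection effect when quoting significance):

```python
import numpy as np

def best_single_knot(t, y, candidate_years):
    """Grid search for the one-knot PLF minimizing the residual sum of
    squares; uses plf_fit from the earlier sketch."""
    results = []
    for k in candidate_years:
        fitted, slopes = plf_fit(t, y, [k])
        results.append((np.sum((y - fitted) ** 2), k, slopes))
    rss, knot, slopes = min(results, key=lambda r: r[0])
    return knot, slopes
```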
Is there more slope change? We might find it if we reduce the noise level by removing some of the changes we know are not related to man-made climate change but are due to known natural factors, specifically volcanoes, the El Niño Southern Oscillation, and variations of solar output. Doing so by a method not that different from the one we used in Foster & Rahmstorf (2011), we get this:

[Figure: Berkeley GMST anomaly adjusted for volcanic, ENSO, and solar influences]
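In outline, the adjustment is a multiple regression of temperature on indices of the three natural factors; Foster & Rahmstorf (2011) also fit an optimal lag for each factor, which this simplified sketch omits. The covariate arrays (an ENSO index such as MEI, volcanic aerosol optical depth, total solar irradiance) are inputs the reader must supply:

```python
import numpy as np

def remove_natural(t, y, enso, volc, solar):
    """Regress temperature on natural-factor indices (plus a linear trend,
    so the factors don't absorb long-term warming), then subtract the
    fitted natural contributions. No lag optimization in this sketch."""
    X = np.column_stack([np.ones_like(t), t, enso, volc, solar])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    natural = X[:, 2:] @ coef[2:]  # fitted ENSO + volcanic + solar signal
    return y - natural             # adjusted temperature series
```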
When I apply to the adjusted data the same analysis that was used on the raw data, it gives:

[Figure: estimated warming rate for the adjusted data, lowess and PLF]
Now there are at least three distinct slopes, and two times at which the warming rate definitely changed, one early (around 1980) and another late (around 2010). The most recent warming rate is estimated at 0.0338 ± 0.0066 °C/year (0.338 ± 0.066 °C/decade).
Naturally we seek the best-fit PLF with 3 slopes (two slope changes); it selects 1974 and 2015 as the changepoint times, with the middle slope (what the warming rate used to be) at 0.0176 ± 0.0014 °C/year and the final slope (the warming rate now) at 0.0432 ± 0.0090 °C/year. I also re-computed the lowess smooth, using a 10-year time scale (rather than 15) to match the final leg of the PLF-changepoint-analysis journey:

[Figure: lowess smooth rate with a 10-year time scale]
I can also re-compute the PLF using regularly spaced knots every 10 years:

[Figure: PLF fit with knots every 10 years]
So, how fast is the world warming?
The given analyses estimate the rate at about 0.043 ± 0.01 °C/year (0.43 ± 0.1 °C/decade), and those are sound computations but they are predicated on the assumption (or should I say “presumption”) that the signal follows our model exactly. I recognize that the model is extremely useful — as indicated by how hard it is to deny it statistically — but the idea that global temperature is exactly following straight lines plus random noise is just not believable. It can do so approximately, very approximately even, but exactly? No.
When it comes to statistical models, no matter how useful they might be, George Box was right: they're all wrong. The uncertainty values we compute generally assume the model isn't just useful, it's right; for me that requires taking them with a grain of salt.
And a statistical model, especially one chosen to look for slope change, is hardly the only evidence we have of the present warming rate. There’s a lot of good reason to believe that the rate isn’t as high as 0.4°C/decade.
For my best estimate, I'll say the rate is 0.33 (+0.2/−0.1) °C/decade, i.e. between 0.23 and 0.53 with a best guess of 0.33.
I intend to re-write the final paragraph in order to emphasize that I don’t expect the world to continue warming at such a high rate and I don’t expect it to accelerate further. But I won’t rule out those possibilities. And I intend to retain the implication that we’ll hit 1.5°C a lot sooner than expected, because I think it’s true.
Essentially, keeping warming below 1.5°C is no longer possible; 1.5 is dead. Long live 2°C. But at the present rate we'll hit it sooner than we expected.