How AGW isn’t happening in the real Earth system … | Nov 15 | by okulaer
Specifically how is the AGW mechanism for global surface warming supposed to work? How is the global “ocean heat content (OHC)” supposed to be increasing under a strengthening “radiative greenhouse effect (rGHE)”?
By reducing the surface’s ability to cool via thermal radiation (IR).
Here’s the basic idea:
Assuming the mean solar input [Qin] stays the same and assuming changes in evaporative-convective losses [Qout ev] only ever come in the form of responses to preceding “greenhouse”-induced warming, that is, these losses stay constant until such warming occurs, then the only mechanism for warming (of surface and/or ocean bulk) is a reduction in surface radiative losses [Qout rad], i.e. in the ‘radiant heat loss’ or – same thing – the ‘net LWIR flux’ coming off the surface:
Balance: ΔQin = ΔQout ev + ΔQout rad ⇒ 0 = 0 + 0
Imbalance: ΔQin = ΔQout ev + ΔQout rad ⇒ 0 = 0 + (−1) = −1
When less heat goes out than what comes in, warming ensues. It’s that simple …
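The bookkeeping above can be sketched in a few lines of Python. The values are purely illustrative deltas in W/m², not measurements, and the function name is mine:

```python
# Illustrative sketch of the surface energy-balance bookkeeping above.
# Inputs are hypothetical *changes* (deltas) in each term, in W/m².

def surface_imbalance(dQ_in, dQ_out_ev, dQ_out_rad):
    """Change in total heat loss minus change in heat input.

    Negative result: less heat leaves than arrives -> warming.
    Positive result: more heat leaves than arrives -> cooling.
    """
    return (dQ_out_ev + dQ_out_rad) - dQ_in

# Balance: nothing changes, no imbalance.
print(surface_imbalance(0, 0, 0))    # 0

# AGW scenario: radiant heat loss reduced by 1 W/m² -> imbalance of -1 (warming).
print(surface_imbalance(0, 0, -1))   # -1
```

The sign convention simply mirrors the equations above: a reduced radiative loss with unchanged input and evaporative-convective losses yields a negative (warming) imbalance.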
This is the theory.
Now, do we see this AGW warming mechanism at work in the Earth system today? Can we observe it empirically? Can we follow in the available data the ongoing strengthening of the rGHE resulting from our continued fossil fuel emissions?
Not really.
In fact, we observe the exact opposite of what the theory above says should happen!
Here’s how the ‘net LWIR flux’ (the radiant heat loss) of the global surface of the Earth evolved from March 2000 to February 2015 according to the CERES EBAF-Surface Ed2.8 dataset:

Figure 1. (Absolute values.)
Since ‘net LW’ goes out of the surface, meaning it’s an energy loss, then the more negative its value, the greater the loss, and so the more efficiently the surface manages to rid itself of heat via thermal radiation.
As can be seen from the diagram in Fig.1 above, the mean radiant heat loss of the global surface of the Earth has increased substantially and robustly – by about 1.5 W/m² – over the last 15 years; that’s 0.1 W/m² per year on average.
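A trend figure of this kind is typically an ordinary least-squares slope fitted to the monthly series. Here is a minimal sketch of that computation on a synthetic stand-in series (not the actual CERES data), constructed with a built-in drift of about 0.1 W/m² per year:

```python
# Least-squares trend of a monthly flux series (synthetic stand-in, not CERES).
import numpy as np

months = np.arange(180)                  # 15 years of monthly samples
rng = np.random.default_rng(0)
# Synthetic 'net LW': a ~-53 W/m² loss growing more negative by ~0.1 W/m²/yr, plus noise.
flux = -53.0 - (0.1 / 12.0) * months + rng.normal(0, 0.5, months.size)

slope_per_month, intercept = np.polyfit(months, flux, 1)
trend_per_year = slope_per_month * 12.0
print(f"trend ~ {trend_per_year:.2f} W/m² per year")
```

A negative fitted slope on the (negative-valued) ‘net LW’ series is exactly an increasing radiant heat loss in the sense used above.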
In other words, there is absolutely no positive Qout rad contribution to the global surface and bulk ocean energy budget to be spotted in the 21st century. Rather, the IR contribution is decidedly negative!
The imbalance goes the ‘wrong’ way:
ΔQin = ΔQout ev + ΔQout rad ⇒ 0 = 0 + 1 = +1
More is going out than what’s coming in. The atmospheric ‘IR insulation’ is weakening!
And so we should have been cooling …
But, wait. We’re not. So where’s the problem in all this?
The other two ΔQs in the equation above, both assumed to be zero, probably aren’t.
Which means: Any accumulation of energy at or below the global surface of the Earth is caused by processes other than a hypothetically strengthening rGHE.
Because it evidently isn’t …
So what about atmospheric DWLWIR to the global surface, the infamous “back radiation flux”? Perhaps it has increased as predicted to induce surface warming?
I’m afraid not. Not according to CERES:

Figure 2. (Anomalies.)
It’s gone down. The ‘net LW’ is, after all, conceptually thought to be made up of a downwelling and an upwelling component. If the upwelling component stays unchanged while the downwelling component, the “back radiation”, increases, then the ‘net’ (the radiant heat loss) will decrease as a result. That’s the idea. Well, we now know that the ‘net’ hasn’t decreased at all; it has increased. And part of the reason why can be seen in Fig.2 above: The downwelling component hasn’t increased; it has decreased.
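The decomposition just described (net loss = upwelling minus downwelling) can be sketched directly. The flux values below are made up for illustration, merely chosen to be in the rough range of global-mean surface LW fluxes:

```python
# Net LW loss as upwelling minus downwelling LWIR, per the decomposition above.
# All numbers are illustrative, in W/m².

def net_lw_loss(up_lwir, down_lwir):
    """Radiant heat loss from the surface: upwelling minus downwelling ('back radiation')."""
    return up_lwir - down_lwir

baseline = net_lw_loss(398.0, 345.0)      # 53.0 W/m² net loss
# Enhanced-rGHE expectation: 'back radiation' up, upwelling unchanged -> net loss shrinks:
enhanced_ghe = net_lw_loss(398.0, 347.0)  # 51.0 W/m²
# What the post argues CERES shows instead: downwelling down -> net loss grows:
observed = net_lw_loss(398.0, 343.5)      # 54.5 W/m²
print(baseline, enhanced_ghe, observed)
```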
Funny, isn’t it?
OK, what about the solar input to the global surface, then? The one that was supposed to stay constant over time:

Figure 3. (Anomalies.)
Not quite constant, is it? Despite the relatively weak solar cycle we’re in, compared to the previous one, the actual absorbed solar heat flux at the global surface of the Earth has increased somewhat in intensity. Not by much, but still a little bit. (Note the significantly raised mean level of the last 4-5 years.)
Something tells me that the variable global cloud cover has a hand in all this …
Finally, what about the global OLR at the ToA, and how does it compare to tropospheric temps over the last 15 years?

Figure 4. (Anomalies.)
Seeing how the OLR at the ToA (Earth’s final heat loss to space) is normally simply a radiative effect of tropospheric temps (unless major cloud anomalies appear, as happened in 2008 and 2010, both times ENSO-induced), this diagram exhibits just that connection, and no signal pointing to any strengthening rGHE is to be found. The OLR simply follows the tlt (with a mean lag over the period of 0–1 months). The overall tlt trend is flat, and – as a natural result – so is the OLR one.
If an enhanced “greenhouse effect” were to be inferred from the data in Fig.4, the trend of the blue tropospheric temp curve would have to rise over the period while the trend of the red OLR curve remained flat. That is how the AGW mechanism is supposed to work, after all, through the elevation of Earth’s ‘effective radiating level (ERL)’. The ERL always stays at the same temperature, but keeps rising as the amount of IR-active gases like CO2 in the atmosphere goes up, and so all layers below the rising ERL have to warm to maintain (or try to restore) the heat balance. Once again, that’s the theory.
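For scale, the fixed temperature the ERL is held at follows from the Stefan–Boltzmann law: a quick sketch, taking the commonly quoted global-mean OLR of roughly 240 W/m² as the input:

```python
# Effective radiating temperature implied by a given OLR (Stefan-Boltzmann law).
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def t_eff(olr):
    """Temperature of a blackbody emitting 'olr' W/m²."""
    return (olr / SIGMA) ** 0.25

# ~240 W/m² of OLR corresponds to ~255 K; in the theory, the ERL rises
# while staying at this temperature, forcing the layers below it to warm.
print(round(t_eff(240.0), 1))  # ~255 K
```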
And once again, this evidently isn’t happening in the real Earth system. The OLR trend remains flat because the tlt trend does so. The causal direction is tlt > OLR, not OLR > tlt, as the hypothetical mechanism would demand.
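The 0–1 month lag mentioned above is the sort of thing one can check with a simple lagged-correlation scan. Here is a minimal sketch on synthetic series standing in for the tlt and OLR anomalies (the real check would of course use the RSS/UAH tlt and CERES OLR data):

```python
# Find the lag (in months) at which 'OLR' best correlates with 'tlt'.
import numpy as np

rng = np.random.default_rng(1)
n = 180
tlt = np.cumsum(rng.normal(0, 0.1, n))           # synthetic temperature anomalies
olr = np.roll(tlt, 1) + rng.normal(0, 0.05, n)   # OLR built to lag tlt by 1 month

def lagged_corr(x, y, lag):
    """Correlation of x(t) with y(t + lag); positive lag means y trails x."""
    if lag > 0:
        return np.corrcoef(x[:-lag], y[lag:])[0, 1]
    return np.corrcoef(x, y)[0, 1]

best = max(range(0, 6), key=lambda lag: lagged_corr(tlt, olr, lag))
print(f"best lag: {best} month(s)")
```

If tlt drives OLR, the correlation should peak at a small positive lag, as it does here by construction; the reverse causal direction would put the peak at a negative lag.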
So why do mainstream climate scientists continue to meticulously avoid discussing these data in the direct context of their own warming hypothesis? Why don’t they dare address these specific issues? Theory vs. reality …