Sea Level Model Fail

From a post by Arnout Jaspers at clintel.org.

In 2016, a headline-grabbing study said that the Antarctic ice sheet is much more unstable than [previously] estimated. As a consequence, projected sea level rise in 2100 would double. Computer modelers from Deltares and KNMI scrambled back to their drawing boards to upgrade their prognoses for the Dutch coastline. Have these prognoses survived the climate news cycle?

Rarely has a scientific publication, whose authors said that the results ‘should not be considered actual predictions’, had such a profound impact outside science.

The model was unrealistic and made little sense, but it was nonetheless splattered across the media in the Netherlands and elsewhere, and the Dutch were told that, because of sea level rise, their children would not even have a country when they grew up.

It was refuted by a 2018 paper in Science, which showed that the base of the Antarctic ice sheet is much more durable than the earlier paper assumed. Even in the unrealistic RCP8.5 IPCC scenario, the sea level rise due to melting Antarctic ice was modest. Of course, this later paper was ignored by most of the media. As noted in the clintel.org post:

Nevertheless, the extreme two to three metres of sea level rise in 2100 from the Deltares report is still reigning free in the public debate. This is a recurrent phenomenon in climate reporting: the most extreme prediction is readily accepted and promoted as the new normal, which can no longer be falsified by later research. And then, it is a matter of waiting for the next level extreme prediction that the media will promote to being the new normal.

As usual, the media make wild extremes seem real and ignore reality. The full post can be seen here.

How constant is the “solar constant”?

The IPCC lowered their estimate of the impact of solar variability on the Earth’s climate from the already low value of 0.12 W/m2 (Watts per square meter) given in their fourth report (AR4) to a still lower value of 0.05 W/m2 in the 2013 fifth report (AR5); the new value is illustrated in Figure 1. These are long-term values, estimated for the 261-year period 1750-2011, and they apply to the “baseline” of the Schwabe ~11-year solar (or sunspot) cycle, which we will simply call the “solar cycle” in this post. The baseline of the solar cycle is the issue, since the peaks are known to vary. The Sun’s output (total solar irradiance, or “TSI”) is known to vary at all time scales (Kopp 2016); the question is by how much. The magnitude of short-term changes in solar output, those lasting less than 11 years, is known relatively accurately, to better than ±0.1 W/m2, but the magnitude of solar variability over longer periods of time is poorly understood. Yet small changes in solar output over long periods of time can affect the Earth’s climate in significant ways (Eddy 1976) and (Eddy 2009). In John Eddy’s classic 1976 paper on the Maunder Minimum, he writes:

Continue reading

Can we predict long-term solar variability?

By Andy May

This post is a result of an online conversation with Dr. Leif Svalgaard, a research physicist at Stanford University. Leif knows a great deal about the Sun and solar variability and can explain it clearly. Our disagreement is over whether long-term solar variations could be large enough to affect Earth’s climate more than changes due to humans. Thus, we are arguing about the relative magnitudes of two sources of radiative forcing (RF), neither of which is known accurately. The IPCC estimates that the total RF due to humans since 1750 is 2.3 W/m2 (IPCC, 2013, p. 696). This number is unverifiable and likely exaggerated, but we can accept it for the sake of this argument.
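To make the comparison of the two forcings concrete, a change in TSI can be converted to a global-mean radiative forcing using the standard geometric relation RF = ΔTSI × (1 − albedo) / 4. A minimal sketch of that conversion follows; the albedo value and the illustrative ΔTSI figures are my assumptions for the example, not numbers from the post:

```python
# Convert a change in total solar irradiance (TSI) to a global-mean
# radiative forcing, for comparison with the IPCC's 2.3 W/m^2
# anthropogenic estimate. The factor of 4 spreads the intercepted
# solar beam over the whole sphere, and (1 - albedo) removes the
# fraction reflected straight back to space.

ALBEDO = 0.3  # assumed planetary (Bond) albedo

def tsi_change_to_forcing(delta_tsi_wm2: float, albedo: float = ALBEDO) -> float:
    """Global-mean radiative forcing (W/m^2) from a TSI change (W/m^2)."""
    return delta_tsi_wm2 * (1.0 - albedo) / 4.0

if __name__ == "__main__":
    anthropogenic_rf = 2.3  # IPCC AR5 estimate since 1750, W/m^2
    for dtsi in (0.5, 1.0, 4.0):  # illustrative long-term TSI changes
        rf = tsi_change_to_forcing(dtsi)
        print(f"dTSI = {dtsi:4.1f} W/m^2 -> RF = {rf:.3f} W/m^2 "
              f"({rf / anthropogenic_rf:.1%} of anthropogenic RF)")
```

As the sketch shows, a sustained TSI change of a few W/m2 would produce a forcing of the same order as the IPCC’s anthropogenic estimate, which is why the size of long-term solar variability matters to this debate.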

I’ve written on this topic before here. This post is an update and will not cover the same ground. Some readers will want to read the first post before this one.

Continue reading

Climate Sensitivity to CO2, what do we know? Part 1.

The first version of this post had an error in Figure 1. It has been fixed along with the associated text (7/5/2021).

By Andy May

The IPCC claims, in their AR5 report, that ECS, the “Equilibrium Climate Sensitivity,” or the long-term temperature change due to doubling the atmospheric CO2 concentration, likely lies between 1.5° and 4.5°C, and they provide no best estimate (IPCC, 2013, p. 85). Their average model-computed ECS, however, is 3.2°C/2xCO2, where “°C/2xCO2” means the temperature change due to a doubling of CO2. They also claim that it is extremely unlikely to be less than 1°C. ECS takes a long time, hundreds of years, to be reached, so it is unlikely to be observed or measured in nature. A more appropriate measure of climate sensitivity is TCR, the transient climate response (or sensitivity). TCR can be seen less than 100 years after the CO2 increase; the IPCC claims this value likely lies between 1° and 2.5°C/2xCO2, and their model-computed average is 1.8°C/2xCO2 (IPCC, 2013, p. 818).

The CO2 climate forcing, or the net change in radiation retained by Earth’s atmosphere in these scenarios, is 3.7 W/m2 (IPCC, 2007b, p. 140). Using these values, we can calculate a surface air temperature sensitivity to radiative forcing (RF) of 1.8/3.7 = 0.49°C per W/m2. These values include all model-calculated feedbacks.
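The arithmetic above can be laid out explicitly. A small sketch, using the IPCC values quoted in the text (the variable and function names are mine, for illustration only):

```python
# Reproduce the sensitivity arithmetic from the text: dividing the
# transient climate response (TCR) by the forcing from one doubling of
# CO2 gives the surface air temperature sensitivity to radiative
# forcing, and log2 of a concentration ratio converts any CO2 change
# into a number of "doublings."
import math

TCR = 1.8        # deg C per doubling of CO2 (IPCC AR5 model average)
F_2XCO2 = 3.7    # radiative forcing of one CO2 doubling, W/m^2

sats = TCR / F_2XCO2  # deg C per W/m^2
print(f"SATS = {sats:.2f} deg C per W/m^2")  # about 0.49

def transient_warming(co2_start_ppm: float, co2_end_ppm: float) -> float:
    """Transient warming implied by a CO2 change, in deg C."""
    doublings = math.log2(co2_end_ppm / co2_start_ppm)
    return TCR * doublings

# e.g. pre-industrial 280 ppm to 560 ppm is exactly one doubling:
print(f"280 -> 560 ppm: {transient_warming(280, 560):.1f} deg C")
```

The logarithmic form reflects the standard result that each successive doubling of CO2 adds roughly the same forcing, so warming scales with the log of the concentration ratio rather than the ratio itself.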

Continue reading

Climate Sensitivity to CO2, what do we know? Part 2.

By Andy May

In Part 1, we introduced the concepts of climate sensitivity to CO2, often called ECS or TCR. The IPCC prefers a TCR of about 1.8°C/2xCO2 (IPCC, 2013, p. 818). TCR is the short-term, century-scale response of surface temperature to a doubling of CO2; we abbreviate the units as “°C/2xCO2.” In these posts we review lower estimates of climate sensitivity, those below 1°C/2xCO2. In parallel, we also review estimates of the surface air temperature sensitivity (SATS) to radiative forcing (RF); the units are °C per W/m2 (Watts per square meter). The IPCC estimates this value to be ~0.49°C per W/m2.

The previous post discussed two modern climate sensitivity estimates, by Richard Lindzen and Willie Soon, that range below 1°C/2xCO2. Next, we review climate sensitivity estimates by Sherwood Idso, Reginald Newell and their colleagues.

Continue reading

The problem with climate models

By Andy May

In my last post, on Scafetta’s new millennial temperature reconstruction, I included the following sentence that caused a lot of controversy and discussion in the comments:

“The model shown uses a computed anthropogenic input based on the CMIP5 models, but while they use an assumed climate sensitivity to CO2 (ECS) of ~3°C, Scafetta uses 1.5°C/2xCO2 to accommodate his estimate of natural forcings.”

Continue reading

Clouds and Global Warming

Figures 4 and 5 have been updated to correct a programming error (7/5/2021).

By Andy May

This post is inspired by an old post on the CERES cloud data by Willis Eschenbach that I have read and re-read many times, “Estimating Cloud Feedback Using CERES Data.” The reason for my interest is that I had trouble understanding it, but it looked fascinating because Willis was comparing CERES measured cloud data to IPCC modeled cloud feedback. I love obscure, back-alley comparisons of models and data. They tend to show model weaknesses. I learned this as a petrophysical modeler.

Continue reading

Earth’s Ice Ages

By Andy May

The phrase “Ice Age” is poorly defined and often abused, so let’s first define the climate state during most ice ages: “Icehouse Earth.” The Earth is in an icehouse state when either or both poles are covered by a thick, permanent icecap (Scotese 2015). Today, both poles are covered in ice year-round, so you may be surprised to learn that this is very rare in Earth’s history. In fact, out of the last 550 million years, the Earth has had permanent ice caps on one or both poles only nine percent of the time.

An “Ice Age” is best defined as a geologically long (millions of years) period of low temperatures. It usually results in continental and polar ice sheets and alpine glaciers. We are currently living in the Quaternary Ice Age, only the fifth significant and severe ice age in Earth’s known history; so far it has lasted about 2.6 million years (though, technically, permanent ice first appeared on Antarctica more than 30 million years ago). It is the most severe ice age in the Phanerozoic, the geological name for the past 550 million years. Ice ages are rare, but humans evolved during one, so it seems normal to us.


Figure 1. Christopher Scotese’s geological interpretation of Phanerozoic global temperatures in degrees C. The vertical line on the right side, labeled “PAW,” is a projection of possible anthropogenic warming according to a pessimistic IPCC climate model. In 2016 the actual global average surface temperature of the Earth was about 14.5 degrees C, as marked on the plot; in 2019 the temperature was slightly lower, 14.35 degrees, according to NASA GISS. The names of the major ice ages were added by the author. After (Scotese 2015).

Continue reading

Facts and Theories, Updated

By Andy May

In 2016, I published a post entitled “Facts and Theories.” It has been one of my most popular posts and often reblogged. I updated the post extensively for my new book, Politics and Climate Change: A History. This post is a condensed version of what is in the book.

Sometimes people ask climate skeptics if they believe in evolution or gravity. They want to ridicule our skepticism by equating human-caused, aka anthropogenic, climate change with evolution or gravity. But evolution and gravity are facts, while anthropogenic climate change is a hypothesis. Equating generic “climate change” with gravity or evolution is valid, as all three are facts: climate changes, gravity holds us to Earth’s surface, and species evolve.

Continue reading

How to compare today to the past

By Andy May

In the last post, I discussed the problems with comparing modern instrumental global or hemispheric average temperatures to the past. Ocean temperature coverage was sparse and of poor quality prior to 2005. Prior to 1950, measurements on land (29% of the surface) were also sparse and of poor quality. Only proxy temperatures are available before thermometers were invented, and these, again, are sparse and poorly calibrated prior to 1600. So how can we compare modern temperatures to the distant past? We cannot do it globally or hemispherically; the past data are too poor, too sparse, or both. Why not instead pick the best proxies and compute comparable modern temperatures at the specific proxy locations? It is easier to reduce resolution than to increase it.
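The approach suggested here, sampling the modern record at the proxy sites rather than averaging globally, amounts to a nearest-gridpoint lookup. A minimal sketch of that idea follows; the grid, the proxy coordinates, and all names are hypothetical and made up for illustration, not taken from any actual dataset:

```python
# Sketch of comparing modern gridded temperatures to proxy records at
# the proxy locations: for each proxy site, pick the nearest grid cell
# of the modern field, instead of comparing the proxy to a global mean.
# All data below are synthetic, for illustration only.
import numpy as np

# Hypothetical 10 x 20 lat/lon grid of modern mean temperatures (deg C)
rng = np.random.default_rng(0)
lats = np.linspace(-85, 85, 10)
lons = np.linspace(-175, 175, 20)
modern_temps = 15 + 10 * rng.standard_normal((lats.size, lons.size))

def temp_at_site(site_lat: float, site_lon: float) -> float:
    """Modern temperature at the grid cell nearest a proxy site."""
    i = int(np.abs(lats - site_lat).argmin())   # nearest latitude row
    j = int(np.abs(lons - site_lon).argmin())   # nearest longitude column
    return float(modern_temps[i, j])

# Hypothetical proxy sites (lat, lon); compare each proxy series to the
# local modern value rather than to a hemispheric or global average.
proxy_sites = [(64.0, -21.0), (-2.5, 36.0)]
for lat, lon in proxy_sites:
    print(f"site ({lat:6.1f}, {lon:6.1f}) -> "
          f"modern T = {temp_at_site(lat, lon):.1f} deg C")
```

A real comparison would also average the modern series over the same season and time window the proxy records, but the local-lookup step is the core of the "compare at the proxy locations" idea.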

Continue reading