The Paleocene-Eocene Thermal Maximum or PETM

By Andy May

The PETM, or Paleocene-Eocene Thermal Maximum, was a warm period that began between 56.3 and 55.9 Ma (million years ago). The IPCC AR6 report (actually a draft, not a final edited report), released to the public on August 9, 2021, suggests that this warm period is similar to what is happening today and to what they expect to happen in the future (IPCC, 2021, pp. 2-82 & 5-14). During the PETM, average global surface temperatures probably peaked briefly between 25.5°C and 26°C, compared to a global average surface temperature of about 14.5°C today, as shown in Figure 1.

Figure 1. The estimated global average surface temperature for the past 150 million years. Modified from: Christopher Scotese, the Paleomap Project, link.
Continue reading

Holocene Antarctic CO2 Variability or Lack Of

Guest Post by Renee Hannon

Introduction

This post examines CO2 ice core measurements from Antarctica during the Holocene Epoch. The key CO2 dataset for paleoclimate studies is the EPICA Dome C (EDC) data, also known as Dome Charlie or Dome Concordia. Dome C is located on the eastern Antarctic Plateau, one of the coldest places on Earth and our planet’s largest frozen desert. The average air temperature is -54.5°C, and there is little to no precipitation throughout the year. CO2 measurements from the unique conditions at Dome C are compared to those from other Antarctic ice cores.

Continue reading

Can we predict long-term solar variability?

By Andy May

This post is a result of an online conversation with Dr. Leif Svalgaard, a research physicist at Stanford University. Leif knows a great deal about the Sun and solar variability and can explain it clearly. Our disagreement is over whether long-term solar variations could be large enough to affect Earth’s climate more than changes due to humans. Thus, we are arguing about the relative magnitudes of two sources of radiative forcing (RF), neither of which is known accurately. The IPCC estimates that the total RF due to humans since 1750 is 2.3 W/m2 (IPCC, 2013, p. 696). This number is unverifiable and likely exaggerated, but we can accept it for the sake of this argument.

I’ve written on this topic before here. This post is an update and will not cover the same ground. Some readers will want to read the first post before this one.

Continue reading

Sea Level Model Fail

From a post by Arnout Jaspers at clintel.org.

In 2016, a headline-grabbing study said that the Antarctic ice sheet is much more unstable than [previously] estimated. As a consequence, projected sea level rise in 2100 would double. Computer modelers from Deltares and KNMI scrambled back to their drawing boards to upgrade their prognoses for the Dutch coastline. Have these prognoses survived the climate news cycle?

Rarely has a scientific publication, whose authors said that the results ‘should not be considered actual predictions’, had such a profound impact outside science.

The model was unrealistic and made little sense but was nonetheless splattered across the media in The Netherlands and elsewhere, and the Dutch were told their children would not even have a country when they grew up due to sea level rise.

The study was refuted by a 2018 paper in Science, which showed that the base of the Antarctic ice sheet is much more durable than assumed in the earlier paper. Even in the unrealistic RCP8.5 IPCC scenario, the sea level rise due to melting Antarctic ice was modest. Of course, this later paper was ignored by most of the media. As noted in the clintel.org post:

Nevertheless, the extreme two to three metres of sea level rise in 2100 from the Deltares report is still reigning free in the public debate. This is a recurrent phenomenon in climate reporting: the most extreme prediction is readily accepted and promoted as the new normal, which can no longer be falsified by later research. And then, it is a matter of waiting for the next level extreme prediction that the media will promote to being the new normal.

As usual, the media makes wild extremes seem real and ignores reality. The full post can be seen here.

Climate Sensitivity to CO2, what do we know? Part 2.

By Andy May

In Part 1, we introduced the concepts of climate sensitivity to CO2, often called ECS or TCR. The IPCC prefers a TCR of about 1.8°C/2xCO2 (IPCC, 2013, p. 818). TCR is the short-term, century-scale response of surface temperature to a doubling of CO2; we abbreviate the units as “°C/2xCO2.” In these posts we review lower estimates of climate sensitivity, estimates below 1°C/2xCO2. In parallel, we also review estimates of the surface air temperature sensitivity (SATS) to radiative forcing (RF); the units are °C per W/m2 (watts per square meter). The IPCC estimates this value to be ~0.49°C per W/m2.
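
As a quick check on the unit conversion, here is a minimal Python sketch (my illustration, not from the post) that turns the IPCC’s preferred TCR of 1.8°C/2xCO2 into the equivalent sensitivity per unit of forcing, using the 3.7 W/m2 doubling forcing discussed in Part 1:

```python
# Minimal sketch: relating the two sensitivity units used in these posts.
# A TCR quoted in deg C per CO2 doubling becomes a sensitivity to radiative
# forcing when divided by the forcing of one doubling (~3.7 W/m2).
TCR_PER_DOUBLING = 1.8       # deg C per 2xCO2 (IPCC AR5 model average)
FORCING_PER_DOUBLING = 3.7   # W/m2 for a doubling of CO2

sats = TCR_PER_DOUBLING / FORCING_PER_DOUBLING
print(f"SATS = {sats:.2f} deg C per W/m2")  # about 0.49
```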

The previous post discussed two modern climate sensitivity estimates, by Richard Lindzen and Willie Soon, that range below 1°C/2xCO2. Next, we review climate sensitivity estimates by Sherwood Idso, Reginald Newell and their colleagues.

Continue reading

Climate Sensitivity to CO2, what do we know? Part 1.

The first version of this post had an error in Figure 1. It has been fixed along with the associated text (7/5/2021).

By Andy May

The IPCC claims, in their AR5 report, that ECS, the “Equilibrium Climate Sensitivity” or long-term temperature change due to doubling the atmospheric CO2 concentration, likely lies between 1.5° and 4.5°C, and they provide no best estimate (IPCC, 2013, p. 85). But their average model-computed ECS is 3.2°C/2xCO2. Here, “°C/2xCO2” is the temperature change due to a doubling of CO2. They also claim that it is extremely unlikely to be less than 1°C. ECS takes hundreds of years to be reached, so it is unlikely to be observed or measured in nature. A more appropriate measure of climate sensitivity is TCR, the transient climate response (or sensitivity). TCR can be seen less than 100 years after the CO2 increase; the IPCC claims this value likely lies between 1° and 2.5°C/2xCO2, and their model-computed average is 1.8°C/2xCO2 (IPCC, 2013, p. 818).

The CO2 climate forcing, or the net change in radiation retained by Earth’s atmosphere due to a doubling of CO2, is 3.7 W/m2 (IPCC, 2007b, p. 140). Using these values, we can calculate a surface air temperature sensitivity to radiative forcing (RF) of 1.8/3.7 = 0.49°C per W/m2. These values are inclusive of all model-calculated feedbacks.
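
For readers who want to reproduce the 3.7 W/m2 figure and see what the 0.49°C per W/m2 sensitivity implies, here is a hedged Python sketch; the simplified forcing expression ΔF = 5.35 ln(C/C0) (Myhre et al., 1998), the 280 ppm pre-industrial baseline, and the 415 ppm present-day concentration are standard values I have assumed, not numbers taken from this post:

```python
import math

# Sketch of the numbers in the text, plus one assumed ingredient:
# the simplified CO2 forcing expression dF = 5.35 * ln(C/C0)
# (Myhre et al., 1998), with a 280 ppm pre-industrial baseline.

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Approximate radiative forcing (W/m2) from raising CO2 to c_ppm."""
    return 5.35 * math.log(c_ppm / c0_ppm)

sats = 1.8 / 3.7                            # deg C per W/m2, as in the text
print(round(co2_forcing(560.0), 2))         # ~3.7 W/m2 for a doubling
print(round(sats * co2_forcing(415.0), 2))  # ~1.0 deg C at ~415 ppm today
```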

Continue reading

Greenland Ice Core CO2 during the past 1,000 years

Guest Post by Renee Hannon

Introduction

This post compares CO2 ice core measurements from Greenland to those from Antarctica over the last millennium. Paleoclimate studies typically use only Antarctic ice cores to evaluate past CO2 fluctuations. This is because the Greenland CO2 datasets were deemed unreliable, due to chemical reactions with impurities in the ice, and therefore have not been used in studies since the late 1990s. This post will demonstrate that CO2 data from Greenland ice cores have scientific value and respond to key paleoclimate events such as the Little Ice Age and the Medieval Warm Period.

Continue reading

Science, Philosophy and Politics

By Andy May

Greg Weiner has written a great essay in Law & Liberty, entitled: “Why We Cannot Just ‘Follow the Science.’” His point is that scientists and science are important, but relying only on “The Science” for decision making is both dangerous and foolish. Science is a methodology for proposing well-developed answers to questions about natural events; it is a tool for proposing answers, not the answer. Skeptical scientists will try to disprove any proposed answer or theory; it is their duty. Only the very best theories survive this process and gain universal acceptance, like Einstein’s theory of relativity. For further discussion of this idea, see my discussion of facts and theories here. Most proposed answers, like man-made climate change, are furiously debated. So, it is fair to ask, as Weiner does, “Which [scientific] experts should we listen to …?”

The phrase “follow the science” is a way to duck responsibility. In Weiner’s words: “The slogan ‘follow the science’ is meant to exempt politicians from the duty of judgment.” Moral and political judgment must superintend science. It seems likely that the virus that causes COVID-19 was engineered in the Wuhan Virology lab. The U.S. NIAID, led by Dr. Fauci, supported this research. Was that money, distributed by a government scientist, wise or moral? Would the public or politicians have approved of sending money to a Chinese laboratory, connected to the Chinese military, to conduct research on a deadly virus? Was “following the science” wise in that case? I think not.

Scientists should not be making critical decisions; they should be in advisory roles and carefully supervised by elected political leaders, not unelected bureaucrats. Recently, Nature revealed that critical early SARS-CoV-2 (the COVID-19 virus) gene sequences were removed from a U.S. government database, at the request of Wuhan University scientific researchers. The gene sequences contained valuable information that shows the early viral sequences from the Wuhan seafood market are more distantly related to SARS-CoV-2’s closest relatives in bats than later sequences found in humans from China and the U.S. This makes it less likely the market is the source of the first human infection.

Dr. Jesse Bloom discovered that the sequences had been deleted and managed to recover them from archives. He says there is no plausible reason for the deletions and suspects they were made to obscure the sequences’ existence. Bloom also believes we should be skeptical that all early Wuhan sequences have been shared.

The amino acid arginine is typically used in laboratories to “supercharge” a virus and make it more communicable and deadly. Arginine is coded by six different codons, so a pair of arginines can be written into the genome in 36 different ways. SARS-CoV-2 contains the sequence CGG-CGG, or “double CGG.” The same lethality is achieved with any of the 36 codon pairs at this site, but the double CGG combination is the least likely to occur in nature. In fact, the double CGG sequence has never been found in naturally occurring coronaviruses. A virus naturally acquiring a new skill will usually pick it up from similar viruses, yet no similar viruses have this combination; it was almost certainly the result of engineering.
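
To make the codon arithmetic concrete, the short Python sketch below (my illustration, not from the post) enumerates the possible codon pairs for a double arginine using the standard genetic code; the six arginine codons are textbook values, written here as RNA codons (DNA uses T in place of U):

```python
from itertools import product

# Sketch: the standard genetic code assigns six codons to arginine,
# so a pair of adjacent arginines can be written 6 * 6 = 36 ways.
ARGININE_CODONS = ["CGU", "CGC", "CGA", "CGG", "AGA", "AGG"]

pairs = [a + "-" + b for a, b in product(ARGININE_CODONS, repeat=2)]
print(len(pairs))          # 36 possible codon pairs
print("CGG-CGG" in pairs)  # True: the pair reported in SARS-CoV-2
```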

While it is clear that politics has corrupted climate science, it appears the reverse is also true: science has corrupted politics. In the words of Jon Stewart, on Stephen Colbert’s show recently:

“Here’s how the world ends, the last words man utters are somewhere in a lab. A guy goes, ‘Huhuh, it worked.'”

Some pundits have tried to defend scientists and experts, but most of us should remember what Richard Feynman once said,

“Science is the belief in the ignorance of experts.”

Science is scientific debate, with rules. Science is not wise words spoken from on high. Science, properly done, is so professionally researched and explained as to be self-evident. Once the paper is read and the data obtained and checked, anyone with the necessary skills can reproduce the author’s result and convince themselves that what the author said was true. “The Science” is not truth; it is a process that results in truth, if done properly. If no one can reproduce the result, if the underlying data aren’t available to the public, if the methodology is not clear, it is not a proper scientific theory; it is not “science.” Don’t follow it.

How to compare today to the past

By Andy May

In the last post, I discussed the problems with comparing modern instrumental global or hemispheric average temperatures to the past. Ocean temperature coverage was sparse and of poor quality prior to 2005. Prior to 1950, land measurements (29% of the surface) were also sparse and of poor quality. Only proxy temperatures are available from before thermometers were invented, and, again, these are sparse and poorly calibrated prior to 1600. So how can we compare modern temperatures to the distant past? We can’t do it globally or hemispherically; the past data are too poor, too sparse, or both. Why not pick the best proxies and compute comparable modern temperatures at the specific proxy locations? It is easier to lessen resolution than to increase it.
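
As a rough sketch of what lessening resolution could look like in practice (my own illustration, with synthetic data and an assumed 50-year bin width, not the method from the post), the snippet below averages a monthly instrumental series into coarse bins so it can be compared with a low-resolution proxy at the same location:

```python
import numpy as np
import pandas as pd

# Sketch: reduce a monthly instrumental record (synthetic here) to the
# coarse time step of a proxy record so the two can be compared directly.
rng = np.random.default_rng(0)
months = pd.date_range("1900-01", "2020-12", freq="MS")
temps = pd.Series(15.0 + 0.01 * np.arange(months.size) / 12
                  + rng.normal(0, 0.5, months.size), index=months)

# Average into 50-year bins, roughly matching a low-resolution proxy.
bins = (months.year // 50) * 50
coarse = temps.groupby(bins).mean()
print(coarse.round(2))
```

In practice, the bin width would be set by the resolution of the proxy itself.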

Continue reading

Global Warming is happening, what does it mean?

By Andy May

The concepts and data used to make temperature and climate reconstructions, or estimates, are constantly evolving. Currently, there are over 100,000 weather stations on land worldwide and over 4,500 Argo floats and weather buoys at sea. This is in addition to regular measurements by satellites and ships at sea. The measurement locations are known accurately, the date and time of each measurement is known, and the instruments are mostly accurate to ±0.5°C or better. Thus, we can calculate a reasonable global average surface temperature. However, the farther we go into the past, the fewer measurements we have. Prior to 2005, the sea-surface measurements deteriorate quickly, and prior to 1950 the land-based weather station network is quite poor, especially in the Southern Hemisphere. Before 1850, the coverage is so poor as to be unusable for estimating a global average temperature. Prior to 1714, the calibrated thermometer had not even been invented; the world had to wait for Gabriel Fahrenheit.
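
As an illustration of one step in calculating a global average from such a network (a sketch under my own assumptions, not the method of any particular temperature group), gridded anomalies are usually weighted by the cosine of latitude so that small polar grid cells do not count as much as large tropical ones:

```python
import numpy as np

# Sketch: area-weighted global mean of a latitude-by-longitude grid of
# temperature anomalies. The 5-degree grid and synthetic data are assumptions;
# real products also handle missing cells and station-to-grid averaging.
lats = np.arange(-87.5, 90, 5.0)          # grid cell center latitudes
lons = np.arange(-177.5, 180, 5.0)        # grid cell center longitudes
anomalies = np.random.default_rng(1).normal(0.3, 0.8, (lats.size, lons.size))

weights = np.cos(np.deg2rad(lats))[:, None] * np.ones(lons.size)
global_mean = np.average(anomalies, weights=weights)
print(f"Area-weighted global mean anomaly: {global_mean:.2f} C")
```

Real temperature products also have to deal with missing cells, homogenization, and sea-surface data, which is where the coverage problems described above come in.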

Is the global average temperature a useful climate metric? How do we compare today’s global or hemispheric temperature to the past? Modern instruments covering the globe have only been in place since 2005, plus or minus a few years. If we have accurate measurements since 2005, how do we compare them to global temperatures hundreds or thousands of years ago? The world has warmed 0.8°C since 1950 and 0.3°C since 2005; that doesn’t sound very scary. This two-part series will investigate these questions. We will propose a solution to the problem in the second post, which should be up in a day or two.

Continue reading