Global Climate Models

By Andy May

Global Climate Models (GCMs) are used to compute the social cost of carbon dioxide emissions and to estimate man’s contribution to recent global warming. The assertion that most of “climate change” is due to man’s influence is based solely on these models, as is the conclusion that “climate change” is dangerous. Just how accurate are the models? How close are their predictions to observations?

Dr. Judith Curry has written an important white paper for the layman describing how the models work. It is easy to understand and well worth reading.

Her key conclusions:

GCMs have not been subject to the rigorous verification and validation that is the norm for engineering and regulatory science.

There are numerous arguments supporting the conclusion that climate models are not fit for the purpose of identifying with high confidence the proportion of the 20th century warming that was human-caused as opposed to natural.

There is growing evidence that climate models predict too much warming from increased atmospheric carbon dioxide.

Some portions of the GCMs are rooted in fundamental physics and chemistry, but thousands of atmospheric and surface processes cannot be deterministically modeled and must be “parameterized” using simple empirical formulas based on observations. These empirical formulas are “tuned” or “calibrated” to make the models match observations. They are tweaked to match the twentieth century, especially the warming period from 1945 to 2000. Even with all of the tuning, the models do a very poor job of matching the warming from 1910 to 1945. A toy illustration of this sort of tuning is sketched below.
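To make the idea of tuning concrete, here is a minimal sketch (nothing like a real GCM) of calibrating a single free parameter in a zero-dimensional energy balance model so the model matches a target temperature series. The forcing history, the “observations,” and all the parameter values are synthetic and purely illustrative.

```python
import numpy as np

# Zero-dimensional energy balance model: C * dT/dt = F(t) - lam * T,
# where T is the temperature anomaly (K), F the radiative forcing (W/m^2),
# C an effective heat capacity, and lam a feedback parameter (W/m^2/K).
def run_ebm(forcing, lam, heat_capacity=8.0, dt=1.0):
    temps = np.zeros(len(forcing))
    for i in range(1, len(forcing)):
        flux = forcing[i - 1] - lam * temps[i - 1]
        temps[i] = temps[i - 1] + flux * dt / heat_capacity
    return temps

years = np.arange(1900, 2001)
forcing = 0.02 * (years - 1900)  # synthetic, slowly ramping forcing
rng = np.random.default_rng(0)
# Fake "observations": the same model plus noise, with lam = 1.3.
observed = run_ebm(forcing, lam=1.3) + rng.normal(0, 0.05, len(years))

# "Tuning": brute-force search for the feedback parameter that best
# matches the observations -- the same observations that would later
# be used to "evaluate" the model.
candidates = np.linspace(0.5, 3.0, 251)
errors = [np.mean((run_ebm(forcing, lam) - observed) ** 2) for lam in candidates]
best_lam = candidates[int(np.argmin(errors))]
print(f"tuned feedback parameter: {best_lam:.2f} W/m^2/K")
```

Once a parameter has been calibrated against a temperature record, agreement with that same record says little about the model’s skill. That is precisely Dr. Curry’s point about tuning and evaluation.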

Since all models are “tuned” to the twentieth century (see Voosen, Science, 2016), and since the “more than half of warming is due to man” conclusion is based upon comparing two model runs “from 1951 to 2010,” the validity of the computed human influence is highly questionable. Dr. Curry points out:

“GCMs are evaluated against the same observations used for model tuning.”

This is not something that inspires confidence. Further, the Earth has been warming for 300 to 400 years, as Dr. Curry writes:

“Understanding and explaining the climate variability over the past 400 years, prior to 1950, has received far too little attention. Without this understanding, we should place little confidence in the IPCC’s explanations of warming since 1950.”

She adds:

“Anthropogenic (human-caused) climate change is a theory in which the basic mechanism is well understood, but of which the potential magnitude is highly uncertain.”

Precisely so.

Risk and Nuclear Power Plants

By Andy May

The financial risk is too great.

Updated post (2/21/2017)

In any discussion of the future of energy, nuclear power generation comes up. Once a nuclear power plant is built and operating, it can produce cheap electricity reliably for decades. Further, in terms of human health, some claim it is the safest source of energy in the U.S. Others, like Benjamin Sovacool, claim the worldwide economic cost of nuclear accidents (a total of $177B) is higher than for any other energy source, and that nuclear power is less safe than every other source of energy except hydroelectric power. Some of the costs could be due to an over-reaction to nuclear accidents, especially Chernobyl and Fukushima. Others have much lower fatality estimates than Sovacool; it is unclear how many later cases of cancer are, or will be, due to Chernobyl.

Permitting and building a new nuclear power plant is a problem because there have been more than 105 significant nuclear accidents around the world since 1952, out of an IAEA total of 2,400 separate incidents. Thirty-three serious nuclear accidents compiled by The Guardian are listed and ranked here and mapped in figure 1. As figure 1 shows, these incidents have occurred all over the world; some resulted from design flaws, like the Fukushima-Daiichi disaster of 2011, and some from human error, like the loss of a cobalt-60 source in Ikitelli, Turkey.

Figure 1: All nuclear power plant incidents. Source: The Guardian.


Oil – Will we run out?

By Andy May

“Prediction is very difficult, especially about the future” (old Danish proverb, sometimes attributed to Niels Bohr)

In November 2016, the USGS (United States Geological Survey) reported its assessment of a recent discovery of 20 billion barrels of oil equivalent (technically recoverable) in the Midland Basin of West Texas. About the same time, IHS researcher Peter Blomquist published an estimate of 35 billion barrels. Compare these estimates with Ghawar Field in Saudi Arabia, the largest conventional oil field in the world, which contained 80 billion barrels when discovered. There is an old saying in the oil and gas exploration business: “big discoveries get bigger and small discoveries get smaller.” As a retired petrophysicist who has been involved with many discoveries of all sizes, I can say this matches what I’ve always seen, although I have no statistics to back the statement up. Twenty or thirty years from now, when the field is mostly developed, the estimated ultimate hydrocarbon recovery will very likely be larger than either of these estimates.

The technology for producing this sort of shale oil was invented quite recently, well after Marion King Hubbert produced the “Hubbert curve,” which predicted that U.S. oil production would peak in the early 1970s. As Daniel Yergin points out in The Quest:

“Hubbert got the date right, but his projection on supply was far off. Hubbert greatly underestimated the amount of oil that would be found – and produced – in the United States. By 2010 U.S. production was four times higher than Hubbert had estimated – 5.9 million barrels per day versus Hubbert’s 1971 estimate of no more than 1.5 million barrels per day.”

A comparison of actual oil production with a version of Hubbert’s curve is shown in figure 5; this curve is slightly different from the one Yergin used. A sketch of the curve’s functional form follows the figure:


Figure 5, source
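For readers curious about the shape behind figure 5, here is a minimal sketch of the logistic “Hubbert curve” itself. The ultimate recovery, peak year and steepness below are hypothetical round numbers chosen only to illustrate the shape, not Hubbert’s actual fit.

```python
import numpy as np

def hubbert_curve(years, urr, peak_year, steepness):
    """Annual production from the logistic (Hubbert) model.

    urr: ultimate recoverable resource (volume units),
    peak_year: year of maximum production,
    steepness: growth-rate parameter of the underlying logistic.
    """
    x = np.exp(-steepness * (years - peak_year))
    # Derivative of the logistic cumulative-production curve:
    return urr * steepness * x / (1.0 + x) ** 2

years = np.arange(1900, 2051)
# Hypothetical parameters: 200 billion bbl ultimate recovery, 1970 peak.
production = hubbert_curve(years, urr=200.0, peak_year=1970, steepness=0.06)
print(f"peak production: {production.max():.2f} billion bbl/yr "
      f"in {years[production.argmax()]}")
```

The curve is symmetric by construction. The article’s point is that new technology keeps raising the effective ultimate recovery, so the right-hand side of the real production history need not mirror the left.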

Technically Recoverable Reserves

So clearly Hubbert’s Malthusian curve did not predict oil supply correctly; new technology has allowed us to tap oil that was not part of the potential supply when he did his calculation. Paul Ehrlich’s ominous 1968 prediction in The Population Bomb, that 65 million Americans would starve to death in the 1980s, was incorrect for the same reason. He could not have foreseen the green revolution, which included natural-gas-based fertilizer (the Haber-Bosch process) and Nobel Prize winner Norman Borlaug’s new hybrid strains of wheat, rice and corn. Some might say: Hubbert was wrong then, but what about tomorrow? Isn’t oil still a finite resource? Let’s examine that idea. Table 1 shows a rough estimate of the technically recoverable reserves of oil and gas known today, using only known oil and gas technology. More deposits will obviously be found, and technology will improve.


Table 1

The reserve estimates are in billions of barrels of oil equivalent (BBOE). NGL and oil volumes are presented as-is, and natural gas is converted to oil-equivalent using the USGS conversion of 6 MCF to one barrel of oil. The table includes the “proven” worldwide oil, gas and NGL reserves from BP’s 2016 reserves summary. It also includes the 2012 USGS estimate of undiscovered “conventional” oil and gas reserves, fully risked; the EIA estimate of unconventional shale oil and gas reserves; and the IEA oil shale (kerogen) and oil sands (bitumen) reserve estimates. Our estimate of 1,682 BBOE in worldwide unconventional shale oil and gas reserves is lower than the IEA estimate of 2,781. The spread in these estimates gives us an idea of how uncertain the numbers are. Our estimate of 781 BBO in oil sand bitumen reserves is lower than the IEA estimate of 1,000 to 1,500 BBO. So, please consider this table very conservative. Yet, it still implies a 148-year supply! The arithmetic behind a years-of-supply figure is sketched below.
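Here is a minimal sketch of the two calculations behind a table like this: converting gas volumes to barrels of oil equivalent at 6 MCF per BOE, and dividing total reserves by annual consumption to get years of supply. The totals below are illustrative placeholders, back-calculated only so the ratio lands near the 148-year figure; they are not the actual Table 1 entries.

```python
MCF_PER_BOE = 6.0  # USGS conversion: 6,000 cubic feet of gas = 1 BOE

def gas_tcf_to_bboe(gas_tcf):
    """Trillion cubic feet of gas -> billion barrels of oil equivalent.
    1 TCF = 1e9 MCF and 1 BBOE = 1e9 BOE, so the 1e9 factors cancel."""
    return gas_tcf / MCF_PER_BOE

# Illustrative placeholder totals (NOT the actual Table 1 entries):
oil_and_ngl_bboe = 4000.0            # hypothetical liquids total, BBOE
gas_bboe = gas_tcf_to_bboe(8000.0)   # hypothetical 8,000 TCF of gas
total_bboe = oil_and_ngl_bboe + gas_bboe

annual_consumption_bboe = 36.0       # hypothetical world oil+gas use per year
years_of_supply = total_bboe / annual_consumption_bboe
print(f"total: {total_bboe:.0f} BBOE -> {years_of_supply:.0f} years of supply")
```

The years-of-supply number is only as good as its two inputs, which is exactly why the spread between the BP, USGS, EIA and IEA estimates matters.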

The moral of the story? Never underestimate the ingenuity of mankind and never assume that technology is static. Also, the resources that technology recognizes today are not all the planet’s resources.


Energy and Society from now until 2040

By Andy May

ExxonMobil released its 2017 Outlook for Energy: A View to 2040 in mid-December. David Middleton has written that the report reveals wind and solar will supply a whopping 4% of global energy by 2040! He also reports that wind and solar capacity will grow, but that we will only be able to utilize 30% of the wind capacity and 20% of the solar capacity due to their intermittent nature. This is true, but the report has much more to say, and this year the nomination of ExxonMobil CEO Rex Tillerson for Secretary of State makes it even more important. Here we will cover some of the other numbers in the report. The effect of those utilization percentages on delivered energy is sketched below.
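As a back-of-the-envelope illustration of what those utilization (capacity factor) percentages mean for delivered energy, here is a minimal sketch. The installed capacities below are hypothetical round numbers, not figures from the Outlook.

```python
HOURS_PER_YEAR = 8760

def annual_twh(nameplate_gw, capacity_factor):
    """Delivered energy (TWh/yr) from nameplate capacity and capacity factor."""
    return nameplate_gw * HOURS_PER_YEAR * capacity_factor / 1000.0

# Hypothetical installed capacities, for illustration only:
for name, gw, cf in [("wind", 500.0, 0.30), ("solar", 500.0, 0.20)]:
    print(f"{name}: {gw:.0f} GW nameplate -> "
          f"{annual_twh(gw, cf):.0f} TWh/yr delivered "
          f"(vs {annual_twh(gw, 1.0):.0f} TWh/yr at 100% utilization)")
```

The gap between nameplate and delivered energy is why capacity growth alone overstates what intermittent sources will actually contribute.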

At the length truth will out

By Andy May

Some will recognize the title as part of a line spoken by Launcelot Gobbo in Act II, scene 2 of Shakespeare’s The Merchant of Venice. In plainer and much less poetic language, it means that the truth will be known eventually. The film version, with Al Pacino playing Shylock, is the best in my amateur opinion. The play is considered anti-Semitic by some, but it is actually a carefully considered analysis of bias and prejudice and their awful effect on people. It opens the dark world of prejudice to the light, uncovering it for all to see, in the same way Huckleberry Finn opens the injustice of slavery to the light of day. Sometimes we need to see all the bad out in the open, as uncomfortable as it may be to look at or discuss.

We are all seeing a shift in opinion about the so-called dangers of man-made climate change. It has taken a long time, but there is an awakening to the idea that the climate changes, that man’s activities may play some role in those changes, and that the role of man is likely small and not very dangerous. In any case, the public believes (correctly, in my opinion) that it is far too early to panic.

So, with all this in mind, I’d like to share a short lecture on bias, prejudice and intimidation in climate science by Dr. Willie Soon. The December 8, 2016 lecture can be seen on YouTube here or by clicking on figure 1; the slide deck he uses can be downloaded here.

Figure 1: Dr. Willie Soon delivering his talk on bias and intimidation in climate science. Click on the figure to hear the talk at YouTube.com.

The viciousness of the media attacks on the reputation of Dr. Soon is amazing, considering his long-established credentials as an astrophysicist and climate scientist. It was gratifying that Myron Ebell (Trump’s pick to lead the EPA transition) leapt to Dr. Soon’s defense when the New York Times shamefully published an unwarranted and untrue attack on him. The article was written by Greenpeace, yet the New York Times placed it on their front page (see Dr. Soon’s second slide) and unethically disguised it as an article by two of their reporters. Mr. Ebell’s exposé can be read here. Other sources that uncover the dishonesty in the New York Times’ reporting can be read here and here.

As bad as the attacks on Dr. Soon are, they pale in comparison to the attacks on science itself. Dr. Soon’s talk details the thorough corruption of the peer-review process in climate science. I was once on the publications committee of the SPWLA (Society of Petrophysicists and Well Log Analysts, publisher of the journal Petrophysics) and can assure you that the actions described by Dr. Soon are far more egregious and corrupt than any I have seen in a long publishing career.

Further, even though Dr. Soon had no direct support from the Southern Company, contrary to what Greenpeace and the New York Times claimed, what if he had? Why must the government be the only source of research funding? Clearly, the climate change debate is political and no longer scientific. We have the proponents of dangerous man-made climate change claiming the science is settled, and a “consensus” of (government-funded) scientists certifying it is dangerous. Since when has science ever been “settled?” Settled issues are political issues, not scientific issues. Likewise, consensus is a political word, not a scientific one. We have politicians, pundits and environmentalists who wish to jail “deniers.” The debate is plainly political, so one would hope that funding for climate research did not come entirely from the government; the government is clearly biased.

Dr. Soon clearly shows the bias against one point of view on climate change and the suppression of dissenting views, no matter how well they are supported. He shows that prejudice against dissenting views exists and that dissenters will be punished, humiliated and tormented. I have no illusion that my small post will heal his wounds or repair his reputation, but I hope it brings some daylight to this issue and illuminates the abusers. The only way to confront this sort of behavior, behavior that thrives in darkness, is to force it into the light. Shakespeare and Twain knew this; we need to follow their lead.

Global Cooling and Wikipedia Fake News

By Andy May

There is an excellent new post up at notrickszone.com on the global cooling scare of the 1970s and the efforts by the climate alarmists at realclimate.com to erase it from the record. For some, the scandal at Wikipedia over William Connolley deliberately posting false articles on climate and altering factual ones is old news; this is for those who missed the story. William Connolley created or rewrote 5,428 unique Wikipedia articles. “Fake news” is an old story, used extensively by radical climate alarmists and environmentalists. Indeed, Greenpeace seems to be based on the concept of fake news.

The following anecdote by author Lawrence Solomon is instructive. He tried to correct a Wikipedia article that stated Naomi Oreskes’s infamous consensus paper in Science had been vindicated and that Dr. Benny Peiser had conceded she was correct. Solomon had spoken with Dr. Peiser and confirmed he had said no such thing.

“Of course Oreskes’s conclusions were absurd, and have been widely ridiculed. I myself have profiled dozens of truly world-eminent scientists whose work casts doubt on the Gore-U.N. version of global warming. Following the references in my book The Deniers, one can find hundreds of refereed papers that cast doubt on some aspect of the Gore/U.N. case, and that only scratches the surface.

Naturally I was surprised to read on Wikipedia that Oreskes’s work had been vindicated and that, for instance, one of her most thorough critics, British scientist and publisher Benny Peiser, not only had been discredited but had grudgingly conceded Oreskes was right.

I checked with Peiser, who said he had done no such thing. I then corrected the Wikipedia entry, and advised Peiser that I had done so.

Peiser wrote back saying he couldn’t see my corrections on the Wikipedia page. I made the changes again, and this time confirmed that the changes had been saved. But then, in a twinkle, they were gone again. I made other changes. And others. They all disappeared shortly after they were made.”

Connolley was hardly the only offender; Kim Dabelstein Petersen and many others are also guilty. Rewriting history is not their only offense. They have also defamed eminent scientists such as Dr. Fred Singer, the first director of the U.S. National Weather Satellite Service; Dr. Richard Lindzen, a former MIT professor of atmospheric physics; and Dr. William Happer, a professor of physics at Princeton. Many others, like Willie Soon, Roy Spencer, John Christy and Judith Curry, have also been unfairly maligned.

Fake news has probably been with us for a very long time, but thanks to the Internet it is produced more quickly, and debunked more quickly, these days. I get a sense of déjà vu when reading this Britannica.com article on yellow journalism.

As noted in the notrickszone.com post, William Connolley and his team tried to show that the global cooling scare of the 1970s was a myth. They also tried to scrub Wikipedia of any mention of the Little Ice Age or the Medieval Warm Period: a perfect example of fake news, along the lines of the 97% consensus myths. They claimed only seven scientific papers of the period discussed global cooling, when in fact there are 163 papers on the subject, including seven that claim CO2 is causing global cooling and one report by the CIA. A complete list of the papers can be found in the post; it is well worth the time it takes to scan the informative summaries at the end.

Detection and Attribution of Man-made Climate Change

By Andy May

Chapter 10 of the 2013 IPCC Working Group 1 Assessment Report (WG1 AR5) deals with how man-made climate change is detected and how much of the total change is due to man. The authors call the chapter “Detection and Attribution of Climate Change: from Global to Regional,” but in the critical calculation they assume the natural contribution is zero, so we consider “man-made” an appropriate addition to the title of this post. In summary, the chapter says that the Earth’s surface has warmed since 1880 and that over half of the warming from 1951 to 2010 is due to man. That humans have some influence on climate is not in dispute; all major species have some influence on climate. Phytoplankton live throughout the oceans, which cover most of the Earth’s surface, and since they photosynthesize, they consume CO2 and produce sugars and oxygen. In all probability they have the largest effect on climate of any species, but we don’t know how much. Humans mostly live in urban areas, which occupy 3% of the Earth’s land area and 1.3% of the Earth’s surface. We burn fossil fuels and biomass, producing greenhouse gases (GHGs) that may have some net warming effect on the climate. Some laboratory measurements show a warming effect from CO2 and methane, but no such measurements have been made in the real world (see pages 883-884 in WG1 AR5).

Using satellite data, we can show that the radiative effect of greenhouse gases increased from 1970 to 1997. But measuring the net surface temperature effect of this increase has proven elusive. For an excellent discussion of the problems of predicting the warming effect of GHGs, see Richard Lindzen’s Remarks on Global Warming. In those remarks he notes that our measurements of global warming are ambiguous regarding man’s GHG emissions, and:

“Finally, we must turn to the models. It is from model results that our fear of profound greenhouse warming arises. …doubling CO2 will increase the downward flux at the surface by about 4 Watts/m2/sec; the solar flux in existing models must be adjusted by many times this quantity simply in order to get the present day global temperature correct.”
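For context on the “about 4 Watts” figure in the quote, here is a minimal sketch of the widely used simplified expression for CO2 radiative forcing (Myhre et al., 1998) and the no-feedback (Planck) temperature response of roughly 0.3 K per W/m2. Everything beyond that no-feedback number is the disputed territory Lindzen describes.

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified CO2 radiative forcing (W/m^2), Myhre et al. (1998)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

delta_f = co2_forcing(2 * 280.0)  # forcing for doubled CO2: ~3.7 W/m^2
planck_lambda = 0.3               # K per W/m^2, no-feedback (Planck) response
print(f"forcing for 2xCO2: {delta_f:.2f} W/m^2")
print(f"no-feedback warming: {delta_f * planck_lambda:.1f} K")
```

The entire climate sensitivity argument is over the feedbacks that amplify, or, per Lindzen and Choi, shrink, that roughly 1.1 K no-feedback warming.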

In their classic 2011 paper, Lindzen and Choi argue that CERES satellite data suggest the natural feedback to an increase in CO2 is negative. That is, it reduces the temperature increase due to CO2 rather than amplifying it, as the CMIP5 global climate models predict. So, in the absence of measurements, how has the IPCC separated the warming due to man from natural warming? After all, surface temperatures have been rising since the Little Ice Age, which ended in the late 19th century, just as we began to keep track of surface air temperatures worldwide.


A Summary of Meehl et al., 2016, and the Interdecadal Pacific Oscillation

By Andy May

In a comment on my earlier post on ocean cycles, Nick Stokes challenged my interpretation of a quote from the new Nature Climate Change paper by Meehl et al. The quote is from the abstract:

“Here we show that the largest IPO [Interdecadal Pacific Oscillation] contributions occurred in its positive phase during the rapid warming periods from 1910-1941 and 1971-1995, with the IPO contributing 71% and 75% respectively, [to the difference between the median values of the externally forced trends and observed trends.]”

Immediately after the quote, I wrote: “This directly contradicts the IPCC conclusion that man caused most of the warming between 1951 and 2010.” Nick posted more of the quote, adding the part in bold, and contested my interpretation that it contradicts the IPCC conclusion. We were both working from the abstract alone and couldn’t reconcile our differing interpretations without the full paper. Professor Curry has kindly sent me the full text, so I will summarize it and try to resolve the issue in this post. There are two lessons here. The first is that it is dangerous to draw conclusions from an abstract. The second is that abstracts should be written as stand-alone documents; they should not include jargon, such as “median values of externally forced trends,” that cannot be understood without reading the whole paper. Many people can only see the abstract.

Indeed, reading the entire paper, we can see what the authors mean. They compare observed global mean surface temperatures from HadCRUT4.4, GISS and NOAA to two global climate model results. The first, shown as a black line in figure 1a (the authors’ Figure 2a) and as a black dot in figure 1b, is the normal multi-model CMIP5 ensemble mean used in the 2013 IPCC WG1 AR5 report. The observations are shown in yellow (HadCRUT4.4), blue (GISS), and red (NOAA).

As pointed out in my previous post, I want to draw your attention to the poor fit between the model and the observations between 1910 and 1945. This period of warming is very similar to the recent warming from 1975 to 2009, but the ensemble model mean matches the later observed trend and not the earlier one.

To produce the data shown in figure 1b, the authors used a computer model (CCSM4) that has

“…been analysed the most extensively of any current climate model with regards to IPO processes and mechanisms, and compares favourably in those aspects to observations.”

They support this statement by referring to four papers here, here, here and here. Once the CCSM4 model output was generated, they added the “IPO-mediated trend distributions” from it to the multi-model ensemble of CMIP5 simulations. The ensemble mean includes both the CMIP5 natural and anthropogenic forcings. But, as we saw in figure 3 of the previous post, the CMIP5 natural forcings are essentially zero.

Figure 1b then compares the IPO (Interdecadal Pacific Oscillation) warm and cold phases to the observations (colored dots) and to the multi-model ensemble mean (gray boxes). The gray boxes show the “unadjusted” CMIP5 median value as a heavy black line and encompass the 25th to 75th percentile values; the whiskers encompass the 5th to 95th percentiles. The IPO-adjusted values are shown as red boxes when the IPO is in a warm phase and as blue boxes when it is in a cold phase. Again, the median value is a line, the colored box encompasses the 25th to 75th percentiles, and the whiskers the 5th to 95th. A sketch of how such a box-and-whisker comparison is constructed follows figure 1.

Figure 1
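For readers unfamiliar with this style of plot, here is a minimal sketch of building a percentile box-and-whisker comparison like the one in figure 1b. The trend “distributions” below are synthetic random numbers standing in for the CMIP5 and IPO-adjusted ensembles; only the plotting conventions (median line, 25th-75th percentile box, 5th-95th percentile whiskers) mirror the paper.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
# Synthetic stand-ins for ensemble trend distributions (deg C per decade):
unadjusted = rng.normal(0.20, 0.05, 300)    # "CMIP5 ensemble" trends
ipo_adjusted = rng.normal(0.14, 0.04, 300)  # "IPO-adjusted" trends
observed = 0.12                             # a single "observed" trend

fig, ax = plt.subplots()
# whis=(5, 95) puts the whiskers at the 5th and 95th percentiles; the box
# spans the 25th to 75th percentiles with the median drawn as a line.
ax.boxplot([unadjusted, ipo_adjusted], whis=(5, 95))
ax.set_xticklabels(["unadjusted", "IPO-adjusted"])
ax.axhline(observed, color="red", linestyle="--", label="observed trend")
ax.set_ylabel("trend (deg C / decade)")
ax.legend()
plt.show()
```

The question the paper asks is simply whether the observed dot falls closer to the adjusted or the unadjusted distribution in each period.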


Ocean cycles, The Pause and Global Warming

By Andy May

h/t Joachim Seifert

There is a new post by Dr. Sebastian Lüning and Professor Fritz Vahrenholt, translated by Pierre Gosselin, on the effect of ocean cycles on 20th century warming and the 21st century pause. They previously wrote about this in their popular book The Neglected Sun (in English here). Marcia Wyatt and Judith Curry have also written about the effect of ocean cycles here. These roughly 60- to 65-year cycles have the advantage of explaining both the warming from about 1910 to 1944 and the warming from 1975 to 2005 with the same mechanism. This is important, because the two warming events are very similar, as shown here and in figure 1.

Figure 1

In the IPCC WG1 AR5 document (page 887), the authors have a hard time explaining the earlier 20th century warming. The text is so convoluted we will not attempt to paraphrase it:

“Nonetheless, these studies do not challenge the AR4 assessment that external forcing very likely made a contribution to the warming over this period. In conclusion, the early 20th century warming is very unlikely to be due to internal variability alone. It remains difficult to quantify the contribution to this warming from internal variability, natural forcing and anthropogenic forcing, due to forcing and response uncertainties and incomplete observational coverage.”

Why is this warming “difficult to quantify” while the later warming is “extremely likely” due mostly to man? The two temperature profiles are nearly identical once the secular recovery trend from the Little Ice Age is removed. Yet the first is a mystery and the second is understood? We have a problem with this. A sketch of this sort of detrended comparison follows.
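Here is a minimal sketch of the comparison just described: remove a linear secular trend from a temperature series, then compare the residual trends over the two warming windows. The series below is synthetic, constructed with a roughly 62-year oscillation purely to illustrate the procedure; it is not real HadCRUT or GISS data.

```python
import numpy as np

years = np.arange(1880, 2011)
rng = np.random.default_rng(2)
# Synthetic series: slow secular recovery plus a ~62-year oscillation
# with troughs near 1910 and 1972 and peaks near 1941 and 2003.
temps = (0.004 * (years - 1880)
         - 0.12 * np.cos(2 * np.pi * (years - 1910) / 62.0)
         + rng.normal(0, 0.03, len(years)))

# Remove the linear secular trend fit over the full record.
coeffs = np.polyfit(years, temps, 1)
residual = temps - np.polyval(coeffs, years)

def decadal_trend(y0, y1):
    """Linear trend of the detrended series over [y0, y1], deg C/decade."""
    mask = (years >= y0) & (years <= y1)
    return 10 * np.polyfit(years[mask], residual[mask], 1)[0]

print(f"1910-1944 residual trend: {decadal_trend(1910, 1944):+.3f} deg C/decade")
print(f"1975-2005 residual trend: {decadal_trend(1975, 2005):+.3f} deg C/decade")
```

In this synthetic series the two windows show essentially the same residual warming, which is the pattern Lüning and Vahrenholt describe in the real record.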


CO2, Good or Bad?

By Andy May

The Earth’s dry atmosphere is 78% nitrogen, 21% oxygen and 0.9% argon. These are not greenhouse gases, and they total 99.9%, leaving little room for the greenhouse gases methane, carbon dioxide and water vapor. The amount of water vapor in the atmosphere varies a great deal with altitude and temperature. At low altitude and high temperatures (greater than 30°C or 86°F) over the ocean, it can reach 4.3% or more of the atmosphere, and the moist air is less dense than dry air, causing it to rise. It rises until the temperature is low enough for the vapor to condense to a liquid or solid and form clouds, rain or snow. A quick check of the 4.3% figure is sketched below.
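A minimal sketch of that check, using the Tetens approximation for saturation vapor pressure (one of several standard empirical fits). At 30°C and sea-level pressure it gives a maximum water vapor mole fraction of about 4.2%; warmer air holds more still.

```python
import math

def saturation_vapor_pressure_hpa(temp_c):
    """Tetens approximation for saturation vapor pressure over water (hPa)."""
    return 6.1078 * math.exp(17.27 * temp_c / (temp_c + 237.3))

temp_c = 30.0
surface_pressure_hpa = 1013.25
e_s = saturation_vapor_pressure_hpa(temp_c)
print(f"saturation vapor pressure at {temp_c:.0f} C: {e_s:.1f} hPa")
print(f"maximum water vapor fraction: {e_s / surface_pressure_hpa:.1%}")
```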

Carbon dioxide is emitted when animals and some microbes respire, from the oceans (which contain 93% of the carbon dioxide on Earth), and when plants or fossil fuels are burned. In the 1990s, fossil fuel emissions were about 3% of the carbon dioxide entering the atmosphere, according to the EPA. About half of the fossil fuel emissions were absorbed by the environment, mostly by the oceans, land plants, and marine algae. Additional carbon dioxide in the atmosphere is a powerful fertilizer; for a dramatic illustration of the effect, see this short YouTube video. Figure 1 shows the impact of additional carbon dioxide on pine trees under controlled conditions. The four CO2 levels tested are, from left to right, 385 ppm, 535 ppm, 685 ppm, and 835 ppm. The researcher holding the signs is Dr. Sherwood Idso.

Figure 1 (Pine trees grown at ambient CO2 and three higher CO2 concentrations under controlled conditions, source)
