Monthly Archives: May 2024

Job 1 in Dealing With Climate Change

If you accept my premise (from my previous post) that the best way of thinking about human contributions to climate change is to focus on fuels rather than emissions, it becomes fairly easy to prioritize the actions we should take to deal with the issue.

Here are CO2 emissions for each of the types of (non-renewable) fuels we use:

| Fuel | Emissions (g CO2/kWh of primary energy) | Emissions (g CO2/MJ of primary energy) |
| --- | --- | --- |
| Wood 1) | 0 | 0 |
| Wood 2) 3) | 367.6 | 102.1 |
| Lignite 3) | 398.7 | 110.8 |
| … Lusatia 3) | 399.6 | 111.0 |
| … Central Germany 3) | 371.6 | 103.2 |
| … Rhineland 3) | 407.3 | 113.1 |
| Peat 3) | 366.5 | 101.8 |
| Hard coal 3) | 338.2 | 93.9 |
| Gasoline 3) | 263.7 | 73.3 |
| Fuel oil 3) | 266.5 | 74.0 |
| Diesel 3) | 266.5 | 74.0 |
| Crude oil 3) | 263.9 | 73.3 |
| Kerosene 3) | 263.9 | 73.3 |
| Liquid petroleum gas 3) | 238.8 | 66.3 |
| Natural gas 3) | 200.8 | 55.8 |
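
A quick note on the two columns: they are the same numbers in different units, since 1 kWh = 3.6 MJ. Here is a minimal sketch (in Python, using a handful of values from the table above, with the non-zero wood row) just to make the figures easier to compare:

```python
# Emission factors taken from the table above, in g CO2 per kWh of primary energy.
# (For wood, the non-zero row of the table is used.)
EMISSIONS_G_PER_KWH = {
    "wood": 367.6,
    "lignite": 398.7,
    "hard coal": 338.2,
    "kerosene": 263.9,
    "natural gas": 200.8,
}

for fuel, g_per_kwh in EMISSIONS_G_PER_KWH.items():
    g_per_mj = g_per_kwh / 3.6  # 1 kWh = 3.6 MJ, which is all the second column is
    print(f"{fuel:12s} {g_per_kwh:6.1f} g CO2/kWh  =  {g_per_mj:5.1f} g CO2/MJ")

# Per unit of primary energy, relative to natural gas:
print(f"wood:     {367.6 / 200.8:.2f}x the CO2 of natural gas")
print(f"kerosene: {263.9 / 200.8:.2f}x the CO2 of natural gas")
```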

We can tell each other we want to stop using fossil fuels all we want–but even if we all agree (and many don’t… at all…) it won’t happen all at once. We have to start somewhere.

And that table makes it screamingly obvious where we should start. Remember that billions of people in the developing world use firewood and kerosene as their primary fuels. These fuels are not only among the worst in terms of CO2 emissions; they are horribly polluting and cause hundreds of thousands of deaths every year. Burning firewood not only emits CO2; it also destroys forests, which absorb CO2.

Clearly we should target those two fuels for elimination. And frankly, I don’t really care what replaces them. I would love for solar or wind power to magically pop up in villages across the globe (and NGOs and faith-based organizations have been putting solar panels up in villages for more than 40 years–just not enough of them), but if we can’t get pure green solutions I will accept natural gas, oil or even coal–just to get them off firewood and kerosene.

This won’t solve climate change. But it is a first step that just about everybody should agree is good and feasible. And if you want to do something closer to home, here are two suggestions:

  1. If you own a home in the northern hemisphere, paint your roof white. This will reflect sunlight back to space and decrease global warming. Not by much… but tell your neighbor.
  2. Wash your clothes with cold water. This will reduce fuel consumption and lower CO2 emissions. Not by much, but tell your neighbor.

Climate Change–15 Years After

I started researching climate change in 2009. I was working as an energy analyst covering renewables–mostly solar, but a bit of wind power and biofuels.

The more I read about climate change and CO2 emissions, the more I became convinced that the discussion was horribly miscast. It was fairly obvious that temperatures were rising, as were our emissions of CO2, and it was pretty logical to look for a link between the two. Indeed, scientists had done that and were fairly convinced the link was there. I saw no reason then to dispute this, nor do I now. Climate change is real and we are a major contributor to the change. It seemed that my study of energy might cast a useful light on the subject.

But in terms of understanding what the impacts would be and what we could do to either prevent further changes or deal with any changes already happening, looking at CO2 emissions was not really helpful.

CO2 emissions, the end culprit for much of the damage we are causing, are themselves a product of energy consumption. The world burns a ‘portfolio’ of fuels to power daily life. Some fuels emit more CO2 when consumed than others. Orienting this portfolio away from the more emissive fuels and toward the less emissive (or non-emissive) ones seemed like an easier way of understanding both the problem and any potential solutions.
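
To make the ‘portfolio’ framing concrete, here is a toy sketch. The emission factors are the g CO2/kWh figures from the table in the post above; the fuel shares are invented purely for illustration and are not real data.

```python
# Toy illustration of the 'fuel portfolio' idea: weight each fuel's
# emission factor by its (hypothetical) share of total energy use.
# Shares below are invented for illustration; factors are g CO2/kWh
# as in the table above.

def portfolio_intensity(shares, factors):
    """Weighted-average CO2 intensity of a fuel mix, in g CO2/kWh."""
    assert abs(sum(shares.values()) - 1.0) < 1e-9, "shares must sum to 1"
    return sum(share * factors[fuel] for fuel, share in shares.items())

FACTORS = {"coal": 338.2, "oil": 263.9, "natural gas": 200.8,
           "wood": 367.6, "zero-carbon": 0.0}

today = {"coal": 0.30, "oil": 0.30, "natural gas": 0.20,
         "wood": 0.10, "zero-carbon": 0.10}          # invented shares
shifted = {"coal": 0.20, "oil": 0.25, "natural gas": 0.30,
           "wood": 0.00, "zero-carbon": 0.25}        # invented shares

print(portfolio_intensity(today, FACTORS))    # ~258 g CO2/kWh
print(portfolio_intensity(shifted, FACTORS))  # ~194 g CO2/kWh
```

The point of the sketch is simply that you can track progress as a single number (the weighted intensity of the mix) rather than trying to reason directly from emission totals.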

The worst fuels we can burn are firewood, which is still the primary fuel source for about 2 billion people, and kerosene, which serves a similar number of people. Both fuels are dominant in the developing world.

Coal is next, and the world uses a lot of it, in both the developed and developing worlds. Oil follows and is the fuel of choice for transportation and a lot of heating. Natural gas is much cleaner in terms of CO2 emissions: it puts out roughly half as much CO2 as lignite, about 60 percent as much as hard coal, and about three-quarters as much as oil. It is often talked about as a ‘bridge fuel’ to help us clean up our mess while waiting for true renewables to dominate our portfolio.

Zero emission fuels are not exactly equivalent to renewables–hydroelectric power and nuclear power are zero emission, but are not counted as true renewables for various reasons, mostly political. Solar, wind and biofuels are the renewables of choice.

It was clear to me in 2009 that we could switch to a much cleaner portfolio of fuels. My calculations showed that it would take about 50 years and cost about 23 trillion US dollars. (Those figures have held up pretty well, if you adjust for inflation. I’m pretty proud of that.)

The real problem my research identified way back then was that all the big energy monitoring agencies seemed to be wildly underestimating how much energy we were going to be using in 50 years. The US Department of Energy estimated in 2010 (my research was taking quite a long time) that the world would consume about 800 quadrillion BTUs (Quads) in 2040. My calculations showed a figure of about 965 Quads. My fear was (and still is) that we would not prepare for a world with much higher fuel use, and that this would lead us by default into using dirtier fuels, which are easier to extract and get ready for use. (Turns out I was pretty close to right: the Department of Energy just updated its forecast for 2040 to 940 Quads.)

But my major point is that by focusing on fuels instead of emissions, we will find it far easier to plan the energy transition, far easier to measure our successes and failures, and far easier to assist the developing world, which sure as hell didn’t create this problem but whose help we desperately need in solving it.

Happy Wednesday, everybody! We will get there–my calculations also show that we will eventually solve this problem and well before we cause permanent or widespread damage to this planet. More on that later.

Sense and Sensitivity

There is only one important scientific question regarding climate change: Is atmospheric sensitivity high or low? If it is high, the planet has big problems. If it is low, we can deal with global warming with the technologies and societal mechanisms that currently exist. Equilibrium climate sensitivity is defined as the “change in global mean temperature, T2x, that results when the climate system attains a new equilibrium with the forcing change F2x resulting from a doubling of the atmospheric CO2 concentration.” (There is another ‘sensitivity’ metric, the transient climate response, which measures the warming at the time concentrations actually reach that doubling, before the system has fully equilibrated.)
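
For readers who want the definition as a formula, the standard linear-feedback framing (textbook climate physics, not anything specific to this post) is:

$$
\Delta T_{2\times} = \frac{F_{2\times}}{\lambda}, \qquad
F_{2\times} \approx 5.35 \,\ln(2)\ \mathrm{W\,m^{-2}} \approx 3.7\ \mathrm{W\,m^{-2}},
$$

where $\lambda$ is the net climate feedback parameter in W m⁻² per degree of warming. Weak stabilizing feedbacks mean a small $\lambda$ and a high sensitivity; strong stabilizing feedbacks mean a large $\lambda$ and a low sensitivity.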

Those most alarmed by the prospect of future warming naturally incline to the belief that sensitivity is high—that our climate will react quickly to changes and the addition of massive quantities of CO2 will cause the creation of yet more greenhouse gases, principally in the form of water vapor. This will set off a chain reaction that leads to skyrocketing temperatures—many of those most concerned talk frequently about temperature rises in this century of 6 degrees Celsius, or 11 degrees Fahrenheit. They arrive at these numbers by assuming that atmospheric sensitivity is even higher than the 4.5C that is at the high end of the IPCC projections. As human emissions of CO2 are certain to increase, the feedback loop will be mutually reinforcing—more CO2 leads to more heat and more water vapor creating even more heat.

Skeptics believe sensitivity is low—many believe it is below 1C. If they’re right then global warming this century will be about the same as in the 20th Century, which was not extreme and did not pose much of a problem.

Today we’re diving deep into the world of climate modeling – specifically, the CMIP (Coupled Model Intercomparison Project) and its poster child, Equilibrium Climate Sensitivity (ECS).

For years, CMIP models have been the darlings of the climate establishment, trotted out to paint a picture of runaway warming driven by CO2 emissions. But are these models all they’re cracked up to be? Let’s take a critical look and see if the numbers truly stack up.

Why CMIP Models Even Exist

First, a little background. CMIP models are essentially elaborate computer simulations designed to mimic the Earth’s climate system. They factor in atmospheric, oceanic, and land-based processes, all in an effort to predict future climate scenarios.

The idea behind CMIP is to have a standardized framework for these simulations. This allows for “apples-to-apples” comparisons between models developed by different research groups. In theory, this should lead to a more robust understanding of future climate.

The Cracks Start to Show: Problems with CMIP Models

But here’s the rub: These models are far from perfect. Here are some key issues to consider:

  • Complexity vs. Accuracy: CMIP models are undeniably complex, but complexity doesn’t equal perfect accuracy. They rely on a myriad of assumptions and simplifications to represent the Earth’s climate system. These simplifications can introduce errors and uncertainties into the final projections.
  • Resolution Limitations: Computational power restricts the models’ spatial resolution. This means they might struggle to capture regional climate variations or small-scale phenomena that could influence global trends.
  • Natural Variability vs. Human Influence: The Earth’s climate naturally fluctuates. Separating this inherent variability from the long-term warming trend caused by human activity is a challenge. Short-term observational data might not fully capture this variability, potentially leading to skewed projections.
  • Feedback Loops: Climate change can trigger feedback loops, where changes in one aspect of the system influence others. Modeling these feedback loops accurately remains a challenge for CMIP models. If a crucial feedback mechanism is missing, it can significantly impact the model’s sensitivity to CO2 increases.
  • Data Gaps: CMIP models rely on observational data to parameterize processes and validate simulations. Gaps or limitations in this data can introduce uncertainties into the projections.

The Ensemble Fallacy: Averaging Out the Answer?

Now let’s talk about how CMIP utilizes ensembles, collections of multiple models. While ensembles seem like a good idea on the surface, there are potential dangers associated with using them:

  • Averaging Out the Truth: Imagine a group of blindfolded dart throwers who all share the same pull to one side. Averaging their landing points gives you a reassuringly precise estimate, but it sits off to that side of the bullseye, not on it. Similarly, averaging the results of CMIP models that share flawed assumptions can mask the “true” answer rather than reveal it, especially if some models are significantly off-base (see the sketch after this list).
  • Garbage In, Garbage Out: If all the models in the ensemble share fundamental flaws in their underlying physics or data, the ensemble results will still be flawed, regardless of the number of models included. An ensemble is only as strong as its weakest member.
  • Difficult Interpretation: Ensembles can produce a wide range of projections, which can be difficult for policymakers and the public to interpret. Focusing solely on the average value might downplay the potential for more extreme scenarios projected by some models.
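
To make the shared-bias point concrete, here is a small illustrative simulation. The numbers are entirely made up (this is not CMIP output): each pretend “model” samples its sensitivity estimate around a value that carries a common systematic error, and the ensemble mean settles on that biased value no matter how many models you average.

```python
import random

# Purely illustrative: these numbers are invented, not CMIP results.
random.seed(42)

TRUE_ECS = 2.5        # assumed "true" sensitivity (hypothetical)
SHARED_BIAS = 1.0     # systematic error shared by every model (hypothetical)
MODEL_SPREAD = 0.4    # independent, model-specific noise

def run_ensemble(n_models):
    """Each 'model' samples around the biased value, not the true one."""
    return [random.gauss(TRUE_ECS + SHARED_BIAS, MODEL_SPREAD)
            for _ in range(n_models)]

for n in (10, 100, 1000):
    ensemble = run_ensemble(n)
    mean = sum(ensemble) / n
    print(f"{n:>5} models: ensemble mean ECS = {mean:.2f} (true value {TRUE_ECS})")

# The mean converges toward TRUE_ECS + SHARED_BIAS, not TRUE_ECS:
# adding more models does not cancel an error they all share.
```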

The latest CMIP6 models project a higher ECS value compared to previous iterations. This suggests that the Earth might be more sensitive to CO2 increases than previously thought.

However, some highly respected scientists, like Sherwood et al. (2020), argue that these high-sensitivity models might be overestimating the true effect. They point to discrepancies between model projections and observational data.

The scientific community is still grappling with this issue. While the vast majority agree on human-caused climate change, the exact degree of sensitivity remains a point of debate.

So where do Lukewarmers stand on this issue? Enthusiasts at both extremes have wrongly claimed the issue has been settled in their favor. This has not helped anyone. We don’t know what atmospheric sensitivity is. We don’t even know if there is one and only one value for atmospheric sensitivity. We are not likely to know for sure for 30 or more years, according to climate scientist Judith Curry, among others.

But remember that the very definition of a Lukewarmer is someone who believes that atmospheric sensitivity is lower than the Alarmists claim. I for one think it is around 2.1C or a bit lower. Why on earth, if I say it cannot be determined with accuracy at this point, am I willing to ‘bet the planet’ on my belief/intuition/opinion/prayer/limited understanding of incomplete science that it is low? Alarmists are actually more upset with Lukewarmers than with skeptics for this very reason.

Well, for starters, remember that the range of potential sensitivity values given by the Intergovernmental Panel on Climate Change (IPCC) includes low values as well as high. They estimate that the value of sensitivity could fall anywhere between 1.5C and 4.5C. So my ‘preference’ of 2.1C is well within the accepted range of possibility.

So, climate models are not ‘evidence.’ They are simulations. Neither our lukewarm guess (hope?) of below 3C nor the CMIP models’ extension of the upper range of possible sensitivities toward 6C is evidence, and neither really has strong evidence for or against it yet.

So in my next piece we will look at observations to see if they indicate something that the models cannot.

The Lukewarmers Win

A decade later. I think it’s past time to look at what was said here and elsewhere a decade ago, to see how well we all held up.

My thesis will be that we Lukewarmers decisively won the debate about most elements of the climate conversation: that our acceptance of the basic physics explaining anthropogenic contributions to climate change was appropriate; that our identifying a number of actors who were over-hyping the climate ‘threat’ as existential and demanding our utmost efforts was useful; and that our criticism of those actors was proper, indeed necessary.

But most of all, any claim of a Lukewarmer win needs to look hard at atmospheric sensitivity to a doubling of the concentration of CO2. After all, this was Steve Mosher’s stake in the ground: that given an over/under bet on a sensitivity of 3C, he would take the under. That’s really all it takes to be a Lukewarmer, but it’s kind of a big thing.

Our next post will look at the estimates of sensitivity generated by CMIP6 models and the criticism those models have received.

Ooh, it’s nice to be back!

I’m baaaack!

In 2013 the Energy Information Administration (part of the US Dept. of Energy) projected that the world would consume 810 quadrillion BTUs (Quads) in 2040.

I disagreed. My rough calculations led me to the conclusion that the figure would be about 965 Quads in 2040.

As of this year, the EIA has changed its projection to 940 Quads.
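
For what it’s worth, a quick back-of-the-envelope check on how the three figures compare (just arithmetic on the numbers above):

```python
eia_2013 = 810   # EIA's 2013 projection for 2040, in Quads
mine     = 965   # my rough estimate for 2040, in Quads
eia_now  = 940   # EIA's current projection for 2040, in Quads

print(f"my estimate vs. current EIA:  {(mine - eia_now) / eia_now:+.1%}")      # about +2.7%
print(f"2013 EIA vs. current EIA:     {(eia_2013 - eia_now) / eia_now:+.1%}")  # about -13.8%
```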

I’m really glad to be correct. I’m really worried about the fuel portfolio we will be using to deliver those Quads.

C’mon, solar! C’mon, nuclear!