In the previous post, I looked at two questions:

What would happen if we deployed a sensible “bundle” of policies all at once?
How do individual policies compare, within that bundle?

In this post, I’ll continue by looking at:

What about at a “maximum” level?
What does maximum deployment look like?
In the previous post, I looked at a few of the “big impact” policies. At the time, I focused on the “aggressive” deployment level. In this section, I want to switch gears to the “maximum” deployment level and dive a little deeper.
Three years ago, an excellent report advanced my understanding of which transportation/land use policies can really help to tackle climate change. From all appearances, that report has disappeared beneath the waves without a trace; I’ve met few policy advisors who have read it.
The report is Moving Cooler, written by consultants at Cambridge Systematics in 2009. It’s a non-academic technical piece with good math but poor messaging and graphics, and while there was some promotion during the study process, I’ve seen no follow-through. Together, these factors probably explain why it went unnoticed.
I’ll walk through the report findings in four stages:
Scope and approach
What would happen if we deployed a sensible “bundle” of policies all at once?
How do individual policies compare, within that bundle?
What about at a “maximum” level? (Discussed in part two.)
My own perspective in this post will be to understand two questions:
How quickly can we realistically hope to reduce emissions within the transportation sector?
Which policies offer the greatest potential at a reasonable cost? (This will necessarily ignore other considerations, such as equity or acceptability. To me, the first question is “does it work?” and only then is it worth asking “is it fair?” and “is it politically realistic?”)
The report itself was never freely available, but executive summary material and appendices were online until recently. As the report website has now disappeared, I’ve reposted a few of the freely available items for reference:
In the web world, the public conversation about climate change on mainstream discussion and media websites is quite distorted. The comments sections are filled with poorly informed opinions (on all sides of the debate), personal attacks and disinformation. (It’s widely documented that the fossil fuel industries and other motivated groups are pouring significant amounts of money into propagating disinformation.) There’s really no meaningful “conversation” happening on most sites, the comments usually shed little real light on the subject at hand, and I don’t think anyone changes their mind by reading the comments. See some examples here
I find it fascinating how real conversations are underway in some places on the web, though. Slashdot is a computer-science geek site I’ve been reading for 10+ years, with a very politically diverse spectrum of people on it. If you look at most political articles there, the discussion is all over the place — left wing vs. right wing vs. libertarians vs. communists vs. anarchists vs. more libertarians.
But the forum is moderated, and well-moderated. And when it comes to climate change articles, the audience is technical, science-reading, and smart. Of the thousands of comments posted on each article, the readers vote the “best” ones up in a fairly effective manner using volunteer moderators and meta-moderators. Most heavily-debunked “denier” talking points are thoroughly rebutted and often voted right out of view. See for example
My recollection from reading Slashdot 5-6 years ago was the opposite — that the debate was much more vigorous, with denier talking points being moderated up equally with climate science talking points. But after scanning several 2004-07 posts, I don’t actually see that — Slashdot’s community and moderation system seems to have effectively rebutted the denier talking points for quite a while. See for example
It speaks to me about the value of websites with a real defined “community” and good moderation systems. It’s possible to learn a lot about the subject by reading Slashdot’s comment sections today, even though the site’s articles alone are minimal or poor. It’s also possible to take an interest in the subject without being repulsed by the constant name-calling, YELLING and bad writing in most newspapers’ comment sections…
This also makes me wonder if our political system can gauge public opinion effectively. Is the “public’s opinion” the raw, unmoderated comments of a newspaper — or is it what would emerge if online newspapers used a fairly democratic / meritocratic moderation system like Slashdot’s? (Sure, political offices rely primarily on polling; but they still learn the nuances of public opinion from letters to the editor and other such sources.)
Joe Romm discusses a plausible worst case scenario for climate change. The UK’s Hadley Centre model was used to look at the effects of continued growth in fossil fuel use, including carbon feedbacks – and the results aren’t pretty. In 10% of the model runs, very high temperatures are seen as soon as 2060, with rises of 6–12°C over North America.
I’ve been busy reading more on transportation and climate change over the last few months. Before I post any more “big” articles, I wanted to take a moment to praise one pair of figures from David MacKay’s great book, Sustainable Energy Without the Hot Air. It’s available free online, but I thoroughly recommend buying the elegantly typeset, figure-filled hard copy.
MacKay’s great strength lies in communicating numbers: using these simple, visual representations, a single consistent set of units throughout the book, and back-of-the-envelope calculations that are designed to illustrate the cases at hand, he’s assembled a solid book. MacKay doesn’t cover the politics or economics, and he doesn’t frame the book in terms of climate change, although that’s clearly the key underlying motivation.
Here’s the background: an example of a simple visual story, showing why burning fossil fuel reserves represents a massive change to the climate system. I’m going to paraphrase MacKay to cut to the chase, but if you prefer, please read MacKay’s original text, pages 241-243.
Throughout the book, MacKay uses a type of figure where the area of a box represents its size. The bigger the box, the bigger the number, and it’s really easy to visually compare different boxes.
“Until recently, all these pools of carbon [shown in the above figure] were roughly in balance: all flows of carbon out of a pool (say, soils, vegetation, or atmosphere) were balanced by equal flows into that pool. The flows into and out of the fossil fuel pool were both negligible. Then humans started burning fossil fuels. This added two extra unbalanced flows, as shown [in Figure 2].”
The flows shown in Figure 2 (right) are from fossil fuels to the atmosphere, and from the atmosphere to the surface waters. As shown, roughly one quarter of the fossil carbon winds up in the oceans on a short timescale of 5-10 years; recent research indicates that this uptake may be declining as the surface waters saturate, however. Some carbon is moving into vegetation and soil, perhaps 1.5 GtC/y, but this is less well measured. What is observed today is that roughly half of the fossil carbon stays in the atmosphere.
Now, throw one more number into the mix: if fossil fuels remain our dominant energy source, then carbon pollution under “business as usual” is expected to emit 500Gt of carbon over the next 50 years, with roughly 100Gt expected to be stored in the surface waters of the ocean (at 2Gt/year). By looking at the figure, it is clear that the amount of carbon in the fossil fuel supply dwarfs the carbon in the atmosphere.
The figure shows quite clearly that if we’re talking about burning a large fraction of the remaining fossil fuels, we’re talking about major disruptions in the carbon balance. Something has to take up the carbon—either the atmosphere, surface waters, soils or vegetation, and all data to date suggests that the atmosphere is where the majority of the carbon winds up. Given the size of the fossil fuel slice, it’s easy to see how a doubling or tripling of atmospheric CO2 could happen within 50-100 years.
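The back-of-envelope arithmetic above can be checked in a few lines. This is only a sketch in MacKay’s spirit, not a carbon-cycle model; the 500 GtC, 2 GtC/year and one-half figures are the ones quoted in the text.

```python
# Back-of-envelope check of the carbon numbers above (MacKay-style).
# Input figures are taken from the text; the rest is simple arithmetic.

emitted_gtc = 500          # business-as-usual fossil carbon over the next 50 years (GtC)
years = 50
ocean_uptake_per_year = 2  # GtC/year absorbed by the ocean surface waters
airborne_fraction = 0.5    # rough observed share of fossil carbon staying in the atmosphere

ocean_uptake = ocean_uptake_per_year * years      # carbon stored in surface waters (GtC)
stays_in_air = airborne_fraction * emitted_gtc    # carbon added to the atmosphere (GtC)

print(f"Ocean surface uptake: {ocean_uptake} GtC")    # 100 GtC, matching the text
print(f"Added to atmosphere:  {stays_in_air:.0f} GtC")
```

Roughly 250 GtC landing in the atmosphere within 50 years is what makes a doubling of atmospheric CO2 so plausible.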
And yes, there’s a lot of carbon stored in the deep ocean—but as MacKay makes plainly clear in his discussion, the timescale for ocean mixing is thousands of years. “On a time-scale of 50 years, the boundary [between the surface waters and the rest of the ocean] is virtually a solid wall.”
Plus, the figure shows two risk areas: vegetation and soils, via vicious cycles. If the high temperatures (caused by high atmospheric carbon) kill off vegetation or melt the tundra and release methane, large stores of carbon from the vegetation and soils categories could easily wind up in the atmosphere—and from the size of those boxes, it’s clear that could be a catastrophe. (As it happens, the IPCC A1FI scenario expects emissions of 1800-2500 GtC by 2100—I’d guess that means that much of the easily accessible fuels are burned, plus a substantial net release of carbon from vegetation and soils.)
At any rate, I like these figures because they clearly show this concept. By using a simple visual representation of quantity, and separating the steady state out from human-driven changes, MacKay cuts to the heart of the matter. Now, for comparison, here’s an IPCC figure (from their technical reports) showing carbon stores and fluxes (source is AR4 figure 7.3):
Confused yet? Black is preindustrial, red is human-caused. The IPCC figure can certainly be understood if you spend the effort, but it doesn’t leap off the page. Admittedly, it’s from a technical chapter that’s trying to communicate carbon fluxes rather than carbon stores; the comparison isn’t entirely fair. The IPCC has produced simpler figures (for other topics) in their policy-oriented chapters, but they could still borrow a few pages from MacKay’s book.
I’ve been a little baffled by the tar sands’ villainization in the climate literature for some time. While a few photographs can clearly show that the tar sands have dire impacts on the local environment (as a recent National Geographic special showed), I haven’t really understood why the tar sands have been singled out for so much venom in the climate literature. About two years ago, I first saw the stats: if you measure all emissions from extraction to the tailpipe (a “well-to-wheel” basis), the tar sands are between 15–40% worse than conventional oil. Why, then, should tar sands oil be treated as so much worse than—say—a vehicle with 15–40% poorer fuel efficiency? Here in Canada, the tar sands represent a lot of potential wealth and jobs; why should our oil get singled out relative to Saudi crude?
My reading recently took me past a figure that illuminated the problem for me in many ways. The figure below is adapted slightly from Farrell and Brandt, “Risks of the Oil Transition,” Environmental Research Letters 1(1), 2006, although I’m sure it can be found in many places in the literature.
The horizontal axis here represents the number of barrels of oil possible with each source, the vertical axis represents the greenhouse gas emissions per barrel, and the area of each bar is therefore the total emissions if all possible fuel of that type is extracted and used.
The black bar to the left of the axis represents the emissions from all oil burned to date. Everything to the right of the axis represents the potential emissions from conventional and unconventional oil, in order of price per barrel. The oil used to date is dwarfed by what remains in the ground and could be emitted in the future.
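The “area of each bar” idea reduces to a one-line calculation. The numbers below are purely illustrative placeholders of my own, not values from Farrell and Brandt:

```python
# Total emissions as bar area: width (resource size, barrels) times
# height (carbon intensity, tonnes C per barrel).
# The example figures are hypothetical, for illustration only.

def total_emissions(barrels, emissions_per_barrel):
    """Return total carbon emitted if the entire resource is extracted and burned."""
    return barrels * emissions_per_barrel

# hypothetical resource: 1 trillion barrels at 0.12 tC per barrel
carbon = total_emissions(1.0e12, 0.12)
print(f"Total potential emissions: {carbon / 1e9:.0f} GtC")
```

A dirtier fuel (taller bar) or a bigger resource (wider bar) both inflate the area the same way, which is why the unconventional sources on the right of the figure loom so large.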
“In a worst-case scenario, where no action is taken to check the rise in greenhouse gas emissions, temperatures would most likely rise by more than 5°C by the end of the century.”
—Dr. Vicky Pope, head of climate change predictions at the UK’s Hadley Centre, Dec. 2008
“Without a change in policy, the world is on a path for a rise in global temperature of up to 6°C.”
—International Energy Agency, World Energy Outlook, Nov. 2008 [traditionally a very conservative agency]
“We are on the precipice of climate system tipping points beyond which there is no redemption.”
—James Hansen, director, Goddard Institute for Space Studies (NASA), December 2005
One year ago in a previous post, I thought these types of projections were alarmist and unwarranted. Since then, I’ve found steadily more people suggesting similarly dramatic numbers. Many believe that an average global warming of 1°C is unavoidable, and some claim that—if we do nothing to stop it—we could be headed for 5°C or more. [Update: and that’s just the global average. Over inland North America, you can add roughly another 50%, for a total of 7.5°C. Coastal North America would see lower temperature changes, but would face the dire impacts of sea level rise of 0.8 – 2.0 metres.]
The IPCC Numbers
Is there a definitive scientific source for these projections, however? With a little digging, I’ve found similar estimates from the Intergovernmental Panel on Climate Change (IPCC) itself, a source that I consider authoritative. The figure below shows annual emission scenarios (left) and temperature effects (right) for six different scenarios. For reasons I’ll explain later, I believe A1FI and A2 are the two scenarios that correspond most closely with a “business-as-usual” approach to carbon emissions. A1B is also worth considering, as a scenario for the case where full “business-as-usual” growth proves impossible due to limited fossil fuel supplies. (While the peak of conventional oil production is almost certainly around the corner, this scenario would require constrained supplies of all fossil fuels—including coal, tar sands oil, shale oil and natural gas.)
I’ve finally found the climate change book that I can recommend widely: Hell and High Water by Joseph Romm, 2006. I urge you to read it, as soon as you can.
Why do I recommend this particular book?
Clear and accessible. A wide audience can appreciate this book: the language is simple, the opening sections are organized into a story arc, and the numbers and scientific details are kept to a minimum.
Smart science and smart politics. There are dozens of excellent books on the science alone, and a few good political ones, but very few that really understand both.
Solutions, not just problems. If you’re going to paint a picture of a monumental challenge to humanity, have the grace to show how we might solve it.
Alarming, but not alarmist. The problem is a daunting one, but Romm does not exaggerate it or lend any more terror to the issue than warranted by the science.
Urgency. The problem is truly urgent, and demands immediate action within the next decade. The reasons for this are challenging to explain, but essential to understand.
Achievable fixes, not impossible dreams and moral lessons. This isn’t an environmentalist’s rant about the evils of consumerism. Romm spares us the sermons and gives a clear illustration of the way forward, without requiring massive change from hundreds of millions of unwilling people.
Good references. The book itself is not complete, but Romm’s blog answered many of my follow-on questions after I finished the book.
Joe Romm has a blog (ClimateProgress), but I really recommend reading his book first. The blog is more technical, but it’s also less suitable simply because it’s a blog and not a book: less structured, more opinionated, and often carrying depressing news. That said, it’s a great place to continue learning about the subject after you finish the book. The package of policies he proposes in the book for 2010-2060 is part of a path leading to 550ppm CO2, based on the 2006 consensus. Since then, a number of scientific findings have pushed Romm to prefer a 450ppm target, and he has put forward a series of posts on the subject; I’d suggest starting with the climate change impacts discussion [updated], then the full solutions package [updated March 2009], and then reading the rest of the series. But really, get the book first.
If you’re a climate skeptic who is willing to be persuaded by good arguments, please give the book a try. You may need to suspend your disbelief for a while, and in particular set aside some of the disinformation you’ve likely absorbed from the popular media. Keep the skepticism—it’s healthy!—hear the argument out, and then follow up with one of the dozens of books on the science itself, or one of the documentaries showing how the climate denial industry has waged a relentless P.R. campaign to prevent action on this issue. If you’re a science major or researcher, you can probably understand how the IPCC process works, and can dig deep into the IPCC reports themselves and inspect the arguments first-hand. If you’re not a science major, the whole process is much more opaque and the denial industry arguments can sound persuasive; I don’t have a good book to make the case to skeptical non-scientists yet.
Coming up soon: a review of the urgency, and a look at the transportation component of Romm’s plan.
This is a follow-up to my earlier post about Monbiot’s book on climate change. In that post, I stated that I was interested in long-term emissions targets because they will probably constrain transportation planning over the course of my career. Now that I’m looking at the issue more closely, I’ve found some relevant research: a great report from Robin Hickman and David Banister in the UK, Visioning and Backcasting for UK Transport Policy or VIBAT. (Reference courtesy of Todd Litman, VTPI.) It looks at the transport problem in the UK through a lens similar to Monbiot’s, but with considerably more rigour. For the record, VIBAT is not yet published in a peer-reviewed journal, although it has been presented at academic conferences. To date, I have only read the executive summary and skimmed the rest.
Domestic transportation in the UK emitted 39 MtC/year (megatonnes of carbon per year) in 1990, the Kyoto baseline. It rose slightly to 41 MtC/year by 2000, and is projected to rise to 52 MtC/year by 2030 in a “business-as-usual” scenario. A recent Department for Transport white paper suggested new policies for the UK, and projected that the 2030 level would be 38 MtC/year if those policies were adopted, a very small reduction from 1990 levels.
Hickman and Banister took a more dramatic approach. They chose a target of 60% reduction in domestic UK transportation emissions by 2030 from the standard 1990 baseline, aiming for a 15 MtC/year emissions level. This is not Monbiot’s target of a 90% cut by 2030, but it’s still an ambitious choice, somewhat more aggressive than the official UK goal of 60% by 2050.
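The arithmetic behind the VIBAT target is simple enough to show directly, using only the figures quoted above:

```python
# The VIBAT 2030 target, from the figures quoted in the text.

baseline_1990 = 39   # MtC/year, UK domestic transport emissions, Kyoto baseline
reduction = 0.60     # Hickman and Banister's chosen cut by 2030

target = baseline_1990 * (1 - reduction)
print(f"2030 target: {target:.1f} MtC/year")  # ~15.6, rounded to 15 in the report
```

For comparison, the white paper’s 38 MtC/year projection amounts to a cut of barely 3% from the same baseline.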
In the early framing of the paper, the problems of air travel are abundantly clear: UK international air emissions are currently 8 MtC, and could rise to 20 MtC by 2030. I can’t imagine a scenario where it would be politically acceptable for air travel to be given a bigger slice of emissions than all domestic transportation. As the authors state, “Reducing carbon emissions from international air travel should be a priority for research and action.” In the report, they focus on domestic emissions alone, and leave air travel and international shipping outside their scope.
The authors came up with two scenarios for the policy climate in 2030.
I’ve recently finished reading George Monbiot’s Heat: How to Stop the Planet from Burning. It’s an interesting piece from a widely-syndicated journalist, and it left me both alarmed and unsatisfied. I’ve held back on writing about it until I had a chance to read up a bit more on the underlying science and discuss the topic with more knowledgeable types, but I think I’m ready to comment.
For the anonymous visitors to this page, let me preface this by saying that I have zero expertise in this topic; I have no climate science training, and this represents my first stab at investigating the area. I am interested in this subject largely in terms of its implications for transportation/land use policy: to decide suitable long-term policies, I need a reasonable idea of the likely long-term acceptable level of carbon emissions.
At its heart, I think the book must be understood in terms of its intended audience: the environmental movement. Monbiot’s book is a call-to-reason for the movement, a plea to avoid the “aesthetic fallacy” of pleasing but impractical solutions: think high-speed rail, biofuels, voluntary conservation, or exclusive reliance on solar/wind power. He also urges some revisiting of traditionally demonised technologies, such as nuclear power and carbon sequestration. On these terms, the book is a valuable shaking-up of traditional environmental viewpoints. After establishing a target for emissions cuts, he devotes most of his book to investigations of the feasibility of current technologies for achieving these targets. His discussion of air travel and “love miles” is particularly prescient. The context for his book is exclusively the United Kingdom, and the solutions he proposed cannot necessarily be directly transferred elsewhere.
In this posting, I want to focus more on Monbiot’s target. He based the book on a simple argument, originally published in a 2006 column he wrote for The Guardian, which you can read yourself.
A global temperature rise higher than 2°C would likely mean runaway positive feedbacks and unstoppable large amounts of warming.
If we want mean global temperature rises no higher than 2°C (with 70% probability), we need to aim for atmospheric concentrations of CO2 to stabilise at ~380ppm (or CO2-equivalent gases at 450ppm).
Stabilisation requires emissions (sources) to equal absorptions (sinks). Land-based ecosystems’ ability to absorb carbon drops at higher temperatures. This will reduce the total land+ocean sink from past levels (4 GtC/year) to 2.7 GtC/year. Note: 1 GtC means “Giga-tonne of carbon”.
Therefore, we need to cut annual global emissions by 60% to 2.7 GtC/year.
Furthermore, this cut needs to be completed by 2030.
Finally, if emissions are to be fairly distributed to individual nations on a per capita basis, then the UK needs to cut its emissions by 90%. Monbiot argues in favour of a carbon rationing system with an equal emission entitlement for each individual.
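The per-capita logic behind that 90% figure can be sketched in a few lines. The global budget (2.7 GtC/year) comes from the argument above; the population and UK per-capita figures below are my own rough assumptions, chosen only to illustrate the method, not Monbiot’s exact inputs:

```python
# Sketch of the per-capita rationing arithmetic behind the ~90% UK cut.
# global_budget_gtc is from the argument above; the other inputs are
# my own rough assumptions, for illustration only.

global_budget_gtc = 2.7    # GtC/year, the stabilisation level derived above
world_pop_2030 = 8.5e9     # assumed 2030 world population
uk_per_capita_tc = 2.5     # assumed current UK emissions, tonnes C/person/year

ration = global_budget_gtc * 1e9 / world_pop_2030   # tonnes C per person per year
cut = 1 - ration / uk_per_capita_tc                 # cut the UK needs to reach the ration

print(f"Per-capita ration: {ration:.2f} tC/year")
print(f"Implied UK cut:    {cut:.0%}")
```

With these rough inputs the equal-entitlement ration works out to about a third of a tonne of carbon per person per year, and a high-emitting country like the UK needs a cut in the neighbourhood of 90% to get there.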