I don’t want to alarm you, but earlier this summer, the North Pole melted. Here’s a picture taken from the North Pole Environmental Observatory of a buoy floating on water there. Floating, because the ice melted. You might have seen headlines about it. But you probably didn’t. There were a few reasons for that. I’ll explain later.
First, some history. Because although it wasn’t until 1975 that we really got the term “global warming” – US scientist Wallace Broecker used it in the title of a scientific paper – we’ve had a sense of the phenomenon the term describes for a lot longer. We’ve also been making social changes and political statements about related issues for years. Long before Thatcher’s 1989 speech to the UN, a 1965 US President’s Advisory Committee panel warned that the greenhouse effect was a matter of “real concern”. And we’ve been dealing with the challenges of pollution for much, much longer than that.
It was in 1824 that scientists first started talking about the earth’s “greenhouse effect” (or at least French physicist Joseph Fourier did). The existence of this thing we called an atmosphere which protected us from the sun was already established, and Fourier argued the composition of that atmosphere could change and lead to warming, similar to what one might experience sitting in a greenhouse on a hot day. In 1861, Irish physicist John Tyndall identified particular gases which might cause such an effect; he’s remembered today with a research centre named after him. By the end of the 19th century, with the industrial revolution in full swing across much of Western Europe, Swedish chemist Svante Arrhenius concluded the key gas involved was CO2.
In the 1930s, Guy Callendar – a professional steam engineer, but a keen meteorologist as a hobby – used weather stations’ records to show that temperatures had risen over the previous century. He also noted the increase in CO2 concentrations over the same period, suggesting a link. Although largely dismissed by meteorologists at the time, in hindsight scientists now see this work as fundamental. Fast forward to the 1950s, and people are gradually putting themselves back together after the war and worrying about the newer, Cold one. A lot of money was poured into geophysical work in this period, partly because scientific work was seen as a way people from different countries could peaceably work together. There was the International Geophysical Year in 1957–8, a significant post-war moment of scientific exchange between East and West. The Antarctic Treaty followed in 1959, declaring a special bit of the world as space for scientific research, keeping a resource war off it in the process, or at least delaying one. As Jacob Darwin Hamblin’s recent book outlines, there was also a fair bit of interest in environmental science from the US military (and not just when they thought about nuking the ice caps to raise sea levels, though that totally was a thing for a while, and the Soviets had some plans up their sleeves too).
New equipment, including early computers, allowed physicists to see further and in more detail. There was an increasing amount of interdisciplinary work. This wasn’t always easy, but it allowed geologists, chemists, mathematicians, physicists and more to share ideas, techniques and approaches, making what we now still slightly loosely consider the field of “Climate Science”. Using some of these speedy new computers, American physicist Gilbert Plass concluded that doubling CO2 concentrations would increase temperatures by 3–4°C, and oceanographer Roger Revelle, working with chemist Hans Suess, argued that the initial hope that seawater would absorb excess CO2 was probably unrealistic. Revelle wrote the slightly haunting line: “Human beings are now carrying out a large scale geophysical experiment.” That was 1957.
In 1958, Dr Charles D. Keeling began regular measurements of the concentration of carbon dioxide in the atmosphere, from an observatory in Hawaii. Today, his son Ralph works on the same research project; a nice example of the multi-generational science of environmental change. The full graph (above) shows the long sweep from 1958 – when it was around 317 parts per million – till today’s readings skirting around 400. That’s what graphs show: change. They are stories, in a way. If you want empirical evidence from further back, no one has a time machine to start a new monitoring station in 1650, but we can use natural time capsules. Drill deep into the Antarctic ice, look at the bubbles of gas trapped in ice laid down centuries ago, and see what the CO2 concentration in them was. There’s an ice core on display at the Science Museum if you want to see one.
And then there was the “hole” in the ozone layer. I love this bit of history most of all. I love it because they nearly didn’t find it. As Joe Farman – co-discoverer of the hole – told the British Library’s Oral History of British Science project, the British Antarctic Survey gradually started to get unexpected observations of ozone in the early 1980s and at first assumed the instruments were faulty. Even when they did realise there was something up, it was hard to connect that data with the chemists and other analysts who could work out what it meant. Even when they did put it all together and sent off the now iconic 1985 paper – on Christmas Eve – the head of Farman’s division tried to suppress the work, writing to the Met Office to say it shouldn’t be published in case it was wrong and caused embarrassment. I also love this story because it’s not quite as simple as a ‘hole’, or at least (again according to Farman) no one ever really owned up to coining the phrase. The idea of a hole in the ozone layer – as opposed to something more complex – seems to have emerged somewhere between a NASA press office and a journalist at the Washington Post looking for a way to explain what was going on.
I also love the story of the “hole” in the ozone layer because we actually did something about it: the Montreal Protocol. 50 years ago CFCs were seen as miracle chemicals, safe at least in terms of immediate use. Then we found the hole in the ozone layer, and realised it was CFCs which had been causing it. So we banned them. As a world we came together and said nah, this is really dumb, let’s find other ways to make fridges and hairspray. We’ve made steps towards this on climate change, but only steps. In 1972, there was the first UN environment conference. It might have been more interested in whaling than climate, but it was a start, and the United Nations Environment Programme was set up. In 1988 the Intergovernmental Panel on Climate Change (IPCC) was formed to collate and assess evidence on climate change. In 1992, at the Earth Summit in Rio de Janeiro, governments agreed the United Nations Framework Convention on Climate Change, aiming to stabilise greenhouse gas concentrations at a level that’d prevent dangerous warming. And in 1997 the Kyoto Protocol was agreed. OK, so the US Senate immediately declared it would not ratify the treaty, and we’re nowhere near the vision of Rio 1992 (read Monbiot’s eulogy and weep). But we put something in place. We have done things before; we can do things again, and build on what we have, if we let ourselves see it.
Running alongside the story of discovering climate change is a history of humans causing it. In some ways, it is only recently that we’ve really started using fossil fuels. Recent, certainly, in terms of how long they’ve been around, but recent in human terms too. The industrial revolution of the late 18th century onwards marks the point where our uptake of coal, and then oil and gas, really took off.
Our current energy system wasn’t inevitable. Petroleum was once used to paint canoes. It was later sold in small quantities for medical purposes (for this, and more, see Yergin’s The Prize). It took a while before anyone thought of burning it, though whale oil had been used for lamps before that. For example, did you know Shell are called Shell because they started off selling seashells? Back in the 1830s, Marcus Samuel, an East London antique dealer, realised there was a market for using shells in interior design. They were so popular he had to import them from abroad, laying the foundations for an international import/export business which – via trading of rice, silk, china, copperware, sugar, flour and wheat – was, by the 1890s, transporting oil.
The history of coal can be especially useful in telling us something of how we’ve reacted to problems with fossil fuels before. As Barbara Freese’s book Coal: A Human History describes, as London grew it craved energy, and soon sucked up the surrounding forests for fuel and space. Still hungry, it found coal and shipped that into the capital instead. Except coal is poison. You don’t need modern chemistry and ideas about climate change to see that. The smoke is just nasty. By the 1660s, London was not a pleasant place to be. The rich still used wood, and would leave the city when it got smoggy, but the poor coughed on. Or died. Then we started building houses with chimneys, and the smoke went up, so people could use coal without such immediate pollution in their homes. It still polluted the city, but that was less immediate. Because this was more of a collective problem of combined emissions, it needed a collective answer. This came in the form of the 1956 Clean Air Act, which made parts of the country ‘smoke free’ – a response to the great smog of 1952, which had people falling over in the street.
Next time you’re on the top deck of a bus or train going through the older bits of a town, with all their chimneys, think about what those chimneys did for people, and how they’re now a bit redundant. They are artefacts of other energy ages. As Maggie Koerth-Baker puts it in her review of Alexis Madrigal’s history of American green tech: “Energy isn’t just what it is. Energy is what we have decided we want it to be”. We could probably be a bit more proactive about this kind of decision making.
Today, these grubby, smelly, powerful fuels are increasingly discussed in the almost immaterial terms of economic abstractions. In 2006 the Stern Review concluded climate change could damage global GDP by up to 20% if left unchecked, and this financial framing has been a powerful one. There has been some sense that carbon emissions dropped because of the recession, but more recent research argues that the long-standing consequences of the recession will be quite bad for the environment, as politicians make more short-term decisions. Activists have also thought about the economy as a way to campaign, cutting off the financial lifeblood of the fossil fuel industry. Much of this recent work has been inspired by the idea of “unburnable carbon”. In the 1990s we’d often talk about running out of gas and oil, compared to the “renewables” of solar or wind. But the problem has flipped: we can’t safely burn even the fossil fuels we’ve already found, and yet we keep finding more. This is why fracking is such a big deal, and drilling in the Arctic – it would let us access fuel we couldn’t get to before. “Keep it in the ground” has become a mantra of environmentalism, with activists, following Bill McKibben’s lead, looking to financial systems as a way to leave these fuels “stranded“.
I gave a talk about all of this at Demand the Impossible earlier in the summer. At the end, one of the participants asked when I thought the point would come when climate change would be so bad that politicians would start to take notice and do something. But the thing is, that point happened long ago. We just don’t notice. Or if we do, we somehow manage to dismiss it. Maybe we’re all “deniers”.
I’ve occasionally heard climate activists get a bit carried away and talk wistfully about how maybe, just maybe, there’ll be a really cold winter here during a shocking heatwave in Australia, and then a hurricane will hit a rich city in the US along with some devastating floods elsewhere, and this will mark a tipping point for actual action on climate. I hate this. It sounds like they are wishing suffering on people. Also, we can’t wait. Climate change isn’t an instantaneous process. We’re already likely looking at quite horrific consequences of the warming we’ve set in motion with the carbon we’ve already pumped out (see the recent World Bank report for one version of this). What we did in the past has already made the future worse. What we do now can only limit that damage. By the time there is that “event”, it will be way, way too late.
Instead of waiting for the environment to enact some terrible moment to galvanise us into action we need to construct a shock of our own, a social shock where we collectively take action. We are already doing this, but we need to do more. The people need to rise before the seas do.
But back to the North Pole. And that buoy floating on a lake where you might expect to see ice. Why wasn’t that on every front page? One answer is that the media doesn’t like to talk about climate change much. That’s undoubtedly part of it. But it’s also because it’s not really news. The ice does this every year. It’s summer. Ice melts. In a way this is normal, or at least it’s not a sudden big deal (and according to The Atlantic, this particular picture was taken a bit south of the actual pole anyway). If you look closely at the data over time, you can see the ice is melting more than it was and is likely to melt more still. The “balding Arctic” crept up on us. It won’t be some big iconic moment that inspires us, but simply more of the incremental action, observation, analysis, imagination and debate by scientists, engineers, politicians, writers and activists. Isn’t that enough?
Alice Bell is a co-editor at New Left Project.