
Posts Tagged ‘climate change’

Ahead of the upcoming IPCC report on the global climate, the news agenda seems to have been largely dominated by stories asking why global warming has paused for the last 15 years (see the BBC, the BBC again, the torygraph and the NZ Herald, among countless other examples).

A substantial part of this seems to be the repetition of familiar claims that 1998 was the hottest year on record globally, and that if climate scientists were right we should surely have seen a hotter year at some point during the past 15 years. Hence, the argument runs, climate change has paused, the models and data suggesting that human fossil fuel emissions were to blame for late 20th century warming were wrong, and consequently any argument for restricting emissions in future is null and void.

Which of course ought to lead to the question: who says that 1998 was the hottest year on record? The answer to this is somewhat complicated, but also somewhat revealing. It ain't NASA, who run GISStemp (the Goddard Institute for Space Studies Surface Temperature Analysis) and have 2010 as the hottest year on record followed by 2005, with 9 of the 10 hottest years occurring after the year 2000 (1998 being the only pre-2000 year in that list). It also isn't NOAA (the US National Oceanic and Atmospheric Administration), who compile a global temperature record at the National Climatic Data Center (NCDC), whose data again places 2010 as the hottest year on record, followed by 2005, with 1998 in third, and 9 of the hottest 10 years on record occurring after the year 2000 (i.e. after global warming had allegedly paused). Which leaves HadCRU, the record compiled by the UK Met Office's Hadley Centre and the University of East Anglia's Climatic Research Unit. The CRU is of course the unit at the centre of the Climategate faux controversy, in which sceptics hacked emails and published excerpts from private correspondence out of context, claiming fraud and data manipulation and generating global headlines; numerous independent investigations subsequently found no evidence of wrongdoing. The latest version of this temperature series is HadCRUT4v, which again shows that 2010 was the hottest year on record, followed by 2005, followed by 1998.

So where does the claim that 1998 was the hottest year come from? Well, HadCRUT4v is the latest and most accurate temperature record maintained by the Met Office and CRU (for a detailed explanation of what's changed look here). If we ignore that and instead use the previous version, HadCRUT3v, then, and only then, does 1998 appear to be the warmest year on record. So why did this old record suggest a different year to the NASA and NCDC records (and indeed the latest version of the CRU record)? The main reason was the different methods used to generate global temperatures. None of these institutions is able to measure the temperature in every place in the world; they use stations in various locations, and the places with the fewest stations tend to be the polar regions (where there also tend to be the fewest people). And one of the things we know quite well is that the Arctic has been the fastest warming region on the planet. Whereas GISStemp interpolates values between measured locations in the Arctic, HadCRUT3v left them blank as unknown, which introduced a cold bias into the dataset compared with the others, and which explains why it has been replaced by a dataset that features a greater number of stations and correlates much more strongly with the other datasets.
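To make the coverage issue concrete, here's a toy calculation (with made-up anomaly numbers, purely for illustration) showing how simply leaving the fastest-warming region out of an area-weighted average biases the result cold:

```python
import numpy as np

# Toy latitude-band temperature anomalies (deg C) -- illustrative only:
# the Arctic band is given the largest anomaly, as observed in reality.
anomalies = np.array([0.9, 0.4, 0.3, 0.3, 0.4])    # Arctic, N mid-lat, tropics, S mid-lat, Antarctic
weights   = np.array([0.05, 0.25, 0.40, 0.25, 0.05])  # fraction of global surface area per band

global_mean = np.average(anomalies, weights=weights)

# Leaving the Arctic band blank renormalises the average over the
# remaining area, omitting the fastest-warming region entirely.
mask = np.array([False, True, True, True, True])
mean_without_arctic = np.average(anomalies[mask], weights=weights[mask])

print(f"with Arctic: {global_mean:.3f} C, without Arctic: {mean_without_arctic:.3f} C")
# The no-Arctic figure comes out lower: a cold bias from missing coverage.
```

Nothing about the underlying warming differs between the two numbers; only the coverage does.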

So the ‘pause’ in climate change is something that only exists if you look exclusively at a now obsolete dataset, known to be biased, generated by a group whom those using the data have previously denounced as frauds. And if you decide to ignore that 1998 was in any case a super El Niño year, which had a dramatic short-term effect on global weather – hence the other 9 of the 10 hottest years on record all occurring since the year 2000. If you used 1997 or 1999 as a start date there wouldn't appear to be any pause in any dataset (outdated or otherwise), but cherry-picking the year when specific short-term conditions made things abnormally hot, added to cherry-picking a now obsolete dataset, allows sceptics to make the 'global warming has paused' argument (see this excellent Skeptical Science post for details on cherry-picking).
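You can see the start-date cherry-pick at work with a few lines of synthetic data (the trend and spike values are assumed for illustration, not a real temperature series):

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1990, 2014)
# Synthetic anomalies: steady 0.02 C/yr warming, small noise,
# plus a one-off +0.25 C El Nino-style spike in 1998.
temps = 0.02 * (years - 1990) + rng.normal(0, 0.05, years.size)
temps[years == 1998] += 0.25

for start in (1997, 1998, 1999):
    sel = years >= start
    slope = np.polyfit(years[sel], temps[sel], 1)[0]   # linear trend
    print(f"trend from {start}: {slope * 10:+.3f} C/decade")
# Starting the window at the spike year yields a noticeably flatter
# trend than starting one year earlier or later.
```

Shift the window one year either side of the spike and the 'pause' evaporates.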

So why are so many mainstream media outlets focussing on this as the main story in the lead-up to the IPCC report? Probably because it's a more sensationalist and conflict-driven story than one which reads 'science has been slowly progressing, turning 90% confidence in predictions in 2007 into 95% confidence by 2013', allied with a big PR drive from a number of the main players in the climate denial industry.

 


On Friday I was in London for an unconference hosted by Furtherfields.org at the University of Westminster, which approached the subject of re-rooting digital culture from an ecological perspective. Here's the brief for the event:

Over the last decade the awareness of anthropogenic climate change has emerged in parallel with global digital communication networks. In the context of environmental and economic collapse people around the world are seeking alternative visions of prosperity and sustainable ways of living.

While the legacy of the carbon fuelled Industrial Revolution plays itself out, we find ourselves grappling with questions about the future implications of fast-evolving global digital infrastructure. By their very nature the new tools, networks and behaviours of productivity, exchange and cooperation between humans and machines grow and develop at an accelerated rate.

The ideas for this transdisciplinary panel have grown out of Furtherfield’s Media Art Ecologies programme and will explore the impact of digital culture on climate change, developing themes adopted in grass-roots, emerging and established practices in art, design and science.

One thing which left me somewhat confused was why the event was billed as an unconference, when in reality it was a fairly straightforward event with three speakers and a short Q+A afterwards. Listening to three presentations (with accompanying powerpoints and prezis) and then having the chance to ask a few questions at the end is not a participant-driven meeting; it's the same format you find on any conventional conference panel.

The first speaker was Michel Bauwens, founder of the Foundation for Peer to Peer Alternatives. Bauwens began by describing the central problems of the contemporary socio-economic system with regards to sustainability and equity. The first problem he outlined was that of pseudo-abundance: the aim of achieving infinite economic growth on a planet with finite resources, and the externalisation of ecological costs from our limited understanding of economics. The second problem he delineated was that of artificial scarcity: the ways in which intellectual property is enforced via patents and copyrights, creating scarcity around assets whose cost of reproduction often approaches zero with digital networked technology. This, Bauwens argued, stifles innovation and prevents solutions to ecological crises from being developed as commonwealth, outside of a profit-driven market framework. The final problem Bauwens diagnosed was that of social justice, as exemplified by the cavernous (and growing) divide between rich and poor on a global level.

Bauwens' suggested solution to these problems is primarily commons-based peer production, in which individuals voluntarily self-aggregate into distributed networks coordinated through networked telecommunications. While Bauwens presents this as an entirely new phenomenon, afforded by the massive increase in computational power and networked connectivity associated with the information revolution, it is worth mentioning that voluntary self-aggregation and democratised, decentralised ownership of projects have long been foundational concepts of anarcho-syndicalist thought. What does appear to be different about P2P networks in the contemporary context is their ability to connect peers outside of a localised context through digital telecommunications networks, and for projects to be scaled up accordingly in size and scope. These affordances have the potential to enable commons-based peer production to out-compete market-based initiatives in many circumstances; however, what is potentially of greater significance than the efficiency gains P2P networks can provide is the alternative set of values they tend to embody.

Bauwens used numerous forms of consumer electronics as instantiations of planned obsolescence, whereby the company making the product has a financial incentive to create a product with a highly limited shelf life and a non-modular design, so that the failure of individual components leads users to replace the entire device. While the manufacturer profits each time this cycle repeats and new items are bought, the ecological costs increase; however, these are externalised from the market transaction. By contrast, the open design methodology is based around values whereby the user/designer (the term 'prouser' was suggested) wants their device to be as durable and long-lasting as possible, with a modular design which enables them to easily replace any parts that are damaged over time. Consequently the argument Bauwens promoted was that the values of the open design movement present an ethical alternative to market production, whereby ecological sustainability and social justice can be built into the production process itself.

Bauwens argued that this was not mere utopianism but was based on a material analysis of the salient features of contemporary capitalism, which he argued already needs commons-based peer production in order to remain profitable.

The second speaker was Catherine Bottrill of Julie's Bicycle, an organisation which works with arts 'businesses' to reduce their carbon footprint. While I'm sure the organisation does good work, the scheduling seemed somewhat odd. Following a talk about the problems of contemporary capitalism and the necessity of replacing it with a system of alternative ethical values created via grassroots, decentralised P2P networks, we had a talk which seemed to imply that if the major record labels reduced their carbon footprint slightly and their star acts planned their world tours slightly differently there would be no ecological crisis.

It was problematic that Bottrill didn't address any of the concerns or solutions Bauwens had just raised, and one slide in particular caused (presumably) inadvertent entertainment with her diagnosis of contemporary challenges to society. First came recession, second came the Middle East crisis, followed by cuts to Arts Council funding. I'm not sure what came next because I was laughing too hard. On a more serious note though, for a group of uniformly white middle-class people at a posh London university to listen to someone rank arts funding cuts as a major social problem above the other aspects of the government's austerity programme – cuts to disability benefits, cuts to welfare, cuts to education, the privatisation of the NHS, etc. – was somewhat depressing.

The final presentation was from Ruth Catlow of Furtherfields.org, on ecological approaches to networks, tools and digital art. Catlow began with a delineation of network topology, referring to Paul Baran's 1964 RAND Corporation diagram of centralised, decentralised and distributed network structures.

Catlow argued that while mass media networks resemble the centralised structure on the left, the Internet is a mixture of the decentralised (via the cables and gateways that make up the material apparatus of connectivity) and the distributed (as each computer functions as a node in a distributed network). While this is how the Internet, and its potential for creating a democratic media system, has traditionally been trumpeted for over two decades now, this analysis misses a crucial part of the picture. Recent research into the structure and connectivity of complex networks such as the World Wide Web (which is the most common encounter people have with the Internet) reveals that far from being a distributed system in which all nodes are equal and every blogger is a pamphleteer, these networks follow a power law, with a few preferentially attached 'superstars' such as Google, Facebook, Twitter and Amazon, while the vast majority of content resides in the 'long tail' where it receives scant attention.

Systems as diverse as genetic networks or the World Wide Web are best described as networks with complex topology. A common property of many large networks is that the vertex connectivities follow a scale-free power-law distribution. This feature was found to be a consequence of two generic mechanisms: (i) networks expand continuously by the addition of new vertices, and (ii) new vertices attach preferentially to sites that are already well connected. A model based on these two ingredients reproduces the observed stationary scale-free distributions, which indicates that the development of large networks is governed by robust self-organizing phenomena that go beyond the particulars of the individual systems.

Barabási and Albert, 1999

When we talk about network topology we need to engage with these findings, because while the power law functions as an attractor which partially determines the distribution of the network, some research suggests that this is not a fixed and final determinism, and that there may be methods or tactics which communities can use to make these networks more equitable. For me, that is the interesting discussion to be having about network topologies now, not merely a recapitulation of the earliest models.
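For anyone who wants to see how little machinery those findings require, here's a minimal sketch of the two mechanisms Barabási and Albert describe – continuous growth plus preferential attachment – written as my own illustration, not code from the paper:

```python
import random
from collections import Counter

def preferential_attachment(n_nodes, m=2):
    """Grow a network: each new vertex attaches m edges to existing
    vertices chosen with probability proportional to their degree."""
    degrees = Counter({0: 1, 1: 1})   # start from a single edge 0-1
    pool = [0, 1]                     # each vertex appears once per unit of degree
    for new in range(2, n_nodes):
        targets = set()
        while len(targets) < m:
            targets.add(random.choice(pool))   # degree-weighted pick
        for t in targets:
            degrees[t] += 1
            degrees[new] += 1
            pool += [t, new]          # update the degree-weighted pool
    return degrees

degrees = preferential_attachment(10_000)
print("top hubs:", degrees.most_common(5))
print("median degree:", sorted(degrees.values())[len(degrees) // 2])
```

Typically a handful of early, well-connected vertices end up as hubs with degrees in the hundreds while the median vertex keeps only a few links – exactly the superstar/long-tail shape described above.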

Following this, Catlow went on to detail a number of projects which Furtherfields have been involved in, including the Zero Dollar Laptop Project, an innovative way of both mitigating the ecological cost of contemporary computing hardware and providing social benefits to disadvantaged groups; We Wont Fly For Art, a project designed to mitigate the carbon emissions created by artists; and the Feral Trade Cafe, a project by Kate Rich which establishes social networks for trading goods ethically.

Overall the event was worth attending, Bauwens’ talk in particular being a highlight.


While being ill over the last week or so, I've spent some time playing the beta version of Fate of the World, a forthcoming PC game based around climate change which has generated quite a lot of interest in the mainstream media (see here and here).

The game essentially puts you in command of a global organisation whose mission is to prevent catastrophic climate change (defined in the beta mission as 3 degrees of warming), maintain a human development index above 0.5, and keep people happy enough that no more than a certain number of geographical regions kick you out before the year 2120.

To achieve these goals you have a variety of tools at your disposal: projects, which include boosting renewable energy, deploying CCS technology and subsidising electric cars; environmental and social adaptation/mitigation measures, such as drought prevention, healthcare programmes and improving regional water infrastructure; policy measures, such as banning oil from tar sands or deploying algae-based biofuels; and political measures, such as deploying peacekeepers to troubled regions and black ops (including covert sterilisation programmes?!).

From a game studies perspective the game is interesting as it provides users with a complex simulation in which numerous interdependent factors have to be dynamically balanced, in a way that goes far beyond the usual kill-or-be-killed binary prevalent in most computer games. While there are of course alternatives, particularly in the realm of sandbox simulation games from SimEarth to Civilisation 5, Fate of the World is interesting insofar as it uses data taken from climate models to simulate not a fictional alternative world but possible futures of this planet, presuming that the current data from climate models are broadly accurate. As such, by experimenting with different variables users can glean a different kind of insight into the challenges posed by Anthropogenic Climate Change than they would get from engaging with traditional forms of media, such as watching a documentary or reading the scientific literature. By being able to manipulate how regions react through play, users get a different kind of experience, one driven by feedback, configuration and systemic thinking rather than narrative, affect or rhetoric. While such models will always be highly reductive simplifications of real-world complexity, they could provide a useful way of approaching some of the complex social and environmental issues currently facing us; indeed this kind of argument has been powerfully advanced by game studies scholars such as Stuart Moulthrop, who argue that when dealing with complexity, configurational thinking is likely to give users a better understanding of an area than linear, narrative-based approaches.

One criticism I have of the game in its current state is that the feedback which reveals how a user's interventions are affecting the relevant systems is often obscured by its placement two menus deep, and I suspect that many players will struggle to find the data which actually spells out what the consequences of their actions have been. Without this crucial information actions can appear opaque, and indeed this criticism has been made on gaming forums discussing the beta. Hopefully this will be addressed before the game is released, as if players don't understand what effects their decisions have had, then the game isn't achieving its goals.

One aspect of the game which I found highly intriguing is the disparity between the aims the game sets for users and the claims and actions of really existing nation states and supranational institutions. The beta mission sets success as avoiding a rise of 3 or more degrees over pre-industrial temperatures by 2120, which is below the midpoint of the IPCC projections of 1.5-6 degrees of warming this century (dependent on a range of factors, but primarily human measures), but considerably higher than the 2 degree rise which nation states couldn't agree upon at the COP15 conference in Copenhagen last year. The reason states couldn't agree upon that figure wasn't just the complete lack of concrete measures designed to practically bring about that change, but that a large number of nations, primarily the 131 countries represented by the G77 group, declared that a 2 degree temperature rise was too high. Earlier this year those nations convened in Bolivia at the World People's Conference on Climate Change and the Rights of Mother Earth, where they drafted a people's agreement which stated:

If global warming increases by more than 2 degrees Celsius, a situation that the “Copenhagen Accord” could lead to, there is a 50% probability that the damages caused to our Mother Earth will be completely irreversible. Between 20% and 30% of species would be in danger of disappearing. Large extensions of forest would be affected, droughts and floods would affect different regions of the planet, deserts would expand, and the melting of the polar ice caps and the glaciers in the Andes and Himalayas would worsen. Many island states would disappear, and Africa would suffer an increase in temperature of more than 3 degrees Celsius. Likewise, the production of food would diminish in the world, causing catastrophic impact on the survival of inhabitants from vast regions in the planet, and the number of people in the world suffering from hunger would increase dramatically, a figure that already exceeds 1.02 billion people. The corporations and governments of the so-called “developed” countries, in complicity with a segment of the scientific community, have led us to discuss climate change as a problem limited to the rise in temperature without questioning the cause, which is the capitalist system…

Our vision is based on the principle of historical common but differentiated responsibilities, to demand the developed countries to commit with quantifiable goals of emission reduction that will allow to return the concentrations of greenhouse gases to 300 ppm, therefore the increase in the average world temperature to a maximum of one degree Celsius.

With atmospheric CO2 concentrations currently at 387ppm, and even the most ambitious campaigners in the developed world calling for a reduction to 350ppm, the aims set out at the World People's Conference appear laudable but completely unrealistic. Indeed, the goal loosely set out at COP15 of limiting warming to no more than 2 degrees, with no mechanisms to achieve it, has been widely criticised by groups such as the International Institute for Environment and Development:

The Accord is weak. It is not binding and has no targets for reducing greenhouse gas emissions (countries that signed it have until 31 January to list their voluntary actions in its appendix). The low level of ambition will make preventing dangerous climate change increasingly difficult. What countries have so far proposed will commit us to a 3 to 3.5-degree temperature increase, and that is just the global average.

In some ways this may be the most useful role the game plays: highlighting the distance between the rhetoric of political and business leaders who are currently seeking to greenwash the issue, presenting their inaction as somehow constituting a viable solution to the problems posed by ACC. Despite taking concerted action throughout the game, it is hard to keep warming under 3 degrees without society collapsing, whether through a lack of mitigation and adaptation measures, widespread war and civil unrest, or widespread poverty and famine in the face of increasingly severe climate-related disasters as the next 110 years unfold. In some ways this isn't much fun; being told that your actions are resulting in millions starving and armed conflict doesn't spread warmth and joy, but it does give some indication of how hard things are likely to get as time passes.

One thing that becomes abundantly clear from the game is that the sooner action is taken to dramatically curb CO2 emissions (particularly in wealthy nations where emissions per capita are far higher), the less severe the consequences will be further down the line. This is a lesson we would do well to heed.


So the other day I wrote about why I don’t think the 10:10 campaign in general works, and why Richard Curtis’s promotional film for it was destined to be a spectacular own goal which offended people and put them off environmentalism.

And today the Guardian has a piece describing what’s happened since…

The charities that backed a Richard Curtis film for the 10:10 environmental campaign said today that they were “absolutely appalled” when they saw the director’s four-minute short, which was withdrawn from circulation amid a storm of protest.

The charity ActionAid, which co-ordinates the 10:10 schools programme, today welcomed the move. “Our job is to encourage proactive decisions at class level to reduce carbon emissions. We did it because evidence shows children are deeply concerned about climate change and because we see the impacts of it in the developing world where a lot of our work is. So we think the 10:10 campaign is very important, but the moment this film was seen it was clear it was inappropriate.”

The questions we ought to be asking now are: how did the 10:10 team ever think that a promotional film featuring authority figures such as a teacher and an office manager blowing up children and workers who don't sign up to their campaign was a good idea, and how much money and carbon were wasted on their celebrity-packed own goal?


The 10:10 campaign championed by the likes of the Guardian has got a lot of airtime recently, and today there’s a new article promoting their new campaign video on the Guardian website.

I'm not suggesting that the campaign won't have achieved anything good (after all, any emissions reduction isn't a bad thing), but the whole ethos of the campaign is something I find deeply problematic. The central idea of course is that individuals, businesses and governments cut their carbon emissions by 10% over the course of 2010. Which on the surface sounds like a good thing: surely cutting carbon emissions across the board is a good thing?

Well yes. Kind of. But what 10:10 doesn't address is the massive inequality of emissions between different individuals and businesses. For someone who takes six flights a year, drives 15k miles in a 25mpg 4×4, has a massive uninsulated house lit with incandescent bulbs, eats factory-farmed imported meat twice a day and has an otherwise high level of consumption, cutting 10% of their emissions is pretty straightforward. Take a couple of flights fewer, drive 13.5k miles in the 4×4 and get some energy-saving light bulbs or insulation. That individual still has a gargantuan carbon footprint, but it's 10% smaller than it was before. Are they now sustainable, and have they 'done their bit' towards 'saving the world' (as the nauseating Guardian article describes the campaign)? Of course not.

Now compare that to someone who has been eco-conscious for a number of years, who doesn't fly, doesn't drive, buys local produce, pays extra for renewably generated electricity and consequently already has a carbon footprint way below the national average, and may well be living in a sustainable way. What are they meant to do to cut 10% of their emissions? Live without a fridge? Leave the heating off for the first half of winter? Only ever shower in cold water?

So you see the problem: 10:10 is trivial to achieve for the heaviest polluters and extremely hard for people who have actually made an effort to live sustainably, because it expects both groups to make the same percentage change. It's the same trick that bourgeois environmentalists like Richard Heinberg have persuasively argued in favour of: a universal percentage reduction, which means that those who have done the most damage make a trivial gesture towards sustainability while those who aren't really part of the problem have to make the same percentage cut. This has nothing to do with bringing emissions down to a sustainable level. The way to do that is to agree upon what that level should be and then to get people to work towards it, with a cap-and-dividend system so that those who live unsustainably compensate those who do. 10% of totally unsustainable is still totally unsustainable. Accusing people who are already living sustainably of destroying the world because they aren't going to make further cuts, while most of those around them have emissions 3-5 times higher, is just stupid.
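The arithmetic behind this objection fits in a few lines. The footprints below are assumed, illustrative figures, and the ~2 tonne per person 'sustainable' level is a commonly cited ballpark target rather than anything from the campaign:

```python
# Illustrative annual footprints in tonnes CO2e -- assumed numbers,
# not measured data.
heavy_emitter = 30.0
light_emitter = 4.0
sustainable_target = 2.0   # commonly cited per-person ballpark

for name, footprint in [("heavy", heavy_emitter), ("light", light_emitter)]:
    after_cut = footprint * 0.9   # the uniform 10% rule
    print(f"{name}: {footprint:.1f} -> {after_cut:.1f} t "
          f"(cut {footprint * 0.1:.1f} t, "
          f"still {after_cut - sustainable_target:+.1f} t over target)")
# The 10% rule removes 3.0 t from the heavy emitter yet leaves them 25 t
# over a sustainable level, while demanding a further 0.4 t from someone
# already close to it.
```

Same percentage, wildly different absolute cuts, and neither outcome has anything to do with reaching a sustainable level.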

And of course all of that focusses purely on the level of the individual. While governments are still happy to support costly fossil fuel extraction schemes such as deepwater oil field exploration and tar sands development, while failing to adequately support renewable energy generation (Vestas being a case in point), and while they fail to come to any kind of international agreement to supplant the soon-to-expire Kyoto Protocol, the actions of individuals are rendered entirely insignificant. How unsurprising then that free marketeers love the idea of 10:10: not only does it mean that the heaviest individual polluters have to take trivial action, but it also means that middle-class liberals can feel good about having 'saved the world' without the need for any kind of national or international regulation.

And so now the campaign has a new promotional film, in which people who don't pledge support to the campaign are blown up by figures of authority such as a schoolteacher or boss. Given that many denialist arguments centre on the alleged coercive, centralised authority of the warmist movement, this video is not very likely to win anyone over. On the Guardian page commenters have remarked that it's humorous to suggest executing anyone who doesn't agree with your position. I'm not really seeing the joke.

The video comes across as patronising and highly unfunny. It will undoubtedly offend and alienate people. The campaign itself has achieved something in terms of emissions reductions, but an optional 10% cut in carbon emissions has nothing to do with the ridiculous notion of 'saving the planet', or even the more sensible notion of avoiding some of the worst of the predicted effects of anthropogenic climate change. Really it's little more than a way of assuaging middle-class guilt at the lack of meaningful action over climate change.


IPCC Errors

There’s a really good piece over on the Nature website about the recommendations made by an independent assessment of the way in which the IPCC currently functions. (hat tip Realclimate)

It's worth reading because a lot of the coverage of the report has been over-the-top sensationalist nonsense about how the IPCC needs drastic reform as a result of serious errors. When these mainstream and sceptic pieces say errors, they all point to one example: the erroneous claim that Himalayan glaciers could disappear by 2035. One error, which was not in Working Group 1 (which covers the science of ACC) but appeared in WG2 (impacts, adaptation and vulnerability), does not amount to a series of errors.

While right-wing media outlets and sceptic blogs spent a lot of time propagating the idea that there were other, similar errors in the IPCC 2007 report, such as the Amazongate saga, whose main claims the Times has now retracted (the Times version is, however, now behind the Murdoch paywall), the truth is that the science behind the IPCC 2007 report has stood up pretty well to the intense scrutiny it has come under. On a related note, the IPCC chairman Rajendra Pachauri was subjected to an entirely baseless series of accusations claiming that he was making a fortune from his links with carbon trading companies. According to climate sceptics and the right-wing press he was profiteering from his position as head of the IPCC. An independent report into his financial dealings has found all of those claims to be completely and utterly false. Still, as George Monbiot demonstrates, the accusations continue to be made by right-wing media outlets.

In fact the major story about the IPCC getting something wrong concerns sea level rise. The IPCC admitted in their assessment that they weren't taking all factors into account, as some were too uncertain, but empirically observed sea level rise is running well above the IPCC's uppermost prediction. This prediction wasn't made using grey literature; it was supposedly based on solid science, and yet it turns out to be totally wrong. So why has there not been widespread outcry in the media about this erroneous prediction? Why are there not legions of sceptical bloggers using this as evidence that the IPCC reports are equal parts guesswork and science, and that their conclusions should be viewed with a large dose of scepticism?

Because the message we get from the IPCC's most serious error is that their estimates err on the side of conservatism and caution rather than the alarmism they are routinely accused of. Which is really no great surprise given the consensus-based framework they operate within. But if the IPCC's predictions are too conservative, then the entire sceptic argument goes up in smoke, so very little is heard about the most serious error the IPCC have made. This is somewhat telling, as it suggests that those criticising the IPCC have little interest in improving its procedures and predictions. More often than not it turns out to be the same few voices attempting to sell generalised doubt about climate change in order to prevent any kind of legislative action being taken to mitigate the outcomes the science predicts.


The US National Oceanic and Atmospheric Administration has recently released a report entitled State of the Climate 2009, which is downloadable in its entirety as a PDF from here, with a web page summarising the report available online here.

The findings of this report, which involved over 300 scientists from 48 countries, further corroborate the evidence that 'the past decade was the warmest on record and that the Earth has been growing warmer over the last 50 years.' What is interesting about the report is that it uses a methodology well suited to examining a broad range of climatic indicators, building a fairly comprehensive overview of numerous ecological systems and so mapping the changes these systems are currently undergoing.

Based on comprehensive data from multiple sources, the report defines 10 measurable planet-wide features used to gauge global temperature changes. The relative movement of each of these indicators proves consistent with a warming world. Seven indicators are rising: air temperature over land, sea-surface temperature, air temperature over oceans, sea level, ocean heat, humidity and tropospheric temperature in the “active-weather” layer of the atmosphere closest to the Earth’s surface. Three indicators are declining: Arctic sea ice, glaciers and spring snow cover in the Northern hemisphere.

By utilising a broad range of indicators, and finding such broad agreement between them, the NOAA report provides a good example of the way in which the science behind anthropogenic climate change is based on a vast array of observations, all consistent with the physics of the greenhouse effect, which provides a causal mechanism for the changes observed in these global climatic systems.

As John Cook from SkepticalScience points out in an article for the Guardian, this kind of rigorous scientific research, which seeks to give a good overview of global systems by combining and comparing the observed data for numerous global systems, is methodologically diametrically opposed to the kind of material promoted by most climate change sceptics, who seek to prevent meaningful action being taken to reduce the severity of ACC by entirely ignoring the bigger picture – the enormous breadth of data from global systems – and instead focusing on minute details, such as the choice of proxies used in decade-old papers on paleoclimatic reconstructions (the repeatedly vindicated Hockey Stick paper published by Mann, Bradley and Hughes), or the choice of wording in private emails between climate researchers (such as Phil Jones' phrase 'hide the decline' to describe the divergence between one particular set of trees used as a climate proxy and the observed temperature record, a divergence which was covered in the published literature on that proxy set).

As the NOAA report highlights, when you look at the big picture rather than concentrating on minute details, the evidence is that the planet is heating up, that human activity is largely the cause, and that the medium- to long-term ramifications of these changes will make life far harder for hundreds of millions of people, as well as causing the extinction of innumerable other species less capable of adapting to a changing climate. As time goes by and more research is done, that evidence is only getting stronger, with more and more datasets emerging which confirm the findings of the IPCC. However, despite this mountain of evidence, the political action that would meaningfully mitigate the worst of the potential consequences is still a long way off, with the US Senate's decision not to even try to get a massively compromised bill through following in the wake of the failure of the world's political leaders to reach any deal at COP15 in Copenhagen last December to succeed the emissions cuts of developed nations agreed under the Kyoto Protocol. With the US seemingly steadfast in its refusal to make any kind of cuts to its emissions, it seems unlikely that other developed nations are going to volunteer further reductions of their own, no matter what the science says, as politicians are too fearful for their own careers, which largely depend on short-term economic success rather than longer-term sustainability.


Yesterday I wrote about the Penn State University investigation into Michael Mann's research conduct, which stemmed from the Climate Research Unit email hack widely known as Climategate, and criticised media outlets such as the Guardian, which gave a huge amount of coverage to the 'scandal' while failing to give anything like equal attention to the three subsequent investigations that found there was no research misconduct on the part of climate scientists, and absolutely no evidence of fraud, data manipulation or inventing climate change to receive funding money.

Today, Fred Pearce, who wrote an utterly abysmal 10-part special on Climategate for the Guardian and has now written a book on the subject, has a new piece linked off the front page of the paper. Shockingly, this latest piece on Climategate repeats the claims that

Critics say the emails reveal evasion of freedom of information law, secret deals done during the writing of reports for the UN’s Intergovernmental Panel on Climate Change (IPCC), a cover-up of uncertainties in key research findings and the misuse of scientific peer review to silence critics.

However, Pearce entirely omits from his report that three independent investigations have found every one of these claims to be untrue. One would have thought that mentioning that the scientists have now been vindicated by three separate investigations into the accusations Pearce restates would be an important part of the story, but apparently not. He does mention the Muir Russell investigation, the fourth inquiry, which is due to report its findings on Wednesday, and then says that

whatever Sir Muir Russell, the chairman of the Judicial Appointments Board for Scotland, concludes on these charges, senior climate scientists say their world has been dramatically changed by the affair.

Unbelievably, it would seem that Fred Pearce thinks it doesn't matter whether climate scientists are guilty of research misconduct, fraudulent data manipulation or any of the other charges of which they have been accused. In the surreal world of the mainstream media, what matters is not the facts surrounding the issue, and certainly not the results of independent investigations into these matters, but the number of bogus accusations already made by journalists and bloggers.


Earlier this year the Sunday Times published a piece by Jonathan Leake claiming that a statement in the 2007 IPCC report about the sensitivity of the Amazon rainforest to changes in precipitation had been invented by environmental activists at the WWF, who sought to alarm the public about non-existent dangers attributed to anthropogenic climate change.

At the time the piece, based on research by Richard North, was widely criticised on blogs like Deltoid for failing to report that the WWF report was in fact based on peer-reviewed scientific research, and for misrepresenting Dr Simon Lewis's comments that the IPCC should have referenced the original works rather than the WWF report, instead claiming that Lewis disputed the scientific basis of both the IPCC and WWF reports.

Lewis tried to comment on the Times website to clarify that his position had been misrepresented by Leake, so the Times duly deleted his comment. He wrote to the newspaper and heard nothing back. So he took his complaint to the Press Complaints Commission. In response the Sunday Times has issued the following retraction.

The article “UN climate panel shamed by bogus rainforest claim” (News, Jan 31) stated that the 2007 Intergovernmental Panel on Climate Change (IPCC) report had included an “unsubstantiated claim” that up to 40% of the Amazon rainforest could be sensitive to future changes in rainfall. The IPCC had referenced the claim to a report prepared for the World Wildlife Fund (WWF) by Andrew Rowell and Peter Moore, whom the article described as “green campaigners” with “little scientific expertise.” The article also stated that the authors’ research had been based on a scientific paper that dealt with the impact of human activity rather than climate change.

In fact, the IPCC’s Amazon statement is supported by peer-reviewed scientific evidence. In the case of the WWF report, the figure had, in error, not been referenced, but was based on research by the respected Amazon Environmental Research Institute (IPAM) which did relate to the impact of climate change. We also understand and accept that Mr Rowell is an experienced environmental journalist and that Dr Moore is an expert in forest management, and apologise for any suggestion to the contrary.

The article also quoted criticism of the IPCC’s use of the WWF report by Dr Simon Lewis, a Royal Society research fellow at the University of Leeds and leading specialist in tropical forest ecology. We accept that, in his quoted remarks, Dr Lewis was making the general point that both the IPCC and WWF should have cited the appropriate peer-reviewed scientific research literature. As he made clear to us at the time, including by sending us some of the research literature, Dr Lewis does not dispute the scientific basis for both the IPCC and the WWF reports’ statements on the potential vulnerability of the Amazon rainforest to droughts caused by climate change.

In addition, the article stated that Dr Lewis’ concern at the IPCC’s use of reports by environmental campaign groups related to the prospect of those reports being biased in their conclusions. We accept that Dr Lewis holds no such view – rather, he was concerned that the use of non-peer-reviewed sources risks creating the perception of bias and unnecessary controversy, which is unhelpful in advancing the public’s understanding of the science of climate change. A version of our article that had been checked with Dr Lewis underwent significant late editing and so did not give a fair or accurate account of his views on these points. We apologise for this.

As Tim Lambert points out, however, Leake's bogus story was soon doing the rounds both on the climate change denialist blogosphere and in other right-wing newspapers such as Rupert Murdoch's Australian and the Daily Mail. It seems highly unlikely that these secondary authors who relied on Leake's falsified story will all publish similar retractions, and consequently much of the impact the story had as a widely publicised example of the IPCC allegedly using alarmist predictions from non-scientific sources will remain, despite the factual basis of the story turning out to be untrue.

This seems a good example of how entirely baseless accusations can tarnish the work of honest researchers; it is far easier to make unfounded accusations, and then to get them widely reprinted and disseminated by people whose ideological preconceptions the claims resonate with, than it is for those on the wrong end of bogus reporting to clear their names and set the record straight. The affective impact of headlines proclaiming that ACC is a myth invented by environmental activists to create redistributive taxes is somewhat stronger than a retraction of the article some four and a half months after the event. And as the retraction only covers the place the story originated from, the Times, and excludes the plethora of places the untrue assertions were republished, it's doubtful whether many of those who read those claims will end up reading the retraction. While climate change blogs such as Deltoid, RealClimate and DeSmogBlog have covered the story, as have eco-activists in the (mainly left-wing sections of the) mainstream media such as George Monbiot and Roy Greenslade, it is highly doubtful that these commentators will reach the same audience as the original Times article and its spin-offs in places such as the Mail, the Australian and climate change denial blogs.


I've just been reading a piece by John Vidal on the Guardian website about the new draft text prepared by the UN Secretariat at the end of their discussions in Bonn over the past couple of weeks.

In the piece Vidal states

The new draft text is also guaranteed to infuriate the US, which has so far only pledged to cut its emissions 17% by 2020 on 2005 emission levels – far less than European Union countries who have committed themselves to 20% cuts by 2020 and a 30% cut if other countries show similar ambition. “If this text were to be adopted, then the US would find it particularly difficult. It means they would have to do very much more,” said one European diplomat.

Are pledged cuts of 17% of emissions really 'far less' than pledged cuts of 20%? If you were getting your information solely from this article you might think perhaps not. What Vidal fails to explain, however, is that whereas the US has pledged cuts of 17% against 2005 emissions, the EU figure relates to 1990 levels. As, unlike the EU, the US didn't ratify the Kyoto Protocol, its emissions grew by about 15% between 1990 and 2005, meaning that measured from the same baseline as everyone else in the world – 1990 levels – the cuts the US has pledged amount to a rather pathetic 4%. And this will of course include emissions savings from carbon trading and other schemes which are designed to allow developed countries to avoid actually cutting emissions.
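For anyone who wants to check the baseline conversion, it's one line of arithmetic (using the approximate ~15% growth figure above, not official inventory data):

```python
# US emissions grew roughly 15% between 1990 and 2005 (approximate figure).
growth_1990_to_2005 = 1.15

# A 17% cut from 2005 levels, re-expressed against the 1990 baseline:
relative_to_1990 = growth_1990_to_2005 * (1 - 0.17)
print(f"US pledge = {(1 - relative_to_1990) * 100:.1f}% below 1990 levels")
# Roughly 4-5% below 1990, versus the EU's pledged 20-30% below 1990.
```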

Vidal is comparing apples (cuts based on 1990 levels) to oranges (cuts based on 2005 levels) to paint an entirely confusing picture of what is going on and who has pledged what.

Rather ridiculously though, the actual UN text they are referring to is not much clearer on what exactly it proposes:

Developed country Parties shall undertake, individually or jointly, legally binding nationally appropriate mitigation commitments or actions, [including][expressed as] quantified economy-wide emission reduction objectives [while ensuring comparability of efforts and on the basis of cumulative historical responsibility, as part of their emission debt] with a view to reducing the collective greenhouse gas emissions of developed country Parties by [at least] [25–40] [in the order of 30] [40] [45] [49] [X*] per cent from [1990] [or 2005] levels by [2017][2020] [and by [at least] [YY] per cent by 2050 from the[1990] [ZZ] level].

Developed country Parties’ quantified economy-wide emission reduction objectives shall be formulated as a percentage reduction in greenhouse gas emissions [for the period] [from 2013 to 2020] compared to 1990 or another base year [adopted under the Convention] [, and shall be inscribed in a legally binding agreement].

So the actual UN text seems to suggest that the UN thinks it's okay for the US to invent a new baseline date for any emission cuts, which means that its cuts will be minute compared with those of other developed nations, despite US per capita emissions being far higher than those in Europe and elsewhere. Quite why Vidal and others think this will infuriate the US is fairly odd; the provision for dates other than 1990 to act as a baseline appears to have been inserted purely to appease the US.

No wonder developing nations are calling this another stitch-up along the lines of the Copenhagen Accord. Expecting details such as this to be picked up by the mainstream media, when they can't even give their readers figures based on the same start date, seems like wishful thinking though.
