Posts Tagged ‘media’

Ahead of the upcoming IPCC report on the global climate and climate change, the news agenda seems to have been largely dominated by stories asking why global warming has paused for the last 15 years (see the BBC, the BBC again, the torygraph and the NZ Herald among countless other examples).

A substantial part of this seems to be a repeat of the familiar claim that 1998 was the hottest year on global record, and that if climate scientists were right we should surely have seen a hotter year at some point during the past 15 years. Hence, the argument goes, climate change has paused, the models and data suggesting that human fossil fuel emissions were to blame for late twentieth-century warming were wrong, and consequently any argument for restricting emissions in future is null and void.

Which of course ought to lead to the question: who says that 1998 was the hottest year on record? Well, the answer to this is somewhat complicated, but also somewhat revealing. It ain’t NASA, who run GISTEMP (the Goddard Institute for Space Studies Surface Temperature Analysis) and who have 2010 as the hottest year on record, followed by 2005, with 9 of the 10 hottest years occurring after the year 2000 (1998 being the only pre-2000 year in that list). It also isn’t NOAA (the US National Oceanic and Atmospheric Administration), who compile a global temperature record at the National Climatic Data Center (NCDC), and whose data again places 2010 as the hottest year on record, followed by 2005, with 1998 in third, and 9 of the 10 hottest years on record occurring after the year 2000 (i.e. after global warming has allegedly paused). Which leaves HadCRUT, the record maintained by the UK Met Office’s Hadley Centre and the University of East Anglia’s Climatic Research Unit. The CRU is of course the unit that was the subject of the Climategate faux controversy, in which sceptics hacked emails and published excerpts from private correspondence out of context, claiming fraud and data manipulation and generating global headlines; numerous independent investigations subsequently found no evidence of wrongdoing. The latest version of this temperature series, HadCRUT4, again shows that 2010 was the hottest year on record, followed by 2005, followed by 1998.

So where does the claim that 1998 was the hottest year come from? Well, HadCRUT4 is the latest and most accurate temperature record maintained by the Met Office and CRU (for a detailed explanation of what’s changed look here). If we ignore that and instead use their previous version, HadCRUT3v, then, and only then, does 1998 appear to be the warmest year on record. So why did this old record suggest a different year to the NASA and NCDC records (and indeed the latest version of the CRU record)? The main reason was the different methods used to generate global temperatures. None of these institutions are able to measure the temperature in every place in the world; they use stations in various locations, and the places with the fewest stations tend to be the polar regions (where there also tend to be the fewest people). And one of the things we know quite well is that the Arctic has been the fastest-warming region on the planet. Whereas GISTEMP interpolates values between measured locations in the Arctic, HadCRUT3v left them blank as unknown, which introduced a cold bias into that dataset compared with the others, and explains why it has been replaced by a dataset featuring a greater number of stations which correlates much more strongly with the other records.
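
To see why coverage matters, here is a deliberately crude back-of-the-envelope sketch in Python. The anomaly values are invented purely for illustration (they are not real station data), but the arithmetic makes the point: averaging only the grid cells you have measured gives a cooler ‘global’ figure than also including the fast-warming Arctic cells infilled from their neighbours.

```python
# Toy illustration of coverage bias: every anomaly value below is invented for
# this example, not taken from any real dataset.

# Temperature anomalies (degrees C relative to a baseline) for a single year.
mid_latitude_cells = [0.40, 0.45, 0.38, 0.42, 0.41]  # modest warming, well observed
arctic_cells = [1.20, 1.35]                          # strong warming, sparsely observed

# Approach 1 (roughly the HadCRUT3 strategy): leave unmeasured Arctic cells out.
mean_without_arctic = sum(mid_latitude_cells) / len(mid_latitude_cells)

# Approach 2 (roughly the GISTEMP strategy): include Arctic cells, treating them
# as if they had been infilled from neighbouring measurements.
all_cells = mid_latitude_cells + arctic_cells
mean_with_arctic = sum(all_cells) / len(all_cells)

print(f"'Global' mean excluding the Arctic: {mean_without_arctic:.2f} C")
print(f"'Global' mean including the Arctic: {mean_with_arctic:.2f} C")
# The first figure comes out cooler: omitting the fastest-warming region
# biases the average low.
```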

So the ‘pause’ in climate change is something that only exists if you look exclusively at a now obsolete and known-to-be-biased dataset, generated by a group whom those using the data have previously denounced as frauds, and if you decide to ignore that 1998 was in any case a super El Niño year which had a dramatic short-term effect on global weather – hence the other 9 of the 10 hottest years on record all occurring since the year 2000. If you used 1997 or 1999 as a start date there wouldn’t appear to be any pause in any dataset (outdated or otherwise), but cherry-picking the year when specific short-term conditions made things abnormally hot, added to cherry-picking a now obsolete dataset, allows sceptics to make the ‘global warming has paused’ argument (see this excellent Skeptical Science post for details on cherry-picking).
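
The start-date effect is just as easy to demonstrate. Again, the annual anomalies below are made up for the sake of illustration rather than taken from any real dataset; the point is simply that fitting an ordinary least-squares trend to the same numbers gives a very different answer depending on whether you start on a 1998-style spike or a year either side of it.

```python
# Toy illustration of start-year cherry-picking: the anomaly series below is
# invented for the sake of the example and is not a real temperature record.

def linear_trend(years, values):
    """Ordinary least-squares slope, in degrees C per year."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(values) / n
    covariance = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, values))
    variance = sum((x - mean_x) ** 2 for x in years)
    return covariance / variance

years = list(range(1996, 2013))
# A gently warming series with a large spike in 1998 (a 'super El Nino' year).
anomalies = [0.25, 0.28, 0.75, 0.36, 0.38, 0.40, 0.43, 0.42, 0.45,
             0.48, 0.44, 0.47, 0.43, 0.50, 0.53, 0.48, 0.52]

for start in (1997, 1998, 1999):
    i = years.index(start)
    slope = linear_trend(years[i:], anomalies[i:])
    print(f"Trend from {start} onwards: {slope * 10:+.2f} C per decade")
# The trend that starts on the 1998 spike is by far the flattest; moving the
# start year by one in either direction gives a noticeably steeper warming
# trend from the very same numbers.
```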

So why are so many mainstream media outlets focusing on this as the main story in the lead-up to the IPCC report? Probably because it’s a more sensationalist and conflict-driven story than one which reads ‘science has been slowly progressing, turning 90% confidence in its predictions in 2007 into 95% confidence by 2013’, allied with a big PR drive from a number of the main players in the climate denial industry.



I’ve just had an article published as part of the spring/summer edition of Necsus, the European Journal of Media Studies. Necsus is an open access journal, so you can find the full text HERE. My text looks at how notions of scale and entanglement can productively add to media ecologies as an emergent way of exploring media systems. It takes Phone Story and Open Source Ecology as case studies, and examines how in both cases a multiscalar approach which looks across content, software and hardware can be productively applied.

The journal also features an interview with Toby Miller and Richard Maxwell, the authors of Greening the Media, a book released last year which is one of the first full-length works to look at issues pertaining to the ecological costs of media technologies (both old and new), and a series of interesting essays which look at the intersection of media/film studies and ecology from diverse perspectives. Outside of the green material, there are essays by Sean Cubitt (who was my PhD external examiner a few months back) and Jonathan Beller which are well worth a read.



Last year’s iDocs conference at the Watershed in Bristol was a lively and engaging event which looked at a range of critical, conceptual and practical issues around the emerging field of interactive documentary. It focused on several key themes surrounding the genre: participation and authorship, activism, pervasive/locative media and HTML5 authoring tools.

The conference featured a number of practitioners involved in fantastic projects, such as Jigar Mehta’s 18 Days in Egypt, Brett Gaylor, who made the excellent RiP!: A Remix Manifesto and is now at Mozilla working on Popcorn Maker, an HTML5-based authoring tool built on the Popcorn.js JavaScript library for making interactive web documentaries, and Kat Cizek (via Skype), whose Highrise project is well worth a look. There were also more theoretically inflected contributions from the likes of Brian Winston, Mandy Rose, Jon Dovey and Sandra Gaudenzi (among many others), which made for a really stimulating couple of days.

The Digital Cultures Research Centre at UWE asked me to document the event and produce a short video summary, and the video above is the outcome of that.


This week I start doing some work as a researcher for the Digital Cultures Research Centre at the University of the West of England, looking at a range of notions surrounding postdigitality.

The working hypothesis I’ve been given as a jumping-off point is that ‘the digital’ in ‘digital cultures’ is on the verge of becoming a redundant term, since all significant global cultures are already digital. If this is the case, how should the research centre strategically reconfigure its interests to maintain relevance within this postdigital moment?

My main experience with notions around postdigitality thus far comes from documenting the Postdigital Encounters Journal of Media Practice Symposium in 2011, which featured a range of interestingly contradictory takes on the postdigital.


I’m looking forward to engaging with the DCRC staff around this issue, and to spending some time thinking about the underlying value of a discourse which is currently fragmented and largely dominated by some fairly insubstantial rhetoric in blogs and newspaper articles, but which appears to touch on some far more interesting material around the rematerialisation of technologies, the Internet of Things, pervasive media, and smart cities and connected communities.


Over the weekend of the 11th and 12th of June I was at BarnCamp, a fun-filled weekend of tech-activist-related tomfoolery organised by the HacktionLab network. On the Sunday I gave a talk about the social and ecological costs of technology, which was recorded by the Catalyst radio collective, who ran a 24-hours-a-day, seven-days-a-week radio stream at BarnCamp. The talks are available for streaming here.

The first talk on the audio stream is on the Luddite movement and the contemporary relevance of their actions ahead of the bicentenary of the Luddite rebellion in 2012; the second offers some thoughts on technology, the self and upgrade culture; and my talk is third, about 35 minutes in.


On Friday I was in London for an unconference hosted by Furtherfields.org at the University of Westminster, which approached the subject of re-rooting digital culture from an ecological perspective. Here’s the brief for the event:

Over the last decade the awareness of anthropogenic climate change has emerged in parallel with global digital communication networks. In the context of environmental and economic collapse people around the world are seeking alternative visions of prosperity and sustainable ways of living.

While the legacy of the carbon fuelled Industrial Revolution plays itself out, we find ourselves grappling with questions about the future implications of fast-evolving global digital infrastructure. By their very nature the new tools, networks and behaviours of productivity, exchange and cooperation between humans and machines grow and develop at an accelerated rate.

The ideas for this transdisciplinary panel have grown out of Furtherfield’s Media Art Ecologies programme and will explore the impact of digital culture on climate change, developing themes adopted in grass-roots, emerging and established practices in art, design and science.

One thing which left me somewhat confused was why the event was billed as an unconference, when in reality it was a fairly straightforward event with three speakers and a short Q+A afterwards. Listening to three presentations (with accompanying PowerPoints and Prezis) and then having the chance to ask a few questions at the end is not a participant-driven meeting; it’s the same format you find at any conventional conference panel.

The first speaker was Michel Bauwens, founder of the Foundation for Peer-to-Peer Alternatives. Bauwens began by diagnosing the central problems of the contemporary socio-economic system with regard to sustainability and equity. The first problem he outlined was that of pseudo-abundance: the aim of achieving infinite economic growth on a planet with finite resources, and the externalisation of ecological costs from our limited understanding of economics. The second problem he delineated was that of artificial scarcity: the ways in which intellectual property is enforced via patents and copyrights, which create scarcity around assets whose cost of reproduction often approaches zero with digital networked technology. This, Bauwens argued, leads to the stifling of innovation, which prevents the types of solutions to ecological crises being developed as commonwealth, outside of a profit-driven market framework. The final problem Bauwens diagnosed was that of social justice, as exemplified by the cavernous (and growing) divide between rich and poor on a global level.

Bauwens’ suggested solution to these problems is primarily commons-based peer production, in which individuals voluntarily self-aggregate into distributed networks coordinated through networked telecommunications. While Bauwens presents this as an entirely new phenomenon, afforded by the massive increase in computational power and networked connectivity associated with the information revolution, it is worth mentioning that voluntary self-aggregation and democratised, decentralised ownership of projects have long been foundational concepts of anarcho-syndicalist thought. What appears to be different about P2P networks in the contemporary context, however, is their ability to connect peers outside of a localised context through digital telecommunications networks, and for projects to be scaled up accordingly in terms of size and scope. These affordances have the potential to enable commons-based peer production to out-compete market-based initiatives in many circumstances; however, what is potentially of greater significance than the efficiency gains P2P networks can provide is the alternative set of values they tend to embody.

Bauwens used numerous forms of consumer electronics as instantiations of planned obsolescence, whereby the company making the product has a financial incentive to create a product which has a highly limited shelf life and whose design is not modular, so that failure of individual components leads users to replace the entire device. While the manufacturer profits each time this cycle continues and new items are bought, the ecological costs increase, but these are externalised from the market transaction. By contrast, the open design methodology is based around values whereby the user/designer (the term prouser was suggested) wants their device to be as durable and long-lasting as possible, and for a modular design to exist which enables them to easily replace any parts which are damaged over time. Consequently the argument Bauwens promoted was that the values of the open design movement present an ethical alternative to market production, whereby ecological sustainability and social justice can be built into the production process itself.

Bauwens argued that this was not merely utopianism but was based on a material analysis of the salient features of contemporary capitalism, which he argued already needs commons-based peer production in order to remain profitable.

The second speaker was Catherine Bottrill of Julie’s Bicycle, an organisation which works with arts ‘businesses’ to reduce their carbon footprint. While I’m sure the organisation does good work, the scheduling seemed somewhat odd. Following a talk about the problems of contemporary capitalism and the necessity of replacing it with a system with alternative ethical values created via grassroots and decentralised P2P networks, we had a talk which seemed to imply that if the major record labels reduced their carbon footprint slightly and their star acts planned their world tours slightly differently there would be no ecological crisis.

It was problematic that Bottrill didn’t address any of the concerns or solutions Bauwens had just raised, and one slide in particular caused (presumably) inadvertent entertainment with her diagnosis of contemporary challenges to society. First came recession, second came the Middle East crisis, followed by cuts to Arts Council funding. I’m not sure what came next because I was laughing too hard. On a more serious note though, for a group of uniformly white middle-class people at a posh London university to listen to someone rank arts funding cuts as a major social problem above the other aspects of the government’s austerity programme – cuts to disability benefits, cuts to welfare, cuts to education, the privatisation of the NHS and so on – was somewhat depressing.

The final presentation was from Ruth Catlow of Furtherfields.org on ecological approaches to networks, tools and digital art. Catlow began with a delineation of network topology, referring to Paul Baran’s 1964 RAND Corporation diagram comparing centralised, decentralised and distributed network structures.

Catlow argued that while mass media networks resemble the centralised structure on the left, the Internet is a mixture of the decentralised (via the cables and gateways that make up the material apparatus of connectivity) and the distributed (as each computer functions as a node in a distributed network). While this has been a traditional way in which the Internet, and its potential for creating a democratic media system, has been trumpeted for over two decades now, this analysis misses a crucial part of the picture. Recent research into the structure and connectivity of complex networks such as the World Wide Web (which is the most common encounter people have with the Internet) reveals that, far from being a distributed system in which all nodes are equal and every blogger is a pamphleteer, these networks follow a power-law distribution, with a few preferentially attached ‘superstars’ such as Google, Facebook, Twitter and Amazon, while the vast majority of content resides in the ‘long tail’, where it receives scant attention.

“Systems as diverse as genetic networks or the World Wide Web are best described as networks with complex topology. A common property of many large networks is that the vertex connectivities follow a scale-free power-law distribution. This feature was found to be a consequence of two generic mechanisms: (i) networks expand continuously by the addition of new vertices, and (ii) new vertices attach preferentially to sites that are already well connected. A model based on these two ingredients reproduces the observed stationary scale-free distributions, which indicates that the development of large networks is governed by robust self-organizing phenomena that go beyond the particulars of the individual systems.”
(Barabasi and Albert 1999)
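
The two mechanisms described in that abstract – growth and preferential attachment – are simple enough to sketch in a few lines of Python. The toy simulation below is only illustrative (it is not Barabasi and Albert’s code, and the exact figures depend on the random seed), but it shows how a handful of early, well-connected nodes come to hold a hugely disproportionate share of the links while most nodes sit in the long tail.

```python
import random

# A minimal sketch (not the authors' code) of the two mechanisms in the quote:
# the network grows one node at a time, and each new node attaches to an
# existing node with probability proportional to that node's current degree.

random.seed(1)
degrees = {0: 1, 1: 1}  # start from a single edge between two nodes
endpoints = [0, 1]      # every edge adds both of its endpoints to this list

for new_node in range(2, 5000):
    # Picking a random entry from the endpoint list selects an existing node
    # with probability proportional to its degree (preferential attachment).
    target = random.choice(endpoints)
    degrees[new_node] = 1
    degrees[target] += 1
    endpoints.extend([new_node, target])

ranked = sorted(degrees.values(), reverse=True)
total = sum(ranked)
single_link = sum(1 for d in ranked if d == 1)
print(f"Largest hub degree: {ranked[0]} (the average degree is roughly {total / len(ranked):.1f})")
print(f"Share of all connections held by the 10 biggest hubs: {sum(ranked[:10]) / total:.1%}")
print(f"Proportion of nodes with only a single connection: {single_link / len(ranked):.1%}")
# A few early nodes become hubs with degrees far above the average, while most
# nodes keep only one or two links -- the heavy-tailed pattern described above.
```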

When we talk about network topology we need to engage with these findings: while the power law functions as an attractor which partially determines the distribution of the network, some research suggests that this is not a fixed and final determinism, and that there may be methods or tactics which communities can use to make these networks more equitable. For me, that is the interesting discussion to be having about network topologies now, not merely a recapitulation of the earliest models.

Following this, Catlow went on to detail a number of projects which Furtherfields have been involved in, including the Zero Dollar Laptop Project, an innovative way of mitigating the ecological cost of contemporary computing hardware while also providing social benefits to disadvantaged groups; We Won’t Fly For Art, a project designed to mitigate the carbon emissions created by artists; and the Feral Trade Cafe, a project by Kate Rich which establishes social networks for ethically trading goods.

Overall the event was worth attending, Bauwens’ talk in particular being a highlight.


Yesterday I gave a guest lecture to MA students at the University of the West of England on Media Ecologies… Here’s a copy of the Prezi that went along with the lecture, and a selected bibliography of some key readings associated with the field. I’ll try and find some time to write my notes up into a blog post soon.

Link to Prezi: Media Ecologies on Prezi

Media Ecologies Bibliography

Eco-Philosophy

Bateson, Gregory (1972) Steps to an Ecology of Mind: Collected Essays in Anthropology, Psychiatry, Evolution and Epistemology, Northvale, New Jersey, Jason Aronson Inc.

Capra, Fritjof (1996) The Web of Life: A New Scientific Understanding of Living Systems, New York, Anchor Books

DeLanda, Manuel, (1992) ‘Nonorganic Life’, in Jonathan Crary & Sanford Kwinter (eds), Zone 6: Incorporations, New York: Urzone, pp. 129-67

DeLanda, Manuel (2002) Intensive Science and Virtual Philosophy, London and New York, Continuum

Deleuze, Gilles and Guattari, Felix (1972, trans. 1977) Anti-Oedipus: Capitalism and Schizophrenia, translated by Robert Hurley, Mark Seem and Helen R. Lane, London & New York, Continuum

Deleuze, Gilles and Guattari, Felix (1980, trans. 1987) A Thousand Plateaus: Capitalism and Schizophrenia, translated by Brian Massumi, Minneapolis, University of Minnesota Press

Guattari, Felix (2000) The Three Ecologies, trans. Ian Pindar and Paul Sutton, London, Athlone Press

Maturana, Humberto and Varela, Francisco (1980) Autopoiesis and Cognition, Dordrecht, Holland, Reidel

Prigogine, Ilya in collaboration with Stengers, Isabelle (1997) The End of Certainty: Time, Chaos and the New Laws of Nature, New York, The Free Press

Posthumanism

Haraway, Donna (1985) ‘A Cyborg Manifesto: Science, Technology, and Socialist-Feminism in the Late Twentieth Century’, in Simians, Cyborgs and Women: The Reinvention of Nature, New York, Routledge, 1991, pp. 149-181

Hayles, N. Katherine (1999) How We Became Posthuman, Chicago, University of Chicago Press

Hayles, N. Katherine (2007) ‘Hyper and Deep Attention: The Generational Divide in Cognitive Modes’, http://www.mlajournals.org/doi/abs/10.1632/prof.2007.2007.1.187

Latour, Bruno (1991) We Have Never Been Modern, translated by Catherine Porter, Cambridge, Massachusetts, Harvard University Press

Stiegler, Bernard (2009) For a New Critique of Political Economy, translated by Daniel Ross, Cambridge, Polity Press

Media Ecologies

Fuller, Matthew (2005) Media Ecologies: Materialist Energies in Technoculture, Cambridge, MA, MIT Press

Fuller, Matthew (ed.) (2008) Software Studies: A Lexicon, Cambridge, MA, MIT Press

Parikka, Jussi (2007) Digital Contagions: A Media Archaeology of Computer Viruses, Peter Lang

Parikka, Jussi (2011) Insect Media: An Archaeology of Animals and Technology, Minneapolis, University of Minnesota Press

Parikka, Jussi (ongoing) Machinology, http://jussiparikka.net/

Rawlings, Tomas (ongoing) A Great Becoming, http://agreatbecoming.wordpress.com/

Taffel, Sy (ongoing) Media Ecologies and Digital Activism, http://mediaecologies.wordpress.com/


