
Archive for the ‘politics’ Category

Arts of the Political is the new release from cultural geographers Nigel Thrift and Ash Amin, which explores various manifestations of left-wing politics and political movements in order to consider why movements based around equity and community have seemingly achieved so little over the past thirty-odd years in the face of neoliberalism. The question is particularly pertinent given the financial crisis of 2008 and the inability of the left in places such as the UK (where Thrift and Amin teach, at Warwick and Cambridge respectively) to form a movement capable of enacting widespread positive change, or even of mounting a serious campaign against the Conservative narrative of enforced austerity as a means of making further cuts to public services – a policy David Cameron has recently felt sufficiently emboldened to state openly is a reflection of ideology rather than a necessity imposed by economic circumstances.

One might argue that the sweeping cuts made by the Tory/Lib Dem coalition are a prime exemplar of what Naomi Klein has termed the shock doctrine – the neoconservative leveraging of moments of critical instability to enact far-reaching changes which increase inequality and benefit elites through privatisations which would likely be too unpopular to pass outside of these specific moments. The question, then, is why the right has been so successful at exploiting these opportunities whilst the left has not.

For Thrift and Amin, the answer is primarily that the left has historically been successful when it has been able to articulate new visions, new desires and new organisations which expand the terrain of what is understood to be politics itself, and which thereby energise mass movements around the possibility of a better collective future.

Movements campaigning for the rights of women, the working class, and other neglected and downtrodden subjects managed to turn engrained orthodoxies on their head in the quarter-century before the First World War by building mass support and accompanying socio-political reform. Although these movements applied particular principles and practices, the record shows that their acts of redefinition went far beyond what was originally intended. These movements freed up new imaginations, invented new political tools, pointed to elements of existence that had been neglected or concealed, and created a constituency that, once constructed, longed for another world. In other words, these movements produced a new sense of the political and of political potential. The emerging Left both opened the doors of perception and provided the tools with which to do something about these new perceptions. This is what was common, in our view, in the disparate examples we consider, from the American Progressive Movement and British feminism to German Marxism and Swedish social democracy. In their own way, each of these movements disclosed new desires.

The thesis that drives this book is that progressive movements should pay more attention to such world-making capacity, understood as the ability not just to produce a program in the future but also to open up new notions of what the future might consist of. The most important political movements, in our estimation, are those that are able to invent a world of possibility and hope that then results in multiple interventions in the economic, social, and cultural, as well as the political sphere. They free thought and practice and make it clear what values are being adhered to, often in quite unexpected ways. p9

Thrift and Amin contend that there are three areas (or arts) of the political to which it is crucial for the left to pay close attention: invention, the process of bringing forth tangible futures which hold the promise of a better life; organisation, the practices used to bind and articulate these movements; and the mobilisation of affect, a consideration of the ways in which political decision-making goes beyond rational information processing:

In particular, we consider the whole phenomenon of what Walter Lippmann (1961) called the manufacture of consent: how it is being bent to the needs of the Right and how it could be mobilized more effectively by the Left. At the same time, we attend to how the consideration of affect brings space into the frame. A whole array of spatial technologies has become available that operate on, and with, feeling to produce new forms of activism, which literally map out politics and give actors the resources to kick up more and across more places. In other words, the practical mechanics of space must be part of the politics of the Left. p15

Thrift and Amin begin by exploring a range of historical examples where left-wing politics was able to achieve the kind of redefinition of the political they seek, considering the German Socialism of the SPD before the First World War, Swedish Social Democracy, the British Suffragette movement, and progressive capitalists in the US circa 1900. Thrift and Amin contend that:

In all cases, progress depended on prizing open new political ground and filling it with real hope and desire. Appeal and effectiveness – at a time heavily laden with the weight of tradition, vested power, restricted social force, and new capitalist imperative – had to come from an ability to imagine and build community around the yet to come or the yet to be revealed. This meant inventing new historical subjects, new technologies of organization and resistance, new visions of the good life and social possibility, new definitions of human subjectivity and fulfillment, and new spaces of the political (such as “direct action,” “voting,” “public involvement,” “class struggle,” “welfare reform,” “government for the people,” “women’s rights”). A possible world had to be fashioned to render the old unacceptable and the new more desirable and possible. The Left today seems to have less desire or ability to stand outside the given to disclose and make way for a new world.

In seeking to identify areas where there is the potential for opening up analogous new political spaces, Thrift and Amin incorporate theoretical material from Bruno Latour regarding the status of democracy and agency with regard to nonhumans, arguing that the traditional binary between sovereign human subjects and inert, passive nonhuman objects is an area which can be productively challenged by a revitalised left-wing politics.

We want to take up Bruno Latour’s (1999) call for a new parliament and constitution that can accommodate the myriad beings that populate the world, a call that entails acts of definition and redefinition of “actor” so that many humans and nonhumans can jostle for position, gradually expanding the scope and meaning of “collective” politics. p41

This leads T&A to consider the human as a distributed being, whose processes of cognition stretch far beyond the boundaries of the skin, coming close to Deleuze and Guattari’s positions around ecological machinic flows of matter. Arguing from a position which begins to sound fairly close to some of Bernard Stiegler’s work, they contend that:

Human being is fundamentally prosthetic, what is often called “tool-being.” We are surrounded by a cloud of all manner of objects that provide us with the wherewithal to think. Much of what we regard as cognition is actually the result of the tools we have evolved that allow us to describe, record, and store experience. Take just the example of the craft of memory. This has extended its domain mightily since the time paintings were made on the walls of caves, and as a result, a whole new means of thought has come into being…

Memory is a compositional art depending on the cultivation of images for the mind to work with. This state of affairs has continued but has been boosted by modern media technology and its ability to produce communal rhetorics that would have been impossible before and that are inevitably heavily political, especially in their ability to keep inventing new variants of themselves that can be adapted to new situations. p50/51

This sense of distributed being and agency is used to reinforce the Latourian argument surrounding the agency of objects, and thus their importance in a new and enlarged sense of politics and democracy. Using Gilbert Simondon’s notion of transduction, T&A explore:

The way in which tools and other symbiotes can produce environments that are lively in their own right, that prompt new actants to come into existence. To illustrate this point, we need to look no further than the types of digital technology that have become a perpetual overlay to so many practices and the way in which they are changing political practices. Here we find a domain that has gained a grip only over the past ten years but is now being used as part of an attempt to mass-produce “ontological strangeness” (Rodowick 2007) based on semiautomatic responses designed into everyday life through a combination of information technology based tools and the practices associated with them (from implants and molecular interventions to software-based perception and action). In particular, these automatisms are concerned with the design and prototyping of new kinds of space that can produce different affective vibrations. p64

T&A bring this discussion back into the realm of the more conventionally political by using distributed agency and co-evolutionary strategies as a way of opening up thought surrounding ecological crises, and how a coherent left political response to climate change requires precisely the type of expanded politics which they characterise as world-making:

What is needed instead is a leftist politics that stresses interconnection as opposed to the “local,” however that is understood. What is needed is “not so much a sense of place as a sense of planet” (Heise 2008, ss) that is often (and sometimes rather suspectly) called “eco-cosmopolitanism.” Thus, to begin with, the experience of place needs to be re-engineered so that its interlocking ecological dimensions again become clear. This work of reconnection is already being done on many levels and forms a vital element in the contemporary repertoire of leftist politics: slow food, fair trade, consumer boycotts, and so on. Each of these activities connects different places, and it is this work of connection that is probably their most important outcome. Environmental justice then needs to be brought into the equation. The privileges of encounters with certain ecologies, as well as the risks associated with some branches of industry and agribusiness, are clearly unevenly distributed, and it may well be that certain environmentally unsound practices have been perpetuated because their effects go unnoticed by the middle class. Again, environmental justice movements have to refigure spaces, both practically and symbolically, so that interconnection becomes translucent. Finally, we need new ways to sense and envisage global crowds that are dynamic. The attempts to produce people’s mapping and geographic information systems, to engage in various forms of mash-up, and to initiate new forms of search are all part and parcel of a growing tendency to produce new kinds of concerned and concernful “Where are we?” Politics starts from this question. p75

This is followed by a chapter which claims to look at contemporary leftist politics, surveying the landscape through the apertures of anti-capitalism, reformist capitalism, post-capitalism and human recognition. What is striking about the majority of these contemporary left-wing political movements is that they aren’t actually political movements. Anti-capitalism is not approached through Occupy or Climate Camp; it is Zizek and Badiou alongside Hardt and Negri – which conflates two very different theoretical perspectives on anti-capitalism – and is summarily dismissed as hopelessly over-optimistic and unable to visualise a future. Reformism is not Syriza, the Five Star Movement or Bolivarian socialism; it is Ulrich Beck and Anthony Giddens’s reflexive modernity and third way. By post-capitalism T&A mean ‘A third leftist stance on the contemporary world can be described as “poststructuralist,” in that it draws on feminist, postcolonial, antiracist, and ecological thinking, much of which is heavily influenced by poststructuralist ideas’ p91. Conceptually that would seem to fit Hardt and Negri quite well, but here T&A refer instead to Gibson-Graham’s work on small-scale, local, co-operative, ethical and sustainable economies, which could have been productively mapped onto the actions of groups and initiatives such as transition towns, permaculture groups, feminist networks, Greenpeace and other NGOs, and the broad range of groups and movements who actually practice some of these ideas, but which is again explored as a mere theoretical argument rather than political praxis. Human recognition is used to refer to a liberal left based around ethics derived from Wendy Brown’s writings – again explored without reference to the groups who actually employ this mode of left politics, probably best embodied by online liberal campaign groups such as Avaaz or 38 Degrees. Finally T&A return to Latour and the notion of Dingpolitik and the role of bringing objects into democracy, a position which has been criticised within academia as politically conservative, since Latour’s works tend to entirely ignore issues surrounding inequality and exploitation, content instead to simply map actor networks, in contrast with more politically engaged posthuman scholarship from the likes of Felix Guattari or Manuel DeLanda. Perhaps there could have been an interesting dialogue here between T&A’s Latourian positions and the actions and ideologies of animal rights groups or deep ecologists, but again for T&A the left today does not consist of movements of people actually campaigning, occupying, protesting and organising; it simply appears to be a disparate collection of academics.

Put simply, this was what was most frustrating about Arts of the Political: rather than engaging with the broad and varied range of social and ecological activisms which currently exist, the left is reduced to academic thought, whilst the authors proclaim themselves to be engaged in materialist analysis. Perhaps it is simply indicative of the fact that the book’s authors are ageing men living and working in universities, so totally detached from the actual practices of the left-wing groups they claim to represent that they are barely able to acknowledge their existence. Indeed, Thrift has faced protests and occupations from students at Warwick over his astronomical pay increases as Vice-Chancellor of the University of Warwick over the past couple of years (from 2011 to 2013 Thrift’s salary increased from £238,000 to £316,000 at a time when tuition fees tripled for his institution’s students). That background perhaps helps explain why the actually existing left is almost entirely absent from T&A’s exploration of left-wing politics.

In the following chapters, where T&A discuss organisation, there is a mixture of some interesting thoughts surrounding ecology, using Stengers, Deleuze and Guattari to consider the notion of ‘addressing the political as an ecology of spatial practices’ p133, alongside a consideration of the organisation of the EU as a potentially fruitful model for the left, as it involves multiple parties across different scales having to cooperate. Such a politics of pragmatic cooperation could of course be understood as a mainstay of anti-capitalist politics since the 1990s – the alter-globalisation movement and its manifestations within the World Social Forum, the People’s Global Assembly and Indymedia all sought to embody a politics of the multiple, as theorised by Hardt and Negri, and similar claims could be made regarding the anti-war movement, climate change activism and Occupy. But in keeping with their refusal to actually engage with left-wing movements, we instead get a lionisation of the EU at a time when elements of the actually existing left are campaigning against the EU’s proposed free trade deal with the US, which would effectively allow corporations to sue governments using secret panels, bypassing parliaments.

This is a shame, as some of the theoretical material around affect and space, and that relating to the need to build positive visions of a left-wing future, is in places very strong. The central argument that the left needs to find a way to escape what Mark Fisher has called Capitalist Realism, the notion that neoliberalism is the only possible game in town (with the alternative being an eco-apocalypse), is undoubtedly correct, and the politicisation of affect and the reorientation of politics towards an ecology of ethical practices are both concepts worth pursuing. However, they require consideration in relation to the actual practices of political movements, rather than simply as abstract theoretical constructs.

Read Full Post »

While being ill over the last week or so, I’ve spent some time playing the beta version of Fate of the World, a forthcoming PC game based around climate change which has generated quite a lot of interest in the mainstream media (see here and here).

The game essentially puts you in command of a global organisation whose mission is to prevent catastrophic climate change (defined in the beta mission as 3 degrees of warming), maintain a human development index above 0.5, and keep people happy enough that no more than a certain number of geographical regions kick you out, all by the year 2120.
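To make the structure of these goals concrete, here is a minimal sketch of how the beta mission’s win conditions might be expressed as a simple end-of-game check; the field names, the end year and the allowed number of lost regions are illustrative guesses, not anything taken from the game’s actual code:

```python
# Hypothetical sketch of the beta mission's win conditions as described above;
# names and thresholds are illustrative, not Fate of the World's internals.
from dataclasses import dataclass

@dataclass
class WorldState:
    year: int
    warming_c: float          # degrees above pre-industrial
    human_dev_index: float    # global HDI
    regions_lost: int         # regions that have expelled the organisation

def mission_succeeded(state: WorldState, max_regions_lost: int = 3) -> bool:
    """Success (as described in the post): reach the end year having kept
    warming under 3 degrees, HDI above 0.5, and no more than the allowed
    number of regions kicking you out."""
    return (
        state.year >= 2120
        and state.warming_c < 3.0
        and state.human_dev_index > 0.5
        and state.regions_lost <= max_regions_lost
    )
```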

To achieve these goals you have a variety of tools at your disposal: projects, which include boosting renewable energy, deploying CCS technology and subsidising electric cars; environmental and social adaptation/mitigation measures, such as drought prevention, healthcare programmes and improving regional water infrastructure; policy measures, such as banning oil from tar sands or deploying algae-based biofuels; and political measures, such as deploying peacekeepers to troubled regions and black ops (including covert sterilisation programmes?!).

From a game studies perspective the game is interesting as it provides users with a complex simulation in which numerous interdependent factors have to be dynamically balanced, in a way that goes far beyond the usual kill-or-be-killed binary prevalent in most computer games. While there are of course alternatives, particularly in the realm of sandbox simulation games from SimEarth to Civilisation 5, Fate of the World is interesting insofar as it uses data taken from climate models to simulate not a fictional alternative world but possible futures of this planet, presuming that the current data from climate models are broadly accurate. As such, by experimenting with different variables users can glean a different kind of insight into the challenges posed by Anthropogenic Climate Change than they would from engaging with traditional forms of media, such as watching a documentary or reading the scientific literature. By being able to manipulate how regions react through play, users get a different kind of experience, one driven by feedback, configuration and systemic thinking rather than narrative, affect or rhetoric. While such models will always be highly reductive simplifications of real-world complexity, they could provide a useful way of approaching some of the complex social and environmental issues currently facing us, and indeed this kind of argument has been powerfully advanced by game studies scholars such as Stuart Moulthrop, who argue that when dealing with complexity, configurational thinking is likely to give users a better understanding of the area than linear, narrative-based approaches.

One criticism I have of the game in its current state is that the feedback which reveals how a user’s interventions are affecting the relevant systems is often obscured by its placement two menus deep, and I suspect that many players will struggle to find the data which actually spells out what the consequences of their actions have been. Without this crucial information actions can appear opaque, and indeed this criticism has been made on gaming forums discussing the beta. Hopefully this will be addressed before the game is released, as if players don’t understand what effects their decisions have had, then the game isn’t achieving its goals.

One aspect of the game which I found highly intriguing is the disparity between the aims and activities the game sets for users and the claims and actions of really existing nation states and supranational institutions. The beta mission sets success as avoiding a rise of 3 or more degrees over pre-industrial temperatures by 2120, which is below the midpoint of the IPCC projections of 1.5-6 degrees of warming this century (dependent on a range of factors, but primarily on human action), but which is considerably higher than the 2 degree rise which nation states couldn’t agree upon at the COP15 conference in Copenhagen last year. The reason states couldn’t agree on that figure wasn’t the complete absence of concrete measures designed to practically bring about that change, but that a large number of nations, primarily the 131 countries represented by the G77 group, declared that a 2 degree temperature rise was too high. Earlier this year those nations convened in Bolivia at the World People’s Conference on Climate Change and the Rights of Mother Earth, where they drafted a people’s agreement which stated:

If global warming increases by more than 2 degrees Celsius, a situation that the “Copenhagen Accord” could lead to, there is a 50% probability that the damages caused to our Mother Earth will be completely irreversible. Between 20% and 30% of species would be in danger of disappearing. Large extensions of forest would be affected, droughts and floods would affect different regions of the planet, deserts would expand, and the melting of the polar ice caps and the glaciers in the Andes and Himalayas would worsen. Many island states would disappear, and Africa would suffer an increase in temperature of more than 3 degrees Celsius. Likewise, the production of food would diminish in the world, causing catastrophic impact on the survival of inhabitants from vast regions in the planet, and the number of people in the world suffering from hunger would increase dramatically, a figure that already exceeds 1.02 billion people. The corporations and governments of the so-called “developed” countries, in complicity with a segment of the scientific community, have led us to discuss climate change as a problem limited to the rise in temperature without questioning the cause, which is the capitalist system…

Our vision is based on the principle of historical common but differentiated responsibilities, to demand the developed countries to commit with quantifiable goals of emission reduction that will allow to return the concentrations of greenhouse gases to 300 ppm, therefore the increase in the average world temperature to a maximum of one degree Celsius.

With atmospheric CO2 concentrations currently at 387ppm, and even the most ambitious campaigners in the developed world calling for a reduction to 350ppm, the aims set out at the World People’s Conference appear laudable but completely unrealistic. Indeed, the goal loosely set out at COP15 of limiting warming to no more than 2 degrees, with no mechanisms to actually achieve this, has been widely criticised by groups such as the International Institute for Environment and Development:

The Accord is weak. It is not binding and has no targets for reducing greenhouse gas emissions (countries that signed it have until 31 January to list their voluntary actions in its appendix). The low level of ambition will make preventing dangerous climate change increasingly difficult. What countries have so far proposed will commit us to a 3 to 3.5-degree temperature increase, and that is just the global average.

In some ways this may be the most useful role the game plays: highlighting the distance between the rhetoric of political and business leaders who are currently seeking to greenwash the issue, presenting their inaction as somehow constituting a viable solution to the problems posed by ACC. Despite taking concerted action throughout the game, it is hard to keep warming under 3 degrees without society collapsing as the next 110 years unfold, whether through a lack of mitigation and adaptation measures, widespread war and civil unrest, or widespread poverty and famine in the face of increasingly severe climate-related disasters. In some ways this isn’t much fun; being told that your actions are resulting in millions starving and armed conflict doesn’t spread warmth and joy, but it does give some indication of how hard things are likely to get as time passes.

One thing that becomes abundantly clear from the game is that the sooner action is taken to dramatically curb CO2 emissions (particularly in wealthy nations where emissions per capita are far higher), the less severe the consequences will be further down the line. This is a lesson we would do well to heed.

World People’s Conference on Climate Change

and the Rights of Mother Earth

April 22nd, Cochabamba, Bolivia

PEOPLES AGREEMENT

Today, our Mother Earth is wounded and the future of humanity is in danger.

If global warming increases by more than 2 degrees Celsius, a situation that the “Copenhagen Accord” could lead to, there is a 50% probability that the damages caused to our Mother Earth will be completely irreversible. Between 20% and 30% of species would be in danger of disappearing. Large extensions of forest would be affected, droughts and floods would affect different regions of the planet, deserts would expand, and the melting of the polar ice caps and the glaciers in the Andes and Himalayas would worsen. Many island states would disappear, and Africa would suffer an increase in temperature of more than 3 degrees Celsius. Likewise, the production of food would diminish in the world, causing catastrophic impact on the survival of inhabitants from vast regions in the planet, and the number of people in the world suffering from hunger would increase dramatically, a figure that already exceeds 1.02 billion people. The corporations and governments of the so-called “developed” countries, in complicity with a segment of the scientific community, have led us to discuss climate change as a problem limited to the rise in temperature without questioning the cause, which is the capitalist system.

Read Full Post »

Franco Berardi was a key member of the Italian Autonomist movement, alongside authors such as Antonio Negri, Christian Marazzi, Mario Tronti and Paolo Virno, and was a close associate of the French philosopher Felix Guattari. Berardi’s work has only recently been translated from Italian into English, and The Soul at Work was published in 2009 as part of the Semiotext(e) Foreign Agents series.

The central theme of The Soul at Work is that the human faculties which in previous eras would have been considered constitutive of the soul – our capacities for language, creativity, emotion, empathy and affect – have now become central to the economy of digital capitalism (Berardi’s term is Semiocapitalism).

Putting the soul to work: this is the new form of alienation. Our desiring energy is trapped in the trick of self-enterprise, our libidinal investments are regulated according to economic rules, our attention is captured in the precariousness of virtual networks: every fragment of mental activity must be transformed into capital. (p24)

This is contrasted with the situation under industrial capitalism, wherein the labour of the working class was largely confined to an eight-hour day in a factory, where for a portion of the day their bodies functioned as cogs rented to maintain the production of gigantic machines. While their bodies laboured, their minds or souls were still perceived as free. But as economic production became increasingly based upon intellectual rather than physical labour, Berardi argues that a fundamental change occurred, one which requires a reconceptualization of the political field.

Once digital technologies made the connection of individual fragments of cognitive labor possible, the parceled intellectual labor was subjected to the value production cycle. The ideological and political forms of the left wing, legacy of the 20th Century, have become inefficient in this new context. (p29)

After tracing a pathway through some of the Workerist ideas of the 1960’s, and particularly the role of alienated labor within this context, Berardi moves on to an analysis of how the

decisive transformation of the 1980’s was the systematic computerization of the working process. Thanks to digitalization, every concrete event can not only be symbolised, but also simulated, replaced by information. Consequently it becomes possible to progressively reduce the entire production process to the elaboration and exchange of information. (p95)

And at how this change to the system of production and consumption generates an ever-increasing torrent of information, which he argues is conducive to conditions of mass panic (in the sense that the word stems from the etymological root pan, or everything) and depression.

If in modern society the vastly prevalent pathology was repression induced neurosis, today the most widely spread pathologies assume a psychotic, panic driven character. The hyper-stimulation of attention reduces the capacity for critical sequential interpretation, but also the time available for the emotional elaboration of the other, of his or her body and voice, tries to be understood without ever succeeding. (p183)

Searching for ways to approach these changes in social context, Berardi draws on Deleuze and Guattari’s work in arguing  that

Ethical consciousness cannot be founded on the binomial of Reason and Will – as during the modern period. The roots of rationalism have been forever erased, and rationalism cannot be the major direction of the planetary humanism we must conceive.

Today the ethical question is posed as a question of the soul, that is to say of the sensibility animating the body, making it capable of opening sympathetically towards the other…A new conceptualization of humanism must be founded on an aesthetic paradigm, since it has to take root in sensibility. The collapse of modern ethics needs to be interpreted as a generalized cognitive disturbance, as the paralysis of empathy in the social psychosphere. (p133).

It is interesting to compare and contrast Berardi’s vision of a revised humanism here with the various schemas of posthumanism proposed by the likes of Katherine Hayles, Donna Haraway and Robert Pepperell. The logic of basing ethics on feeling and connectivity with other(s) certainly resonates across these authors, despite their respective stances on whether humanism is a project in need of reconceptualization or a patriarchal, bourgeois, historical phenomenon which has led to epistemological errors and the artificial separation of nature and culture, humans and other living creatures, and body and soul – many of the problems which Berardi examines.

Berardi goes on to present an interesting analysis of Baudrillard’s work around simulation, and contrasts this with the desire-based radical analyses of Deleuze and Guattari. Berardi argues that

The semiotic acceleration and the proliferation of simulacra within the mediatized experience of society produce an effect of exhaustion in the collective libidinal energy, opening the way to a panic-depressive cycle… Baudrillard sees simulation as the infinite replication of a virus that absorbs energy to the point of exhaustion. A sort of semiotic inflation explodes in the circuits of our collective sensibility, producing effects of mutation that run a pathological course: too many signs, too fast and too chaotic. The sensible body is subjected to an acceleration that destroys every possibility of conscious decodification and sensible perception. (158/159)

The problem, according to Berardi, is that the explosion of information leads to a paralysis and subsequent depression as the pace and scale of information flows expand far beyond what the human brain is capable of processing. The field of desire, which for Deleuze and Guattari possesses liberating potential, collapses in on itself and is confined to desiring the ever-increasing number of consumer fetishes that permeate upgrade culture. This leads to a contemporary scenario Berardi describes as the poisoning of the soul, as desire no longer reaches out for connectivity with the other, but is instead restricted to focusing on the self and personal accumulation. Looking for potential ways out of this situation, Berardi contends that

Perhaps the answer is that it is necessary to slow down, finally giving up on economistic fanaticism and collectively rethink the true meaning of the word “wealth.” Wealth does not mean a person who owns a lot, but refers to someone who has enough time to enjoy what nature and human collaboration place within everyone’s reach. If the great majority of people could understand this basic notion, if they could be liberated from the competitive illusion that is impoverishing everyone’s life, the very foundations of capitalism would start to crumble. (p169)

Read Full Post »

Yesterday I wrote about the Penn State University investigation into Michael Mann’s research conduct, which stemmed from the Climate Research Unit email hack widely known as Climategate, and criticised media outlets such as the Guardian, which gave a huge amount of coverage to the ‘scandal’ whilst failing to give anything like equal attention to the three subsequent investigations which have found that there was no research misconduct on the part of climate scientists, and absolutely no evidence of fraud, data manipulation or inventing climate change to receive funding money.

Today, Fred Pearce, who wrote an utterly abysmal 10-part special on Climategate for the Guardian, and has now written a book on the subject, has a new piece linked off the front page of the paper. Shockingly, this latest piece on Climategate repeats the claims that

Critics say the emails reveal evasion of freedom of information law, secret deals done during the writing of reports for the UN’s Intergovernmental Panel on Climate Change (IPCC), a cover-up of uncertainties in key research findings and the misuse of scientific peer review to silence critics.

However, Pearce entirely omits from his report that three independent investigations have found every one of these claims to be untrue. One would have thought that mentioning that the scientists have now been vindicated by three separate investigations into the accusations Pearce restates would be an important part of the story, but apparently this is not the case. He does mention the Muir Russell investigation, the fourth inquiry, which is due to report its findings on Wednesday, and then says that

whatever Sir Muir Russell, the chairman of the Judicial Appointments Board for Scotland, concludes on these charges, senior climate scientists say their world has been dramatically changed by the affair.

Unbelievably it would seem that Fred Pearce thinks that it doesn’t matter whether climate scientists are guilty of research misconduct, fraudulently manipulating data or any of the other charges of which they have been accused. In the surreal world of the mainstream media, what matters is not the facts surrounding the issue, and certainly not the results of independent investigations into these matters, but the number of bogus accusations already made by journalists and bloggers.

Read Full Post »

I woke up this morning to find that I now have a Tory MP. Let’s just say it isn’t what I was hoping for. The results for Bristol North West look like this:
Charlotte Leslie, Conservative 19,115 38.0%
Paul Harrod, Liberal Democrat 15,841 31.5%
Sam Townend, Labour 13,059 25.9%
Robert Upton, UK Independence Party 1,175 2.3%
Ray Carr, English Democrats 635 1.3%
Alex Dunn, Green Party 511 1.0%

What’s striking about these figures isn’t that the Tories won, but that 62% of the votes cast were for other parties, which is similar to the national figures, where at the moment it’s thought that around 64% of the votes went to parties other than the Conservatives. It just fills me with despondency that polling a third of the votes is all it takes for a party to ‘win’ the ‘democratic’ elections in the UK.
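As a quick sanity check on that arithmetic, the snippet below recomputes the shares from the constituency figures listed above (party names abbreviated for brevity):

```python
# Recompute the Bristol North West shares from the results listed in the post.
votes = {
    "Conservative": 19115,
    "Liberal Democrat": 15841,
    "Labour": 13059,
    "UKIP": 1175,
    "English Democrats": 635,
    "Green": 511,
}

total = sum(votes.values())                   # 50,336 votes cast
winner_share = votes["Conservative"] / total  # ~0.380
others_share = 1 - winner_share               # ~0.620

print(f"Conservative share: {winner_share:.1%}")       # 38.0%
print(f"Votes for other parties: {others_share:.1%}")  # 62.0%
```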

It should be noted that the campaign in this seat was almost unremittingly negative from all three mainstream parties: Labour said it was a two-horse race between them and the Tories, pointing to the 2005 election result; the Lib Dems said it was a two-horse race between them and the Tories, pointing to the more recent local council and European elections; and the Tories’ campaign was based around the notion that only voting for them would get rid of Gordon Brown. None of them had anything positive to say about how their policies would improve life in Bristol and the rest of the country; it was simply a case of vote for us or you’ll get someone even worse. The result of this would seem to be that progressive votes were split between Labour and the Lib Dems, who between them accounted for 57.4% of all the votes cast, far more than the Conservatives.

If this were any other country in Europe we would be looking at a coalition between the Lib Dems and Labour, who between them polled over 50% of the vote nationally. However, because we have a ridiculous voting system in which the Lib Dems’ 23% of the national vote equates to under 10% of the seats in the House of Commons, it looks like a Lib/Lab coalition wouldn’t be able to command a parliamentary majority. What this means is that the Tories’ 36-ish% of the votes gives them just under 50% of the seats in the Commons, and they look likely to try to form a minority government. With around a third of the vote. By contrast, the 53.3% who voted around the country for Labour and the Lib Dems will be told that they ‘lost’: that despite parties campaigning for electoral reform receiving over 50% of the total votes cast, their voting choice and political opinions are considered less important than those of the 36% who voted Tory.

A mention here has to go to Nick Clegg, who is arguing that the Tories should have the first go at forming a government, contradicting constitutional practice, which states that this is a privilege bestowed upon the incumbent government… A Labour/Lib Dem coalition (if they could find the support from the SDLP and a couple of other small parties to secure a Commons majority) would have over half the popular vote. This would give them a clear mandate to govern and to bring forward the process of electoral reform they both campaigned on. It’s why I voted Lib Dem, and it’s why thousands of other people voted for his party. Deciding to let a party with just over a third of the vote attempt to govern and prevent the electoral reform which so many millions voted for yesterday would be a travesty, and one which the public are unlikely to forgive the Liberal Democrats for.

Let’s hope that over the coming days and weeks the campaign to reform our electoral system goes from strength to strength, and that we can get rid of the undemocratic and unrepresentative first-past-the-post system.

Read Full Post »

Christian Marazzi is one of the group of Italian post-Fordist theorists, along with Antonio Negri, Paolo Virno and Franco Berardi. Capital and Language, first published in Italian in 2002, is the first of Marazzi’s works to be published in English.

The starting point of the economic analysis presented by Marazzi in this text is that

Beginning in the second half of the 1980’s, the prevailing analyses of the crisis of Fordism and the transition to post-Fordism were based in socio-economics, with particular attention to modifications in the nature of work and the production of goods; starting in the second half of the 1990’s the explosion of the securities market on a global scale forced everyone to update their analyses by paying more attention to the financial dimension of the paradigmatic shift. p13

The key here, according to Marazzi, is that whereas previously savings had been concentrated in household economies – property and goods – in the New Economy the collective savings and pension schemes of regular people became bound to the success of the global financial market, to whose continuing growth their own financial futures were now tied. As such, whereas within the old economy workers saw Capital as an exterior enemy which they could organise against and resist, within the New Economy the masses identify the success of the financial markets with their own personal economic success.

With their savings invested in securities, workers are no longer separated from capital as they are, by virtue of its legal definition, in the salary relationship. As shareholders they are tied to the ups and downs of the markets and so they are co-interested in the “good operation” of capital in general. p37

The central thesis Marazzi presents in Capital and Language, then, is that

In the Post-Fordist economy the distinction between the real economy, in which material and immaterial goods are produced and sold, and the monetary-financial economy, where the speculative dimension dominates investor decisions, must be totally reconceived… In the New Economy language and communication are structurally and contemporaneously present throughout both the sphere of the production and distribution of goods and the sphere of finance, and it is for this very reason that changes in the world of work and modification in the financial markets must be seen as two sides of the same coin. p14

Marazzi goes on to examine some of the effects of public opinion (through the lens of behavioural psychology) and confidence upon markets, and comes to the conclusion that “The theoretical analysis of financial market operations reveals the centrality of communication, of language, as a creative force.” (p27) As such, the creative and productive work done by language and communication means that they no longer constitute a societal superstructure, distinct and separate from the sphere of material production. Key to this is the performative capacity of language: not merely the ability to describe actions or events, but the capacity to actively perform tasks through linguistic utterances.

Another area which Marazzi theorises, which has particular pertinence to media studies, is that of the attention economy. Quoting Davenport and Beck (2001), Marazzi states that

“In the New Economy, ‘What is scarce is human attention, the width of the telecommunications band is not a problem, the problem is the width of the human band.’ The technological revolution has certainly enlarged access to information enormously, but the limitless growth in the supply of information conflicts with a limited human demand, which is all the more limited the more work time reduces the attention time we are able to dedicate to ourselves and the people with whom we work and live.

We are in a situation of information glut, of an excess, an overload of information. The Sunday edition of the New York Times contains more information than all of the written material available to readers in the 15th Century. Back then the problem was not finding the time to read, but finding enough reading material to fill up the time. Information was a sellers’ market and books were thought to be more precious than peasants.'(p64/65)

Marazzi contends that while the growth of networked telecommunications technologies has exploded at an exponential rate since the 1980’s, meaning that for hundreds of millions of people across the world access to information is no longer a problem and the traditional models based on the scarcity of information are now null and void, ‘the fact is that on the demand side for goods and services, attention (and its allocation) has taken the place of the physical raw materials of the industrial economy. It is a scarce and extremely perishable good… A wealth of information creates a poverty of attention.’ (p66) Furthermore, Marazzi argues that many of the changes to the structure of work which have taken place in the New Economy actively contribute to this attention poverty, as the eight-hour working day is extended through, for example, ICT technologies which enable workers to be on call 24 hours a day, and as the Fordist notion of a stable job for life is undermined, forcing workers to devote attention time to looking for work instead of to consuming informational goods and services. This leads Marazzi to contend that

The disproportion between the supply of information and the demand for attention is a capitalistic contradiction, an internal contradiction of the value form, of its being simultaneously commodity and money, a commodity increasingly accompanied by information and money-income, distributed in such a way as not to increase effective demand. The financialisation of the 1990’s generated additional incomes but, beyond distributing them unequally, it created them by destroying occupational stability and salary regularity, thus helping to exacerbate the attention deficit of worker consumers by forcing them to devote more attention to the search for work than to the consumption of intangible goods and services. p141

As the current form of financial capitalism contributes to a poverty of attention, Marazzi contends that it is crucial to experiment with social formations which instead reduce this attention shortage, giving producer/consumers more time both to create and to digest information, thereby creating value in the form of a cultural commons while also enhancing the general intellect. Such a notion resembles the argument for a universal citizen’s income made by Hardt and Negri in Empire and Commonwealth, and recently pledged at a national level in the UK by the Green Party.

In conclusion, then, Capital and Language presents an interesting and innovative approach to understanding the main changes which society has undergone since the 1980’s from a socio-economic perspective which foregrounds the importance of language in the contemporary form of capitalism. In particular it provides thought-provoking analyses of the changes to 20th-century notions of base and superstructure, of the genesis and contradictions of the attention economy, and of how the financialisation of savings and pensions involves workers in the wider capitalist system to a far greater extent than previous manifestations of capitalism did.

Read Full Post »

Commonwealth is the third in the series of socio-political analyses from Hardt and Negri which began with Empire (2000) and continued with Multitude (2004). To briefly summarise the series so far: Empire provided an overview of the changes to the structures of power and economic forces from the 1980’s onwards, which Hardt and Negri characterise as a move from an imperialist system dominated by nation states to a globalised, networked form of imperial power, and Multitude subsequently elucidated the emerging forms of networked resistance to the newfound global hegemonic forces of Empire.

Commonwealth seeks to build further upon the work laid out in the first two books through a deeper and more sustained engagement with some of the key concepts originally presented there, while dealing with some of the most pertinent criticisms levelled at the theoretical frameworks of Empire and Multitude by other leading left-wing academics and theorists (a point which I will return to later).

Consequently, while the book can be read as a stand-alone piece, it certainly helps to have read the prequels, which give a thorough contextualisation of where Hardt and Negri are coming from, and also provide far more detailed analyses of the economic background from which they draw the conclusion that since the early 1980’s there has been the beginning of a paradigm shift away from industrial production and towards a form of information-led production which, Hardt and Negri argue, requires a revised understanding of both power and contemporary forms of resistance.

While throughout the series Hardt and Negri have referred to this newfound mode of production (amongst other things) as biopolitical production – using a term first developed by Foucault – both the Foucauldian origins of the term and the differences between Foucault’s and Hardt & Negri’s usages are described in far greater detail in Commonwealth.

Our reading not only identifies biopolitics with the localised productive powers of life – that is, the production of affects and languages through social cooperation and the interaction of bodies and desires, the invention of new forms of the relation to the self and others, and so forth – but also affirms biopolitics as the creation of new subjectivities that are presented at once as resistance and de-subjectification. If we remain too closely tied to a philological analysis of Foucault’s texts, we might miss this central point: his analyses of biopower are aimed not merely at an empirical description of how power works for and through subjects but also at the potential for the production of alternative subjectivities, thus designating a distinction between qualitatively different forms of power. p59

Crucial to Hardt and Negri’s reading of biopolitics, then, is that as an emerging hegemonic form of power in the globalised world, biopolitical production constantly produces new subjectivities and affects which escape and exceed the capitalist form of value extraction, and thus generates newfound alternatives to global capitalism. While they are at pains to stress that this in itself does nothing to guarantee any kind of crisis for capitalism, or that capitalist contradictions and crises necessarily lead to revolution, they do argue forcefully that this opens up new spaces of conflict and resistance and produces alternative possibilities to the current status quo.

As the book’s title suggests, one of the primary focuses of the book is common wealth, or the commons, again a concept which Hardt and Negri use in Empire and Multitude, but which is explored in far more depth in Commonwealth. Hardt and Negri employ a Deleuzian ontology which combines two traditionally distinct usages of the common: firstly, the demarcation of a non-human commons in terms of the ‘natural world’, posited as an outside set of resources ripe for expropriation; and secondly, the socially constructed commons, such as language, social bonds, affects, thoughts and ideas:

Whereas the traditional notion poses the common as a natural world outside of society, the biopolitical conception of the common permeates equally all spheres of life, referring not only to the earth, the air, the elements, or even plant and animal life but also to the constitutive elements of human society, such as common languages, habits, gestures, affects, codes, and so forth. Whereas for traditional thinkers such as Locke and Rousseau, the formation of society and progress of history inevitably destroy the common, fencing it off as private property, the biopolitical conception emphasises not only preserving the common but also struggling over the conditions of producing it, as well as selecting among its qualities, promoting its beneficial forms, and fleeing its detrimental corrupt forms. We might call this an ecology of the common – an ecology focused equally on nature and society, on humans and the nonhuman world in a dynamic of interdependence, care and mutual transformation. p171

One important way in which Hardt and Negri extend their conception of commonwealth is the caveat that not all common forms of wealth are liberatory and positive. Indeed, they contend that much of the commons is currently experienced through what they deem corrupted forms, in which commonwealth is partially constrained and thus creates not a resource for all but a means of exclusion and expropriation which striates the social field and creates hierarchies. Chief among these corrupted forms of the common identified by Hardt and Negri are the nation state, the corporation and the family.

H&N go on to contend that the common is produced through love, a concept they trace back to Spinoza’s writings on love and joy, arguing that love is what produces cultural forms of commonwealth: ‘Every act of love is an ontological event in that it marks a rupture with existing being and creates new being…To say that love is ontologically constitutive then, simply means that it produces the common.’ (p181) However, Hardt and Negri go on to warn that

Just like the common itself, love is deeply ambivalent and susceptible to corruption. In fact what passes for love in ordinary discourse and popular culture is predominantly its corrupt forms. The primary locus of this corruption is the shift in love from the common to the same, that is, from the production of the common to the same or a process of unification. p182

As such, identitarian forms of love such as patriotism, racism and certain religious fundamentalisms are grounded on a love of the same, and seek to impose that sameness or unity upon heterogeneous elements they classify as ‘outside’ their identity. Thus Hardt and Negri characterise these belief systems and structures not as grounded in hatred, but in a form of love, albeit a corrupted form which seeks to reproduce unity and homogeneity rather than the diverse and heterogeneous positive forms of the common. This they define as evil; not evil as in the traditional transcendent binary which stands diametrically opposed to the category of good, but as instantiations of particular forms of love and the common gone bad. This theorisation of evil,

Gives us a Spinozan explanation for why at times people fight for their servitude as though it were their salvation, why the poor sometimes support dictators, the working classes vote for right wing parties, and abused spouses and children protect their abusers. Such situations are obviously the result of ignorance, fear and superstition, but calling it false consciousness provides meager tools for transformation. Providing the oppressed with the truth and instructing them in their interests does little to change things. People fighting for their servitude is understood better as the result of love and community gone bad, failed and distorted. The first question when confronting evil then, is, what specific love went bad here? What instance of the common has been corrupted? p194

Whilst this does provide a novel approach for understanding why people fight for their servitude as though it were their salvation, one criticism to be made here is that Hardt and Negri are vague as to what kind of social forms they envision replacing ‘corrupted’ forms such as the family and the state, contending instead that these forms are currently unimaginable and must arise out of the practical experimentation and experience of the multitude. While there is a logic which reflects their political position in refusing to project a teleology of the multitude, the failure to provide alternatives to contemporary corrupt forms of the common is somewhat unnerving: the lack of propositions for constructive alternatives to current systems makes the focus of Hardt and Negri’s theorising primarily negative, seemingly aimed at combating corrupt forms of the common without really suggesting the kind of positive alternatives they wish to see created. Where I found Commonwealth far stronger was where Hardt and Negri reiterated some of the concrete proposals they first outlined in Empire, with the addition of far more nuanced detail, in arguing for a living wage for all, the removal of the restrictions on human movement imposed by state borders, and universal open access to the commons in order to

Develop fully and put into practice the multitude’s abilities to think and cooperate with others. Such an infrastructure must include an open physical layer (including access to wires and wireless communications networks), an open logical layer (for instance, code and protocols) and an open content layer (such as cultural, intellectual and scientific works). p308

The criticism of the lack of concrete progressive forms for the multitude with respect to the family and the state feeds into the second of two major currents of criticism of their earlier works which Hardt and Negri seek to contest in Commonwealth. The first strand of critique, as advanced by the likes of Pierre Macherey and Ernesto Laclau, is the argument that a plural and polyphonic choir such as Hardt and Negri’s conception of the multitude cannot function as a coherent political actor due to its heterogeneous composition. Whereas in the past the figure of the party, the people, or even the state and the nation functioned to unify differences and mobilise populations to create social transformation, these critics argue that without a similar point of unification the multitude can act only as a cacophony of contradictory voices which cannot act in common. Hardt and Negri’s retort to this is that

It is true that the organisation of singularities and decision making is not immediate and spontaneous, but that does not mean that hegemony and unification, the formation of a sovereign and unified power – whether it be a state, a party or a people – is the necessary precondition for politics. Spontaneity and hegemony are not the only alternatives. The multitude can develop the power to organise itself through the conflictual and cooperative interactions of singularities in the common. p175

The second main line of critique which Hardt and Negri respond to comprises the arguments brought forth by Paolo Virno, Etienne Balibar, Slavoj Zizek and Alain Badiou: that whilst the multitude may be capable of acting as a political actor – albeit one which substantially differs from traditional forms based around unity – there is no guarantee that the consequences of such a political form would be liberatory and progressive. The actual contents of these critiques vary widely, from Virno’s realist position, which acknowledges that the formal structure of the multitude in no way guarantees the contents of its politics, to Zizek and Badiou’s positions, which effectively argue that the multitude is merely an oppositional figure to contemporary power, and that this oppositional resistance can never be more than a mere component of the power from whence it derives; I find myself giving more credence to Virno’s line of thought than to Zizek and Badiou’s. This line of critique is dealt with far less effectively, and while Hardt and Negri do outline some very useful protocols for a liberatory or progressive politics of the multitude, and trace a genealogy of progressive political groups and movements, Virno’s critique in particular seems valid when assessing contemporary networked radical Islamist groups, which exhibit many structural properties similar to the composition of the multitude, yet whose ideology exhibits extreme forms of what Hardt and Negri describe as corrupt forms of love and the common.

On the whole then, Commonwealth provides a useful exploration and expansion of a number of key concepts previously presented by Hardt and Negri, while partially addressing some of the most pertinent criticisms directed at their earlier works. As such it certainly provides interesting points for discussion and reflection for people involved in the various social and ecological movements which have grown out of the alternative globalisation movement, alongside some concrete proposals for an alternative to the current global system and some detailed analysis of geopolitical and economic developments over the last few years. Personally though, I would recommend that most readers new to Hardt and Negri’s work start with their earlier writings, in particular Multitude, which provides a more accessible point of entry to the work of two of the contemporary left’s most exciting political theorists.

Read Full Post »

According to the Voter Power Index, living in North-West Bristol my vote is worth just 0.521 of a vote. More shocking than that, though, is the fact that this is apparently over twice the voting power of the average UK citizen (the average is 0.253). At first glance this is a really useful way of illustrating the obvious deficiencies of the UK’s first past the post system, which last time out elected a majority Labour government despite the fact that they only won around 37% of the votes cast.

But how accurate is the Voter Power Index? How is it worked out? Does it actually provide more useful information than simply looking at the numbers and percentages of the votes cast in each constituency alongside the constituency size (both of which are also provided by the VPI page, but appear lower down)? And does the fact that no one in the UK has a vote worth a full vote make the figures look artificially low? Well… here is the explanation from the Voter Power Index’s about page:

By Nic Marks, fellow, nef (the new economics foundation)

I went to vote at the last general election with a heavy heart as I knew full well that my vote wouldn’t really count towards the result as I live in a safe seat. Straight after voting I felt really angry about the whole system and whilst out walking my dog the idea came to me that I must be able to work out how much my vote didn’t count – you see if you make a statistician angry he tries to fight back with numbers.

So how did I calculate the Voter Power Index? Well I figured there were two main factors that would illustrate how much or little power voters had. The first is how marginal the constituency you happen to live in is (the more marginal the more power) and the second is how many registered voters there are (fewer voters means each individual vote counts more). The problem was how to estimate exactly how much power you have in a particular constituency. I decided that if I looked at as many elections as possible I would be able to figure out what was the probability of a seat changing hands for different levels of marginality. By creating a probabilistic model I could then estimate this probability for every constituency and hence calculate (to three decimal places) the Voter Power Index. The details of all the calculations are available in the original report but much more fun is that web designer Martin Petts has created an interactive web-site so everyone can see how much their vote is actually worth.

I must admit that even I was staggered when I first calculated the VPI just how much most people’s votes don’t count. It is clear that our current system is hugely inefficient at translating the will of the people into the result of a general election. In fact the VPI allows us to put a number on the level of this inefficiency – the current system is only 25% efficient – whereas some sort of proportional representation system would approach 100% efficiency (for example the 2004 European Elections were about 96% efficient). Not only is the system inefficient it is also chronically unjust with Voter Power very unevenly distributed in the UK, with the luckiest 20% of voters having over 33 times more power than the unluckiest 20% – the graph below shows this spread. Note that this is a far more uneven distribution than household income in the UK. Even before the redistribution through taxes and benefits – the richest 20 per cent of households ‘only’ earn 15 times as much as the poorest. After redistribution this inequity factor is reduced to under 4 times.

The Voter Power Index shows that the first past the post system is profoundly undemocratic in that it gives considerably more power to some voters than others and so betrays the fundamental principle of democracy – one person one vote. It is high time we changed the whole rotten system.
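To make those mechanics a bit more concrete, here is a minimal sketch of how an index of this general shape could be computed. To be clear, this is my own reconstruction rather than nef's actual model: I'm assuming voter power is simply the estimated probability of the seat changing hands divided by the size of the electorate, normalised against the most powerful constituency, and both the probability curve and the constituency figures below are made-up placeholders.

```python
# Hypothetical sketch of a VPI-style calculation -- my reconstruction, not nef's model.
# Assumption: power = P(seat changes hands) / electorate, then normalised so the most
# powerful constituency scores 1.0. The probability curve is a placeholder, not the one
# fitted from historical election results in the original report.

def change_probability(margin_pct: float) -> float:
    """Placeholder curve: the tighter the margin between the top two parties
    at the last election, the likelier the seat is to change hands."""
    return max(0.05, 1.0 - margin_pct / 50.0)  # floor so no seat is deemed impossible to flip

def raw_power(margin_pct: float, electorate: int) -> float:
    """Chance of influencing the outcome, shared across all registered voters."""
    return change_probability(margin_pct) / electorate

# Invented constituencies: (name, margin between top two parties in %, electorate)
seats = [
    ("Marginal-on-Sea", 2.0, 65_000),
    ("Safeville", 27.0, 80_000),
]

powers = {name: raw_power(margin, electorate) for name, margin, electorate in seats}
best = max(powers.values())
for name, power in powers.items():
    print(f"{name}: {power / best:.3f}")  # index relative to the most powerful seat
```

Even in this toy version, almost all of the work is done by two things you can read straight off the published results – how close the last election was and how big the electorate is – which is worth bearing in mind when looking at the three-decimal-place precision of the real index.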

The biggest problem I have with their stated methodology is that they calculate marginality by how many times a seat has changed hands in the past, going back to the 1954 election and looking only at the top two parties’ votes. Confusingly, this isn’t what they say on the about page, but if you go back to the original report they link to, it is what they appear to have done, and the discrepancy between the two does little for clarity.

While frequent changes of hands would suggest a marginal seat, this is a very broad-brush approach which doesn’t seem to take the actual numbers or percentages of votes involved in those elections into account. The idea of a three-way marginal seat is completely alien to this method, and I find that lack of complexity concerning. Furthermore, they seem to give equal weighting to every electoral change since 1954, meaning that a seat which was very marginal between 55 and 25 years ago, changing hands at every election, but which has remained in the same hands at every election since, is still counted as fairly marginal, although in practice it may now be a very safe seat.

Essentially, despite the pretty pictures on the VPI website, this means that in many cases you can get a better idea of how marginal your constituency is by looking at the results from the last two elections and seeing how close they were. At the last election Bristol NW was 38% Labour, 32.5% Tory and 25% Lib Dem; anyone with half a brain can tell that’s a marginal seat. Similarly, if you look at somewhere like Henley and see figures for the last election of 53% Tory, 26% Lib Dem and 15% Labour, you can easily understand that it’s a safe seat. Trying to put a three-decimal-place number on how much more likely Bristol NW is to change hands than Henley is always going to be a rough and inexact endeavour. But admitting this would remove the rationale for the VPI’s existence.
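That rule of thumb barely needs a model at all. Here is a rough sketch using the vote shares quoted above; the 10-point threshold for calling a seat marginal is just an illustrative choice of mine, not anything the VPI uses.

```python
# Rough "how close was it last time?" check from vote shares (percentages).
# The 10-point cut-off for calling a seat marginal is an arbitrary illustrative choice.

def classify(shares: dict[str, float], threshold: float = 10.0) -> str:
    top_two = sorted(shares.values(), reverse=True)[:2]
    margin = top_two[0] - top_two[1]
    verdict = "marginal" if margin < threshold else "safe"
    return f"margin {margin:.1f} points -> {verdict}"

bristol_nw = {"Labour": 38.0, "Conservative": 32.5, "Lib Dem": 25.0}
henley = {"Conservative": 53.0, "Lib Dem": 26.0, "Labour": 15.0}

print("Bristol NW:", classify(bristol_nw))  # margin 5.5 points -> marginal
print("Henley:", classify(henley))          # margin 27.0 points -> safe
```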

An interesting statistic which isn’t included in the VPI is turnout per seat. For example, while the VPI site lists Henley as having 72,681 voters, it doesn’t tell you that at the last election only 67% of the electorate there voted. It would be interesting to see whether safe seats have similar, lower or higher turnouts than marginals. Do people who live in safe seats feel sufficiently disenfranchised not to bother voting? Or is voter apathy evenly distributed across the country, as people turn off from politics because they feel the three main parties are so similar to one another that there is no point in voting at all?

Another problem I have with the VPI’s presentation of the issue is the claim made by Nic Marks that voting power is more unequally distributed than household income in the UK. To justify this he looks only at the top and bottom 20% of each. Firstly, both on the website and in the original document the income figures are unreferenced, and it’s hard to check the figures someone has used if they decide not to link to where they obtained them. Secondly, there’s the question of why only one comparison, between the top and bottom 20%, was used in the report. If he had looked at different cut-offs, such as the top and bottom 5% of income and voting power, the figures might well have been completely different; that wasn’t what he intended to say, though, so perhaps he used the statistic which suited his argument. Without references to the data, who knows? Does the data actually look at per capita income for everyone in the UK, or does it exclude pensioners and people on benefits? If it excludes them (as most government statistics which look at income cover only those paying income tax), then this would again distort the comparison, giving an appearance of economic equality which does not exist in practice among the electorate. This kind of statistical cherry-picking does nothing to add to the important point that voting power in the UK is unequal, and in fact casts doubt on the honesty and integrity of the research as a whole.
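The sensitivity of that comparison to the chosen cut-off is easy to demonstrate. The distributions in the sketch below are invented, skewed stand-ins rather than the real VPI or income data; the point is only that the top-versus-bottom ratio can swing dramatically depending on whether you compare 20% slices or 5% slices.

```python
import random

# Demonstration only: two invented, skewed distributions standing in for "voter power"
# and "household income". The point is that the top/bottom ratio depends heavily on
# which quantile slice you choose to compare, so a single 20%-vs-20% figure proves little.
random.seed(1)
fake_power = sorted(random.paretovariate(1.5) for _ in range(10_000))
fake_income = sorted(random.paretovariate(3.0) for _ in range(10_000))

def top_bottom_ratio(sorted_values: list[float], fraction: float) -> float:
    """Mean of the top `fraction` of values divided by the mean of the bottom `fraction`."""
    n = int(len(sorted_values) * fraction)
    bottom = sum(sorted_values[:n]) / n
    top = sum(sorted_values[-n:]) / n
    return top / bottom

for fraction in (0.20, 0.05):
    print(f"top vs bottom {fraction:.0%}: "
          f"'power' {top_bottom_ratio(fake_power, fraction):.1f}x, "
          f"'income' {top_bottom_ratio(fake_income, fraction):.1f}x")
```

With heavy-tailed quantities like these, picking the slice effectively picks the headline number, which is exactly why an unreferenced 20% figure needs to be checkable.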

On the whole, while I like the idea of the VPI site, and it does present some very useful and easily comprehended information about the differing worth of UK votes depending on area, there are always likely to be problems with the kind of reductionist quantitative approach it adopts. While some of the information displayed there is genuinely interesting and informative, particularly the last election result pie chart, the percentage of votes discarded and the size of the constituency, in general it just produces answers which are extremely obvious: first past the post is a highly undemocratic system in which voters have differing abilities to affect the make-up of parliament depending on their constituency, a safe seat is by its very nature undemocratic, and electoral systems based on proportional representation give far more agency to voters.

Read Full Post »
