Excellent critical reflection on ‘algorithms, performativity and governability’ by Lucas Introna – very helpful for a bunch of things I’m currently thinking about…
In a piece over on New Republic, Evgeny Morozov (author of To Save Everything, Click Here) outlines a version of ‘the attention economy’, in which, because it is mediated through digital media, everything that we do, every way we interact with people, places, services and things, becomes an ‘asset class’ to be traded in bulk. Ultimately, Morozov’s argument is not dissimilar to Bernard Stiegler’s critique of a ‘generalised proletarianisation’ insofar as, through the grammatisation (capturing, storing and sorting as data) of ever-increasing parts of our lives, we become subjectivised through systems of calculation at an industrial scale, resulting in a kind of ‘incapacitation’. As Morozov suggests:
[T]o sell our intimate data in bulk is to fully surrender our quest for autonomy, accepting a life where the most existential choices are shaped either by the forces of the market or by whatever war—be it on climate change or obesity—the government has enlisted us (rather than corporations) to fight. In this world, whether we become vegetarians, and even whether we end up thinking about it, might ultimately hinge on which player (the steakhouses, the supermarkets, the bureaucrats) has the most to gain from this switch. Our data constitutes our very humanity. To voluntarily treat it as an “asset class” is to agree to the fate of an interactive billboard. We shouldn’t unquestionably accept the argument that personal data is just like any other commodity and that most of our digital problems would disappear if only, instead of gigantic data monopolists like Google and Facebook, we had an army of smaller data entrepreneurs. We don’t let people practice their right to autonomy in order to surrender that very right by selling themselves into slavery. Why make an exception for those who want to sell a slice of their intellect and privacy rather than their bodies?
Worth a read anyway, in spite of being a bit dystopian…
Stuart Elden points to an interesting video of a conversation with Nigel Thrift, discussing urban informatics, ‘big data’ and so on. Slight hint of Thrift buying into the rhetoric around ‘big data’ but still an interesting discussion…
I wrote an outline for a paper/chapter for a proposed book and related conference edited/convened by F. Xavier Olleros and Majlinda Zhegu at Université du Québec à Montréal, which they have kindly accepted. So, I will be fleshing out the following over the summer. Obviously, I owe an intellectual debt to Rob Kitchin here, but I’d like to think that I am substantively developing some of the themes of his code/space work (with Martin Dodge) through my own reading of Bernard Stiegler’s philosophical project. In particular, I am developing some of my ideas about the politics of anticipation (from my PhD work) through Stiegler’s theorisation of the ‘industrialisation of memory‘ and the ‘eventisation‘ capacities of increasingly data-driven commercial industries.
This paper addresses the transformative sense in which computation has become an infrastructure upon which mechanisms have been founded to both support and intervene in how we live our everyday lives. The past two decades have witnessed a steady movement of the capacity of digital computation away from spaces dedicated to housing the apparatus of computing—such as the computer centre and the home office—towards a diffusion of that capacity into a variety of everyday places (in the global North). A number of authors have both predicted and described the ways in which computation has moved from dedicated places for bulky apparatus into a capacity available through interconnected devices and systems in an increasing number of contexts (Greenfield, 2006; Kitchin, 2011; Kitchin and Dodge, 2011; Mitchell, 1995, 2000, 2003; Shepard, 2011; Rheingold, 2002; Weiser, 1991). Large-scale computing apparatus have not been eliminated; in fact, they have increased in number in the guise of data centres, server farms and so on, but the capacity for the interconnection of those resources through international telecommunications infrastructures to large numbers of portable and embedded devices has transformed the scope and reach of computation (Graham and Marvin, 2001; Graham, 2004, 2005). The purpose of this paper is to interrogate the ways in which this widespread infrastructure of computation is being used not only to support and surveil increasing amounts of everyday activities, through the collection and retention of vast quantities of data, but also to anticipate and intervene in how we perform the everyday.
Increasing amounts of information about ourselves and others are harvested and stored using electronic devices, and we volunteer even more information to email providers, search engines and social networking systems. Many aspects of our everyday lives are now gathered in a range of contexts and recorded (via CCTV, cellphone networks and so on) and retained in databases (Agre, 1994; Graham, 2002; Haggerty and Ericson, 2000; Murakami Wood, 2008), as a growing system of memory of life otherwise forgotten or unthought. These systems are increasingly involved in the ways in which we convene and perform a sense of place. If places are spatial contexts that we convene and give meaning through particular kinds of activities or arrangements of various people and things, then the ways in which we perform that sense of place can be understood to be increasingly mediated by digital technologies. We use mobile devices to search commercial systems for information about locations and to navigate to them, relying upon travel instructions and databases of past experiences of those places. We allow those systems to use data about ourselves to recommend the ways in which we might act in those locations, where we might eat, shop or socialise. Furthermore, especially in urban environments, we are subject to the regulation of particular locations through real-time analytics based upon infrastructures that gather data for city governments. Infrastructures of software and hardware thus have a growing agency in how we collectively communicate, remember and conduct ourselves socially.
The gathering and recording of data and volunteered information through the expanding computational infrastructure facilitates the ordering of time both as forms of history, and thus the sharing of knowledge and culture, and as the means of anticipating, planning for, and perhaps preventing, futures. The logic of retained knowledge is thus ‘programmatic’ and has arguably become more so with the advent of software programmes, which have augmented our capacities to remember, process and act upon information. Furthermore, these infrastructures increasingly anticipate, in real-time, the ways in which we will behave in order to inform how commercial and governmental organisations intervene in and regulate how a variety of urban environments function. The production and performance of cities, then, increasingly ‘takes place’ in concert with a host of quasi-autonomous computational agents, regardless of whether or not we are aware of it.
To investigate the transformative nature of the anticipatory capacities of a growing number of computational infrastructures embedded within our everyday lives, this paper proceeds in three parts. In the following, second, section several technology case studies are explored as means of capturing and retaining, and anticipating and operating upon, our everyday activities in ‘industrial’-scale systems. Particular attention is paid to the quasi-autonomous agency of these systems, which appear to operate at a scale and speed that exceed the human capacity for oversight. In the third section the mnemonic and prognostic capabilities of networked infrastructures are brought into focus to be examined, through the work of the philosopher Bernard Stiegler (1998, 2009, 2010b, 2010a), as ‘mnemotechnologies’: technologies and technical supports that both sustain and reterritorialise what we collectively understand about our everyday lives. The conclusion of this article addresses the ways in which the informatics of an ‘industrialisation of memory’, operating at a scale and speed that bleeds into apparatuses of anticipatory intervention, both challenges and transforms the ways in which we negotiate what are private and public activities and spaces.
Professor Rob Kitchin is currently engaged in a large five-year EU-funded project, The Programmable City, concerned with the role of software in the ongoing production, performance and imagination of cities. Over on the blog for the project they have announced that Prof. Kitchin’s most recent book, ‘The Data Revolution’, is now with the publishers, Sage, with a view to publication later this year. I’m looking forward to it…
Here’s the book outline blogpost from the Programmable City website:
Jussi Parikka has written an interesting post on his blog offering a glimpse at his new writing project, with the tentative title ‘A Geology of Media’. He suggests that this is the third in his series of books theorising media ecology, a series that began with Digital Contagions: A Media Archaeology of Computer Viruses (2007) and continued with Insect Media (2010).
Here’s an excerpt:
This book on the geophysics and the non-organic ground of media complements the earlier takes by offering a media materialism from the point of view of geological resources, electronic waste and media arts. Through engaging with several contemporary art and technology projects it provides a media theoretical argument: to think of materiality of media beyond the focus on machines and technologies by focusing on what they consist of: the chemistry and geological materials of media, from metals to dust.
In short, I am interested to see if what pejoratively sometimes is called “hardware fetishism” is not hard enough, and even media and cultural theorists need to focus on the rocks and crust that make technical media possible. Earth history of deep times mixes with media history, which becomes a matter of not only thousands, but millions of years of non-linear history (to modify Manuel Delanda’s original idea). This way media materialism becomes a way to entangle media technologies, environmental issues and themes of global labour. Perhaps instead of the Anthropocene, we should just refer to the Anthropobscene.
Shawn Sobers linked to a funny comment piece by Stuart Heritage in the Grauniad riffing on the idea of the ‘Internet of Things‘, with the main schtick being that there is such a lack of imagination behind the implementation of such ‘things’ that, if we extrapolate, surely the interlinked ‘things’ will do us a mischief… Now, this is humorous, of course, but humour is also a good way to get us to think about why on earth we’re letting ourselves in for a vision of such ‘things’. I am not ‘anti-‘ technological innovation; I am merely arguing that we need to be critically reflective about the motivations behind the development of some of these systems and devices. The same kind of critical reflection we have seen in relation to the ‘MOOC revolution‘…
Here’s one of the funny bits from the article, extrapolating from actually existing technologies into the more ridiculous:
The Internet of Things has already produced some cool-sounding devices. There is the tennis racket kitted out with motion sensors to help you improve your game. There’s the parking sensor that directs your satnav to an empty spot. The basketball that, when bounced on the floor, automatically tells your home entertainment setup to start playing basketball-related content. The bridge that tells people when it’s about to collapse. The smoke alarm that switches itself off and works in conjunction with your electrical outlets to burn you to death in your sleep because it has become jealous of your capacity for love. The remote cave that fills itself with bears and poisonous snakes whenever it detects that someone has started sleeping in it because they’ve convinced themselves that their entire house has grown sentient and suddenly turned against them. All sorts, really. It’ll be fun.
Deborah Lupton has blogged about the creation of a self-quantification researcher network: