Technology’s Limits: Automation, Invention, Labour, and the Exhausted Environment

Glitched image of a mural of Prometheus giving humans fire in Freiberg

This looks like it was a fascinating event…

Workshop hosted by the Digital Life Research Program 

An abstract artwork of a light background with brown and red rectangles of various sizes.

Artwork by Stephen Scrivener

Event Details

Date: Friday March 10, 2017
Time: 10am – 4.30pm
Venue: EB.G.21, Parramatta South campus

Among its many political preoccupations, 2016 was marked by an obsessive concern with the new powers of the machine to erase human labour and employment. Science fiction dystopias – among them, the French Trepalium and the Brazilian 3% – yoked older anxieties about a world without work to a more novel recognition of resource depletion and scarcity. Academic publishing houses, conference organisers, funding agencies and the press have responded with a deluge of content covering algorithms, automation and the Anthropocene. Meanwhile, a less conspicuous narrative about the decline of innovation has resurfaced with claims that the rate of fundamental new technology inventions is slowing and jeopardising long term global productivity returns. What happens when technology hits its limits? Velocity and volume excite machinic economies, but do little to confront some of the core problems and challenges facing planetary labour and life today.

This workshop brings together leading Australian scholars of technology and society with contemporary German and French reflections on the prevailing discourses of technology’s limits. Since the 1990s, Bernard Stiegler has been a leading philosopher and critic of technology, and in his recent book Automatic Society he directly tackles problems of automation and algorithms for the distribution of financial and social resources to populations increasingly bereft of economic capital and political agency. Building upon Frankfurt School critical theory and Kittlerian media theory, contemporary German critique intersects with similar questions by combining investigations of epistemology, history and the technical. The Australian take on these European developments is simultaneously appreciative and critical, though often oriented toward more regional conditions that arise in part due to different economic, cultural and political relations with Asia.

The morning session of the workshop will introduce current theoretical European work on technology. Daniel Ross will develop a critical introduction to Bernard Stiegler, whose recent work in Automatic Society and In the Disruption continues to mount a wide-ranging and provocative critique of technology. Armin Beverungen will then offer an overview of his research on algorithmic management and high-frequency trading, with Ned Rossiter introducing logistical media as technologies of automation and labour control. In the afternoon, Gay Hawkins will outline her theoretical interest in nonhuman and technical objects and their irreducible role in making humans and ecologies. A key empirical example will be the history of plastic and the emergence of its technical agency and capacity to reconfigure life. Nicholas Carah will follow with a discussion of his latest work on algorithms, brand management and media engineering. The workshop will close with an audience-driven panel session and discussion. These interventions will be held in conjunction with a close reading of the key texts below.

Register

Attendance numbers will be limited so please register in advance. No registration fee required.

RSVP by 7 March on Eventbrite

Speakers

  • Armin Beverungen
    Junior Director at the Digital Cultures Research Lab (DCRL) at Leuphana Universität Lüneburg & Visiting Fellow, Institute for Culture and Society, Western Sydney University
  • Nicholas Carah
    Author of Brand Machines, Sensory Media and Calculative Culture (2016)
  • Gay Hawkins
    Author of Plastic Water: The Social and Material Life of Bottled Water (2015)
  • Liam Magee
    Author of Interwoven Cities (2016)
  • Nicole Pepperell
    Author of Dissembling Capital (forthcoming, 2017)
  • Daniel Ross
    Translator of Bernard Stiegler’s Automatic Society (2016) and numerous other works
  • Ned Rossiter
    Author of Software, Infrastructure, Labor: A Media Theory of Logistical Nightmares (2016).

Co-chairs: Liam Magee and Ned Rossiter, co-convenors of the Institute for Culture and Society’s Digital Life research program.

Recommended Readings

Frank Pasquale (2017), Duped by the Automated Public Sphere
Lee Rainie and Janna Anderson [Pew Research Center] (2017), Code-Dependent: Pros and Cons of the Algorithm Age
Bernard Stiegler (2012), Die Aufklärung in the Age of Philosophical Engineering
Bernard Stiegler (2015), Escaping the Anthropocene
Bernard Stiegler (2015), On Automatic Society
Sonia Sodha [The Guardian] (2017), Is Finland’s basic universal income a solution to automation, fewer jobs and lower wages?

Related Readings

Bruce Braun (2014), A New Urban Dispositif? Governing Life in an Age of Climate Change
Nick Dyer-Witheford (2013), Contemporary Schools of Thought and the Problem of Labour Algorithms
Victor Galaz (2015), A Manifesto for Algorithms in the Environment
Victor Galaz et al. (2017), The Biosphere Code
Orit Halpern (2015), Cloudy Architectures
Erich Hörl (2014), Prostheses of Desire: On Bernard Stiegler’s New Critique of Projection
Yuk Hui (2015), Algorithmic Catastrophe: The Revenge of Contingency
International Labour Organization (2016), ASEAN in Transformation
Lilly Irani (2015), The Cultural Work of Microwork
MIT Technology Review (2012), The Future of Work
Cathy O’Neil (2016), How Algorithms Rule Our Working Lives
Elaine Ou (2017), Working for an Algorithm Might Be an Improvement
The Guardian (2016), Robot Factories Could Threaten Jobs of Millions of Garment Workers
Tommaso Venturini, Pablo Jensen, Bruno Latour (2015), Fill in the Gap. A New Alliance for Social and Natural Sciences

Agenda

  • 10:00–10:10: Liam Magee, Ned Rossiter: Welcome and Introduction
  • 10:10–11:10: Daniel Ross
  • 11:10–11:30: Q&A
  • 11:30–11:45: Coffee
  • 11:45–1:00: Armin Beverungen, Ned Rossiter
  • 1:00–2:00: Lunch
  • 2:00–3:15: Gay Hawkins, Nicholas Carah
  • 3:15–4:15: Panel discussion responding to automation: Daniel Ross, Gay Hawkins, Nicholas Carah, Armin Beverungen, Nicole Pepperell; Liam Magee and Ned Rossiter to chair
  • 4:15–4:30: Closing thoughts, future actions

65% of future non-existent jobs (which don’t exist); 70% of jobs automated (just not yet)

Twiki the robot from Buck Rogers

The future of work is the work of imagination. We are, repeatedly, and have been for a while, bombarded with (pseudo-)facts about what the future of work will bring. These are, of course, part of well-known, long-standing, narratives about ‘innovation’, ‘growth’, technological advance and, of course, ‘automation’.

Martin shared a good post by Doxtdator, on his site Long View on Education, about some persistent kinds of stories around the nature of work our schools are preparing children for, or not. Here’s an abridged, and selective, version of the story…

“The top 10 in demand jobs in 2010 did not exist in 2004. We are currently preparing students for jobs that don’t exist yet, using technologies that haven’t been invented, in order to solve problems we don’t even know are problems yet.”

Shift Happens videos (2007).

People repeat the claim again and again, but in slightly different forms. Sometimes they remove the dates and change the numbers; 65% is now in fashion. Respected academics who study education, such as Linda Darling-Hammond (1:30), have picked up and continue to repeat a mutated form of the factoid, as has the World Economic Forum and the OECD.

[…]

“By one popular estimate 65% of children entering primary schools today will ultimately work in new job types and functions that currently don’t yet exist. Technological trends such as the Fourth Industrial Revolution will create many new cross-functional roles for which employees will need both technical and social and analytical skills. Most existing education systems at all levels provide highly siloed training and continue a number of 20th century practices that are hindering progress on today’s talent and labour market issues. … Businesses should work closely with governments, education providers and others to imagine what a true 21st century curriculum might look like.”

The WEF Future of Jobs report

[…]

Cathy Davidson (May 2017) explains how she came upon the factoid:

“I first read this figure in futurist Jim Carroll’s book, Ready, Set, Done (2007). I tracked his citation down to an Australian website where the “65%” figure was quoted with some visuals and categories of new jobs that hadn’t existed before. “Genetic counseling” was the one I cited in the book.

After Now You See It appeared, that 65% figure kept being quoted so I attempted to contact the authors of the study to be able to learn more about their findings but with no luck.  By then, the site was down and even the Innovation Council of Australia had been closed by a new government.”

The BBC radio programme More or Less picks up the story from here, demonstrating how it most likely has no factual basis derived from any identifiable source (there never was an Innovation Council of Australia, for example).

Davidson sort of defends this through dissimulation, in an interview for More or Less, by saying she believes that 100% of jobs have been affected by ‘the digital era we now live in’.

As Audrey Watters has highlighted, statistics like this and the appeal for a ‘disruption’ of education by the tech sector to teach ‘the skills of the future’ etc. can be reasonably interpreted as a marketing smoke screen – ‘the best way to predict the future is to issue a press release’.

An allied claim, falling within the same oeuvre as the “65%” of non-existent jobs, is the various statistics for the automation of job roles, with varying timescales. A canonical example, from another “thought leader” (excuse me while I just puke in this bin), is from WIRED maven Kevin Kelly:

There are an awful lot of variations on this theme, focusing on particular countries, especially the USA, or particular sectors, or calculating likelihoods for particular kinds of jobs and so on and so on. This is, of course, big business in and of itself – firms like Deloitte, McKinsey and others sell this bullshit to anyone willing to pay.

What should we make of all this..?

There are a few interpretations we can make of this genre of ‘foresight’. Alongside several other academics I have written about particular ways of communicating possible futures, making them malleable-yet-certain in some way, as a ‘politics of anticipation’. This politics has various implications, some banal, some perhaps more troubling.

First, you might say it’s a perfectly understandable tendency, of pretty much all of us, to try and lend some certainty to the future. So, in our adolescent know-it-all way, we are all wont to lend our speculations some authority, and statistics, however spurious, is a key tool for such a task.

Second, and perhaps allied to the first, is the sense in which methods for speculation become formalised and normative – they’re integrated into various parts of institutional life. So, it becomes normal to talk about speculative (spurious?!) statistics about a future of work, education etc. in the same tone, with the same seriousness, and the same confidence as statistics about a firm’s current inventory, or last year’s GDP trends. Of course, all statistics, all facts, have conditions and degrees of error and so if the calculation of trends for past events is open to change, the rationale might be, perhaps future trends are just as reliable (there’s all sorts of critique available here but I’m not going to delve into that). In this way, consultancies can package up ‘foresight’ as a product/service that can be sold to others. “Futures” are, of course, readily commodified.

Third, an ideological critique might be that it is precisely these forms of storytelling about the redundancy or insufficiency of the labour force that allows those with the large concentrations of capital to accrue more by demeaning the nature of work itself and privatising profits upwards. If we are repeatedly told that the work that generates the goods and services that move through our economy is worth less – because it can be automated, because it is ‘out-dated’, because there are other kinds of superior ‘skilled’ work – then it perhaps becomes easier to suppress wage growth, to chip away at labour rights and render work more precarious. Gloomy I know. However, some data (oh no! statistics!) Doxtdator has in his blogpost (and the kinds of data David Harvey uses in his books, such as The Enigma of Capital) could be seen as backing up such arguments. For example (source):


These sorts of graphs tell a different story about yesterday’s future – which didn’t lead to families reaping the rewards of automation and increased productivity by profiting from a share in increased leisure time (following JM Keynes), but rather delivered the profits of these trends to the “1%” (or even the “0.1%”) by massively increasing top executive salaries while keeping wider wage growth comparatively low, if not stagnant. I’m not an economist, so I don’t want to push my luck arguing this point but there are folk out there who argue such points pretty convincingly, such as David Harvey (though see also economic critiques of the ‘zombie’ automation type of argument).

Ultimately, I am, personally, less interested in the numbers themselves – who knows if 65% of today’s school children will be doing new jobs that represent only 70% of the total work we currently undertake?!  I’m more interested in the kinds of (speculative) truth-making or arguing practices they illustrate. The forms of speculative discourse/practice/norms about technology and work we’re all involved in reproducing. It seems to me that if we can’t fathom those things, we’re less able to care for those of us materially affected by what such speculation does, because, of course, sometimes speculation is self-fulfilling.

To try to advance some discussions about the kinds of technological and economic future that get proposed, gain momentum and become something like “truths”, I’ve been puzzling over the various ways we might see the creation of these economic statistics, the narrating of technological ‘innovation’ in particular ways, and the kinds of stories ‘critical’ academics then tell in analysing these things, as collectively making up some form of shared imagination. I started out with ‘algorithms’ but I think that’s merely one aspect of a wider set of discourses about automation that I increasingly feel need to be addressed. My placeholder term for the moment is an “automative imaginary” ~ a collective set of discourses and practices by which particular versions of automation, in the present and the future, are brought into being.

A tendency to spill…

Talented and creative friends I used to knock about with in the studio have made a lovely thing for the International Literature Showcase that you should definitely check out. Please see A Tendency to Spill, an interactive story created by the superlative duo Hazel Grian and Constance Fleuriot.

A Tendency To Spill is an interactive Science Fiction story that involves a chat bot (instant chat using artificial intelligence).

Romy is a human child brought up in a world of robots. She is about to be taken from her home and expelled to the human world across The Wall. She doesn’t have much time, maybe 10 or 20 minutes. She asks you now for a crash course in what’s really at the heart of human society.

This story is recommended for anyone aged 13 and upwards and is equivalent to Young Adult Fiction. With that in mind, appropriate behaviour and language are expected from you as well as from Romy.

We’d love you to take part in our research, please help us by filling in the form when prompted. It doesn’t mean we’re watching you while you chat but interactions are archived and, if you agree, we may share them in future whilst you remain entirely anonymous.

Thank you and enjoy the short story.

Click here to meet Romy.

Songs “written by AI” from SonyCSL

Songs written by Sony CSL’s “AI”…

From the Sony CSL “flow machines” website:

Flow Machines is a research project funded by the European Research Council (ERC) and coordinated by François Pachet (Sony CSL Paris – UPMC).

The goal of Flow Machines is to research and develop Artificial Intelligence systems able to generate music autonomously or in collaboration with human artists.
We do so by turning music style into a computational object. Musical style can come from individual composers, for example Bach or The Beatles, or a set of different artists, or, of course, the style of the musician who is using the system.

Their “Deep Bach” thing was doing the rounds at the end of last year, so I presume there will be more to come.

Thresholds

Clever people at York are talking about Thresholds. Check out the website, it’s really interesting!

Based in the Science & Technology Studies Unit (SATSU) at the University of York, Thresholds is a thematic programme of work that will unfold over the coming months. Taking Thresholds as a focal point, this research programme will use a range of diverse resources and perspectives to explore the liminal edges of everyday, organisational and social life. What and who reside beyond or within different types of thresholds? Who has to cross thresholds? What prevents people or things crossing? How does power operate through thresholds? How is it that thresholds articulate with limits, extremes, dangers and tipping points? These are just some of the questions we will explore.

Aimed at generating ideas and dialogue, this programme is geared toward political, conceptual and creative exchanges and contributions. Led by Joanna Latimer, Rolland Munro, Nik Brown and Dave Beer, this programme will develop a variety of perspectives on this central focal point of thresholds. This website will be used to communicate our key ideas, to promote events and to share outputs.

Reblog> 1:1 and Cartographic Operations

Via Machinology.

1:1 and Cartographic Operations

The Cartographic Operations exhibition is on at the Level 4 gallery in Southampton (Hartley Library). Supported by AMT, it features work from Winchester School of Art practitioners addressing maps. Jane Birkin, Abelardo Gil-Fournier, Sunil Manghani and Ian Dawson’s pieces address the main theme: “In Bernhard Siegert’s ‘The map is the territory’, he refers to the idea of ‘cartographic operations’. The suggestion is that our way of seeing the world is not simply represented in maps, but that map-making is itself a play of competing signs and discourses producing our subjecthood. These are the coordinates we come to live by, which in turn influence the marks and signs at our disposal when we seek to make and share representations of the world.”

One of the pieces is Jane Birkin’s 1:1, which is described and shown below. It opens up the exhibition space to the depth of the surface by making visible the electric current and metal inside the wall. While it can be read in relation to some earlier pieces of contemporary art it also speaks to the current work in critical practices of infrastructure.


From the catalogue text:

Jane Birkin’s 1:1 is a direct mapping of infrastructure behind the white space of display. It is ­a piece produced by performative procedure: a regulated operation where authorial control is established at the outset and rules are strictly followed. Electric current and metal are plotted using a DIY store metal/voltage detector and the information transferred simply to print.

There are literary precedents for mapping at this scale. In Jorge Luis Borges’ short story On Exactitude in Science cartography became exactingly precise, producing a map that has the same scale as its territory. And, in Lewis Carroll’s Sylvie and Bruno Concluded, a German professor tells how map-makers experimented with the use of ever larger maps, until they finally produced a map of the scale of 1:1. ‘It has never been spread out, yet’, said the professor. ‘The farmers objected: they said it would cover the whole country, and shut out the sunlight!’ In this case, the gallery wall is covered, shut off from light and eyes. Although 1:1 is an impassive engagement with the rule-based activity of cartography, it simultaneously performs an affective act of display.

Microsoft Cognitive Services

Microsoft Cognitive Services (sounds like something from a Philip K. Dick novel) have opened up APIs, which you can call on (req. subscription), to outsource forms of machine learning. So, if you want to identify faces in pictures or videos you can call on the “Face API”, for example. Obviously, this is all old news… but, it’s sort of interesting to maybe think about how this foregrounds the homogenisation of process – the apparent ‘power’ of these particular programmes (accessed via their APIs) may be their widespread use.

This might be of further interest when we consider things like the “Emotion API” through which (in line with many other forms of programmatic measure of the display or representation of ’emotion’ or ‘sentiment’) the programme scores a facial expression along several measures, listed in the free example as: “anger”, “contempt”, “disgust”, “fear”, “happiness”, “neutral”, “sadness”, “surprise”. For each image you’ll get a table of scores for each recognised face. Have a play – it’s beguiling, but of course it then perhaps prompts the sorts of questions lots of people have been asking about how ‘affect’ and emotions can get codified (e.g. Massumi) and the politics and ethics of the ‘algorithms’ and such like that do these things (e.g. Beer).
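To make the shape of these services concrete, here is a minimal sketch of what calling an Emotion-style endpoint and reading back the score table looks like. The endpoint URL and the exact response format are assumptions based on the v1.0 API of the time – check Microsoft’s current documentation before relying on any of the names here, as services and versions change.

```python
import json
import urllib.request

# Assumed v1.0-era endpoint; region and version may differ for your subscription.
EMOTION_URL = "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize"

def score_faces(image_url, subscription_key):
    """POST an image URL to the Emotion API.

    Returns the parsed JSON: a list of faces, each with a 'scores'
    table mapping emotion labels to values between 0 and 1.
    """
    req = urllib.request.Request(
        EMOTION_URL,
        data=json.dumps({"url": image_url}).encode("utf-8"),
        headers={
            "Ocp-Apim-Subscription-Key": subscription_key,
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def dominant_emotion(scores):
    """Convenience helper: pick the highest-scoring label from one face's table."""
    return max(scores, key=scores.get)

# The kind of score table the API returns for a single face:
sample = {"anger": 0.01, "contempt": 0.0, "disgust": 0.0, "fear": 0.0,
          "happiness": 0.92, "neutral": 0.05, "sadness": 0.01, "surprise": 0.01}
print(dominant_emotion(sample))  # → happiness
```

Note that `dominant_emotion` is my own convenience for reading the table – the API itself only hands back the raw per-face scores, which is precisely the codification of affect at issue here.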

I am probably late to all of this and seeing significance here because it’s relatively novel to me (not the tech itself but the ‘easy-to-use’ API structure), nevertheless it seems interesting, to me at least, that these forms of machine learning are being produced as mundane through being made abundant, as apparently straightforward tools. Maybe what I’m picking up on is that these APIs, the programmes they grant access to, are relatively transparent, whereas much of what various ‘algorithm studies’ folk look at is opaque.  Microsoft’s Cognitive Services make mundane what, to some, are very political technologies.


Reblog> Body & Society 22.4 – Special Issue on The New Biologies

via TCS.

Body & Society 22.4 – Out Now! Special Issue on The New Biologies

The December 2016 issue of Body & Society – 22 (4) – is now available.

This Special Issue, ‘The New Biologies: Epigenetics, the Microbiome and Immunities’, is edited by Lisa Blackman, and features articles on Antibiotic Resistance, Epigenetics & Obesity, Placental Biologies, Pandemics, and the MRSA Epidemic, among others.

The issue is available here: http://bod.sagepub.com/content/22/4.toc