Several videos have been widely shared showing soap dispensers and taps in public and restaurant toilets that appear to have been calibrated to work with lighter skin colour and consequently fail to work with darker skin. See below for a couple of example videos.
If you have ever had a problem grasping the importance of diversity in tech and its impact on society, watch this video pic.twitter.com/ZJ1Je1C4NW
— Chukwuemeka Afigbo (@nke_ise) August 16, 2017
Of course, there are (depressingly) all sorts of examples of technologies being calibrated to favour people who conform to a white racial appearance, from Kodak’s “Shirley” calibration cards, to Nikon’s “Did someone blink?” filter, to HP’s webcam face tracking software. There are unfortunately more examples, which I won’t list here, but suffice it to say this demonstrates an important aspect of artefactual and technological politics – things often carry the political assumptions of their designers. Even if this were an ‘innocent’ mistake, the result of a manufacturing error skewing the calibration, say, it demonstrates the sense in which there remains a politics to the artefact/technology in question, because the agency of the object remains skewed along lines of difference.
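To make the mechanism concrete, here is a minimal, purely illustrative sketch of how a single design-time threshold in an infrared proximity sensor can produce exactly this skewed agency. The reflectance figures and the threshold are hypothetical, chosen only to show the logic, not measured properties of any real dispenser.

```python
# Purely illustrative sketch: a simplified infrared proximity sensor that
# triggers only when enough emitted light is reflected back to the detector.
# All numbers here are hypothetical, not measured data.

TRIGGER_THRESHOLD = 0.5  # fixed at design time: the 'calibration' in question

def sensor_fires(skin_reflectance: float, distance_factor: float = 1.0) -> bool:
    """Return True if the reflected signal exceeds the fixed threshold.

    skin_reflectance: fraction of emitted IR reflected back (0.0 to 1.0).
    distance_factor: attenuation with distance (1.0 = hand close to sensor).
    """
    reflected_signal = skin_reflectance * distance_factor
    return reflected_signal >= TRIGGER_THRESHOLD

# Hypothetical reflectance values, for illustration only:
for label, reflectance in [("lighter skin", 0.7), ("darker skin", 0.4)]:
    print(label, "->", "dispenses soap" if sensor_fires(reflectance) else "nothing")
```

The point of the sketch is that a single constant, set (or mis-set) at design or manufacture, quietly encodes an assumption about whose hand the sensor will ‘see’.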
There are perhaps two sides to this politics, if we resurrect Langdon Winner’s (1980) well-known argument about artefactual politics and the discussion it provoked. First, like the well-known story (cited by Winner, gleaned from Caro) of Robert Moses’ New York bridges: “someone wills a specific social state, and then subtly transfers this vision into an artefact” (Joerges 1999: p. 412). This is what Joerges (1999) calls the design-led version of ‘artefacts-have-politics’, following Winner (I am not condoning Joerges’ rather narrow reading of Winner, just using it as a useful short-hand).
Second, following Winner, artefacts can have politics by virtue of the kinds of economic, political, social (and so on) systems upon which they are predicated. A development like the tap sensor, whether deliberate or mistaken, is facilitated, or at the least tolerated, by the kinds of standards used to govern the design, manufacture and sale or implementation of a given artefact/technology. So, a bridge that excludes particular groups of people by preventing their most likely means of travel, the bus, from passing under it, or a tap that only works with lighter skin colour, can pass into circulation, or socialisation perhaps, by virtue of normative and bureaucratic frameworks of governance.
In this sense, and again following Winner, we might think about the ways these outcomes transcend “the simple categories of ‘intended’ and ‘unintended’ altogether”. Rather, they represent “instances in which the very process of technical development is so thoroughly biased in a particular direction that it regularly produces results heralded as wonderful breakthroughs by some social interests and crushing setbacks by others” (Winner 1980: pp. 125-6).
So, even when such outcomes are considered the results of error, and especially when the mechanism for regulating those errors is taken to be ‘the market’ (with the expectation that if the thing doesn’t work it won’t sell, and the manufacturer will be forced to change it), the assumptions behind the rectification of the ‘error’ carry a politics too (perhaps in the sense of Weber’s loaded value judgements).
Third, and moving beyond Winner, there is what Woolgar (1991), in a critical response to Winner, calls the ‘contingent and contestable versions of the capacity of various technologies’, which might include the ‘manufacturing mistakes’ but would also include the videos themselves and their support or contestation through responses in other videos and in media coverage.
This analysis might become further complicated by widening our consideration of the ways in which contingencies render a given artefact/technology political.
Take, for example, an ‘Internet of Things’ device that might seem innocuous, such as a ‘smart thermostat’ that ‘learns’ when you use the heating and begins to schedule it automatically. There are immediate technical issues that might render such a device political: the strength of its security settings, for instance, and so whether it could be hacked, whether you as the ‘owner’ of the device would even know, and what you might be able to do in response.
Further, there are privacy issues if the ‘smart’ element is not actually embedded in the device but enabled through remote services ‘in the cloud’: do you know where your data is, how it is being used, and whether it identifies you? Further still, the device might appear to be a one-off expense but may actually require a further payment or subscription to work in the way you expected. For example, I bought an Amazon Kindle that had advertising as its ‘screen saver’ and had to pay an additional £10 to remove it.
Even further, it may be that even if the security, privacy and payment systems are all within the bounds of what one might consider politically or ethically acceptable, there remain political contingencies that exclude or disproportionately affect particular groups of people. The thermostat might only work with particular boilers, or may require a ‘smart’ meter, and so may only work with particular energy subscription plans. Such plans, even if they are no more expensive, might require good credit ratings or other pre-conditions to access, which are not immediately obvious. Likewise, the thermostat may not work with pre-payment metered systems, which necessarily disadvantages those without a choice (those renting, for example).
The thermostat may require a particular kind of smartphone to access its functionality, which in turn may require a particular kind of phone contract, which may itself require a credit rating, and so on. The manufacturer of the thermostat might cease to trade, or be bought out, and the ‘smart’ software ‘in the cloud’ may cease to function: you may therefore find yourself without a working thermostat. And if the thermostat was installed in a ‘vulnerable’ person’s home in order to enable remote monitoring by concerned family members, any of these failures might create anxiety and risk.
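As a hedged sketch of that last dependency: the endpoint URL, field name and fallback behaviour below are all hypothetical, but they illustrate how a device whose ‘intelligence’ lives in a vendor’s cloud degrades the moment that service disappears.

```python
# Illustrative sketch of a cloud-dependent 'smart' thermostat client.
# The API endpoint, JSON field and fallback value are hypothetical.
import json
import urllib.request

CLOUD_API = "https://api.example-thermostat.com/v1/schedule"  # hypothetical
FALLBACK_SETPOINT_C = 18.0  # what the device does if the cloud disappears

def get_setpoint() -> float:
    """Fetch the current target temperature from the vendor's cloud service.

    If the service is unreachable (e.g. the vendor has folded or been bought
    out and the API switched off), fall back to a fixed setpoint: the 'smart'
    scheduling simply ceases to exist, though the hardware is unchanged.
    """
    try:
        with urllib.request.urlopen(CLOUD_API, timeout=5) as response:
            return json.load(response)["setpoint_c"]
    except (OSError, KeyError, ValueError):
        return FALLBACK_SETPOINT_C

print(f"Target temperature: {get_setpoint()}°C")
```

Nothing in the physical device has changed, but its behaviour is now governed by decisions (and business fortunes) made elsewhere.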
As apparently individual, or discrete, artefacts/technologies become ever more entangled in sociotechnical systems of use (as Kline says), with their concomitant contingencies, the politics of these things has the potential to become more opaque.
So, all artefacts have politics, and the examples in this post might be considered useful, if troubling, contemporary examples for discussion in research projects and in the classroom (as well as, one might hope, in the committee rooms of regulators and parliaments).
P.S. I think this is now a chunk of a lecture rewritten for my “Geographies of Technology” module at Exeter, heh.