Robotopias: mapping utopian perspectives on new industrial technology

Purpose – This paper maps utopian theories of technological change. The focus is on debates surrounding emerging industrial technologies which contribute to making the relationship between humans and machines more symbiotic and entangled, such as robotics, automation and artificial intelligence. The aim is to provide a map to navigate complex debates on the potential for technology to be used for emancipatory purposes and to plot the grounds for tactical engagements.

Design/methodology/approach – The paper proposes two axes along which theories are mapped into a six-category typology. Axis one runs between the parameters humanist and assemblage. Humanists draw on the idea of a human essence of creative labour-power, and treat machines as an alienated and exploitative form of this essence. Assemblage theorists draw on posthumanism and poststructuralism, maintaining that humans always exist within assemblages which also contain non-human forces. Axis two contains the parameters utopian/optimist, tactical/processual and dystopian/pessimist, depending on the construed potential for using new technologies for empowering ends.

Findings – The growing social role of robots portends unknown, and maybe radical, changes, but there is no single human perspective from which this shift is conceived. Approaches cluster in six distinct sets, each with different paradigmatic assumptions.

Practical implications – Mapping the categories is useful pedagogically, and makes other political interventions possible, for example interventions between groups and social movements whose practice-based ontologies differ vastly.

Originality/value – Bringing different approaches into contact, and mapping differences in ways which make them more comparable, can help to identify the points of disagreement and the empirical or axiomatic grounds for these. It might facilitate the future identification of criteria to choose among the approaches.

Keywords: Robots, Cybernetics, Posthumanism, Humanism, Utopia
Paper type: Conceptual paper

© Rhiannon Firth and Andrew Robinson. Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode.

Rhiannon Firth presented a version of this paper at the 20th International Meeting of the Utopian Studies Society of Europe at Monash University, Prato, on 2nd July 2019 and is grateful to attendees for their questions and feedback. Funding: Rhiannon Firth would like to acknowledge the support of the Engineering and Physical Sciences Research Council (EPSRC) [grant number EP/R021031/1]. Co-author Andrew Robinson is independent.

Introduction

Capitalism is punctuated by crises, most recently the 2008 financial crisis. Modernisation discourses and practices, portending a historical recomposition of capital in labour-saving/enhancing technologies and environmental techno-capitalism, abound in scientific and policy literatures and mainstream culture under the terminology of the "fourth industrial revolution" (Schwab, 2017).
The technical emphasis in this "new" paradigm appears to be on automation technologies, for example robots encompassing cognitive functions (e.g. natural language processing), artificial intelligence, machine learning, algorithms mimicking management functions (Wang et al., 2017) and even social and emotional labour (Breazeal, 2002; Kerruish, 2016). However, some aspects emphasise communication and digital connectivity between humans, technology and nature through big data, wearable devices, "smart cities", "smart factories" and "the Internet of things", and human augmentation through nano- and bio-technology (Zanella et al., 2014). There is apparent conflict between desires to replicate or replace human labour through technology, and desires to render the relationship between humans, machines and nature more symbiotic through hybrid connections (Romero et al., 2016).

Academic social theories address shifts in the sociotechnical. They mobilise empirical claims (based on observation/analysis of reality) and axiomatic claims (based on normative assumptions about the nature of reality). We contend that divides in these literatures are often cast as hegemonic ontologies or two-way splits, binarising difference into opposition. For example, Marxists have accused critical posthumanists, who seek to empirically understand entangled human-technology relationships, of normative complicity in reproducing their specific forms under capitalism (Rikowski, 2003, pp. 121–123), whilst posthumanists have bundled Marxists together with (neo)liberals, proclaiming their mutual complicity in the modernist project of patriarchal essentialism and human exceptionalism, despite incompatible views of what constitutes the human (Braidotti, 2013, p. 19). This divisiveness reflects polarisation in mainstream culture and radical social movements, and oversimplifies discussions of accelerating capitalist sociotechnical change and ecological destruction (Dale, 2019).

This article crafts a six-cluster typology of perspectives on shifting human-technology relationships along two axes. Along the first axis, we complexify the debate by attention to utopianism, understood as encompassing investments and articulations of affect and desire in the present and/or future intentionality (Garforth, 2009). Optimists invest new technologies with utopian, miraculous or revolutionary potential, with dangers or costs seen as manageable. Pessimists invest them with fears of a dystopian future, portraying a trend towards greater control, alienation, ecocide and other unwanted outcomes. Between these are strategic or tactical authors, who emphasise the socioeconomic systems or assemblages within which technologies are deployed, rendering them wonderful or harmful. The second axis divides humanist and assemblage theories, distinguished by the ontological primacy attached to humans, or else to the assemblages or relations within which actors are situated. We take "humanist" to encompass a variety of positions, from belief in an essential human nature or special human creative power, to teleological ideas of a human calling. Assemblage theories are associated with posthumanists, transhumanists and others who critique human essence and view "technology as a trait of the human outfit" (Ferrando, 2013, p. 28). Humans are embedded in wider assemblages containing nonhuman components (e.g. machines), and ideas of "Man", "the human" or "the individual" are results of contingent assemblages.
Assemblage theorists may still judge assemblages in normative and imaginary terms, in relation to the affects or social effects they produce. The clusters do not express unified political positions, nor will we critique them from a singular position, aside from noting our anti-authoritarianism. All clusters can produce authoritarian theories, but some are more prone to this than others. Nor do we associate the clusters with particular historical periods, though they tend to be temporally clustered. Political and historical mappings are important (see Ferrando, 2013, 2014) but beyond our scope.

Our purpose is closer to Deleuze's "problem-field" (Deleuze, 1994). Each cluster is an abstract machine, with concepts as components, arranging percepts and sense in particular ways, making some things visible and thinkable, and obscuring others. When a problem-field establishes axioms, it tends to ignore or anathematise positions which deny the axiom (Deleuze, 1994, pp. 108–112, p. 268), but many of these are in principle empirical-type claims and are treated empirically in other clusters. Particular problems arise from a cluster which are not solved within its concepts; these lead to lines of interest, or what the theory is trying to do. It is possible (even desirable) to have attachments to more than one problem-field.

Mapping the categories is thus useful pedagogically, and makes other political interventions possible, for example the "insurgent training" that Nold associates with "ontological interventions" (Nold, 2020) between groups whose practice-based ontologies differ vastly. Policy-oriented research is dominated by humanist-optimists, whereas Science and Technology Studies (STS) is dominated by assemblage-optimists and assemblage-tacticians. Humanist-strategists are strong in the social sciences, while humanist-pessimist and assemblage-tactical approaches are common in technology-related activism. These different approaches often ignore or speak past one another, leading to a lack of interperspectival learning. Bringing different approaches into contact and mapping their differences can identify disagreements and the (empirical or axiomatic) grounds for these (see Figure 1).

Figure 1. Visual map of the perspectives

Humanist-optimist

Humanist-optimists invest emerging technologies with hopes for desired utopian futures. Most are enthusiastic about existing economic institutions, conflating these with technological progress. Writings are replete with metaphysical talk of "miraculous" changes (Weise et al., 2018), humans obtaining god-like powers, or the solution of issues like immortality (Kelly, 1994). Historically, they often emerge during periods of accelerated technological innovation, believing that currently fashionable sciences (e.g. cybernetics) index foundational levels of life and matter, as in Dennett's (2004) view that humans, animals and robots are basically similar. Since living creatures are machine-like, there are few ethical or practical barriers to AI, artificial life, biological manipulation or Human-Robot Interaction (HRI). Neither mechanising humans nor anthropomorphising machines is necessarily fallacious.
There is a long tradition of imagining machines as sources of unlimited wealth and/or ways around the messy relationship between capital and labour (Wendling, 2009, pp. 68–69; Dyer-Witheford, 1999, p. 3), ranging from Babbage in the nineteenth century, through Bell, Brzezinski, Drucker and Wiener in the postwar era, to the "Californian Ideology" (Barbrook and Cameron, 1995) of the 1980s-1990s. Opposition to technological change is dismissed as resistance to progress. Fundamental economic shifts make knowledge, innovation and information the main sources of wealth (Dyer-Witheford, 1999, pp. 23–26). Humanist-optimists downplay risks of unemployment and machinic enslavement, suggesting work will become more creative, cognitive and autonomous (Reich, 2000; Buterin, 2013; Licklider, 1990; Beer, 1959). Many humanist-optimists use an actor-tool model whereby technologies are basically neutral: negative consequences stem from human misuse. This is compatible with concerns about "technical developments with great possibilities for good and evil" (Wiener, 1948, p. 28). There is often substantial faith in unknown futures, or invocations for leaders to step up to realise their utopian aspects.

Within humanist-optimism, there is a division between the speculative utopias of transhumanism and the more mundane, problem-solving research of "policy relevant" humanist-optimists. Much (para-)academic research falls into the latter subset (for critical discussion see Plows and Reinsborough, 2011; Gupta et al., 2019). Robots, AIs and HRI provide human benefits including efficiency, reduced drudgery, and even inclusivity and sustainability. Dangers are largely technical, subject to techno- or edu-fixes within a neoliberal framework. So hostile AI is an issue of avoiding programming errors and human malice (Sotala and Yampolskiy, 2015), and for Kurzweil (2005, p. 420), capitalist markets provide optimal conditions for friendly AI.

At the more utopian end, transhumanists and extropians promote "the belief that we can, and should... overcome our biological limits by means of reason, science and technology", augmenting humans using new technologies (Ouroboros, 1999, p. 4). Transhumanists also "favour reason, progress, and values centered on [human] well being"; however, they see humanity as a "transitory stage" towards a "transhuman or posthuman condition" (More, 1994, p. 1). One branch focuses on the "Singularity": a point at which AI surpasses human intelligence. While this is recognised as posing existential dangers to humans, it is a means by which humanity transcends itself, or fulfils humanity's destiny on an evolutionary ladder. Posthumanists generally characterise transhumanism as liberal humanist, hubristic, dualistic, and "classist and technocentric" (Ferrando, 2013, p. 28; cf. Graham, 2004).

Kelly is an exemplary humanist-optimist, placing faith in a hivemind or technium: a holistic aggregate of humans, computers and nature (1994, p. 174; cf. Kelly, 2010). Complex systems (including markets and AIs) are alive, intelligent and smarter than humans (Kelly, 2010, pp. 233, 264, 289). Kelly adopts a cybernetic definition of life as a "computational function" (Kelly, 2010, p. 96) which can be engineered. Kelly also treats life as metaphysically important, immortal and omnipotent (e.g. Kelly, 2010, pp. 54, 203, 220, 392–394). Machinic life is the next stage in evolution (Kelly, 2010, p. 227). Complex systems will solve problems like climate change (Kelly, 2010, pp. 140, 371–374). Humans should "harness" complex systems and evolution through cybernetic control "to carry us where we can't go by ourselves" (Kelly, 2010, p. 233).
Complex systems preclude top-down control (e.g. state planning) (Kelly, 2010, p. 42), but not soft cybernetic control, e.g. manipulating inputs and selecting outputs (Kelly, 2010, pp. 105–106, 281–282). Individuals proliferate, are judged by their performance, and are selected out through "the destruction of the unfit" (Kelly, 2010, pp. 313–314). Although Kelly urges deference to complex systems, he remains humanist, assuming invariants of human nature, notably economic rationality (Kelly, 2010, p. 367) and culture (Kelly, 2010, p. 306). Human life has meaning as a vector in the self-expansion of life. We later explore posthumanist critiques that humanists are prone to essentialism and exceptionalism, excluding non-ideal persons (e.g. women). The combination of humanism, teleological optimism and deference to cybernetic control imbues this cluster with authoritarian tendencies.

Humanist-strategic

Humanist-strategists maintain that robots and machines can harm or benefit humans, depending on the socioeconomic assemblages they are embedded in. Most are Marxists or other leftists, and socioeconomic rather than technological determinists. Many embrace assemblage themes such as situatedness and human-machine interchange. However, human labour retains a special place as the source of creativity, progress or value. There is no fixed human nature, but humanity has an autopoietic power of labour/creation (Wendling, 2009, p. 140). Humanist-strategists seek to use (alienated, but not ontologically autonomous) machines for human-directed goals. Machines are congealed human labour or knowledge, employed as "fixed capital" owned by capitalists, only seeming like an autonomous force (Marx, 1973[1857-8], p. 692; Marx, 1990[1867], p. 508; Wendling, 2009, p. 67). Negative effects of automation within capitalism include unemployment, subordination of workers to machines, and a range of psychological and physical harms (e.g. Marx, 1990[1867], pp. 544–545). Positive potential effects include reduced drudgery, increased social wealth, and satisfaction of human needs. The potential of machines will be redeemed in communism (Marx, 1973[1857-8], p. 706). Marxists generally oppose humanist-pessimist "Luddism" as well as optimists' desocialised technophilia (e.g. Rikowski, 2003, pp. 159–160; Wark, 2004, §246).

Although humanist-strategists agree that technology is socially mediated and ambivalent in its effects, they disagree over which current technologies are reappropriable for strategic or postcapitalist use. Some technologies (e.g. the blockchain, filesharing) are evaluated positively. Technology ownership is also a recurring issue. Humanist-strategists are relatively optimist or pessimist. Optimistic advocates of accelerationism and Fully Automated Luxury Communism see the possibility of total automation and postcapitalism. Accelerating technological tendencies will destroy capitalism and produce socialism through full automation and a universal basic income (UBI) (Srnicek and Williams, 2015; Mason, 2015). This perspective embraces most emerging technologies, including cyborg augmentations, artificial life, biotechnology, automation and econometric modelling (Srnicek and Williams, 2015, pp. 82, 144; Reed, 2014, p. 529). Post-autonomists see more extensive changes in capitalism than other Marxists, e.g. the information society as a new type of exploitation of specifically cognitive labour (Dyer-Witheford, 1999, p. 94).
New technologies under capitalism are generally harmful, but contain progressive potential, as sources of abundance in a future liberated society and as means of recomposition and social struggle. For Berardi, "semiocapitalism" reduces individuals to fragments plugged into automatic systems (2016, pp. 214–218). Terranova (2004, pp. 100, 112–115, 118) sees cybernetic systems as systems of soft control (rather than dispersed networks), altering initial conditions and making Darwinian selections among outcomes, bringing distributed systems under control. People are fragmented into emotionally reactive units, then managed through aggregate statistical probability (Terranova, 2004, pp. 20, 123). Terranova encourages resistance within and against this field, including refusals and strategic uses (Terranova, 2004, p. 128).

Like other post-autonomists, Griziotti believes in a reciprocal, assemblage-like human-machine exchange (Griziotti, 2019, pp. 13, 144), but sees socioeconomic forces as primary (Griziotti, 2019, p. 15). New technologies aim to elide the machine-life boundary until the two are indistinguishable (Griziotti, 2019, pp. 107, 148). It is possible to self-organise the digital common, but in practice it is dominated by corporations (Griziotti, 2019, p. 180) and controlled by cybernetic automatisms containing the increased creative power of labour (Griziotti, 2019, p. 147). If capitalism continues, Griziotti's outlook is dystopian. Humans could lose control to a "digital Leviathan" of algorithms (Griziotti, 2019, pp. 44, 135), die out (Griziotti, 2019, p. 163), or be trapped in "a rampant technological neuro-totalitarianism" (Griziotti, 2019, p. 174). The current situation involves widespread performance anxiety, precarity, exploitation of free labour, bullying, surveillance, disposability, extreme competition, and mental distress and collapse (Griziotti, 2019, pp. 50–51, 104). However, these technologies also allow oppositional and postcapitalist uses. Peer production, gaming and virtual worlds, hacker communities, 3D printing and cryptocurrency are positive forces (Griziotti, 2019, pp. 80–81, 131, 148, 190). "The only way to continue the human story is the construction of the common" (Griziotti, 2019, p. 207).

Open Marxists are often more pessimistic, seeing technology as a means through which capital encloses people and enforces their reduction to abstract value. Kleiner (2016, pp. 63–68) theorises a strategic battle between systemic forces of enclosure (which impose work) and practices of escape, a battle which today focuses on intermediation in virtual networks. Some (e.g. Pitts and Dinerstein, 2017) favour DIY initiatives to create a concrete instead of abstract utopia. Cooperativists generally seek a lower-technology socialism focused on direct worker control and production management; some see platforms and the gig economy as an opportunity to expand cooperativism, which "has the potential to be wildly democratizing" (Sipp, 2016, p. 60; cf. Rushkoff, 2016, p. 33) despite current exploitative tendencies.

Humanist-pessimist

Humanist-pessimists see emerging technologies as threats to vital human values. Most are radical ecologists, with some romantic conservatives, anarchists and craft socialists also in this cluster. There is a longstanding critique, found in Heidegger (1977[1954]), Marcuse (1962[1941]) and Adorno (2002[1947]), that because science and technology are purely instrumental, they are corrosive of qualitative, subjective, immanent or expressive meaning.
In a technologically alienated world, people become disconnected from others and nature, dependent on technology and social hierarchies, and lose autonomy. Technology is addictive, making users dependent. Psychological welfare and life-goals are threatened, along with ecosystems which matter inherently and as bases for human survival. Humanist-pessimists generally advocate degrowth, human-scale communities, and engagement in meaningful life-activity and self-actualisation (Kallis, 2017; Chamberlin, 2009). Technology as such, or its malevolent subclass, is considered part of a general system or "megamachine" (Mumford, 1986, p. 321). Technologies are not simply tools. These accounts embed a strong technological determinism. Either technology and tools are radically differentiated (e.g. Zerzan, 1997; Gorrion, 2012), or technology is bisected into benign and malign forms. At some point, technology develops inhuman agency and harms humans' autonomy, or else expresses humanity's self-alienation.

Zerzan (1997, p. 1) views technology as essentially bad, and as underpinning hierarchies like gender and the division of labour. Computers are the latest stage in making people "dependent on the machine for everything" (Zerzan, 2012, p. 92), while AI and robotics will render humans unnecessary (Zerzan, 2012, p. 101). Virtual reality "takes representation to new levels of self-enclosure and self-domestication" (Zerzan, 2008, p. 3). For Perlman (1983, p. 46), civilisation is a force of death, reducing "human beings to things". For Winner, the benefits of technology come at the cost of vulnerability to high-cost disruptions, and the likely unbearable resultant policing demands (Winner, 1986, p. 319). Kingsnorth (2015) distinguishes between addictive technologies, which require the entire industrial economy to function, and simple tools, which do not.

Other humanist-pessimists, like Illich (1973), Mumford (1986[1966]) and Ellul (1965[1954]), distinguish between benevolent and harmful technologies. Illich divides tools into convivial (aiding individual autonomy, social conviviality and ecology) and industrial (manipulative or dependency-forming) (Illich, 1973, pp. 12–14). Illich believes humans have an "inalienable nature" (Illich, 1973, p. 97) which is flexible only "within bounds" (Illich, 1973, p. 46). Industrial society subordinates humans to technological logics, and may threaten human life (Illich, 1973, p. 47). Convivial tools allow user autonomy, and can be used or avoided at will, for different purposes by different users. Most low-technology tools are convivial (Illich, 1973, p. 22). Most high-tech tools (e.g. cars and hospitals) are industrial. However, some higher-tech tools, such as telephones and bicycles, pass Illich's test (Illich, 1973, pp. 22, 64, 79). Referencing robotics, Illich depicts even simple machines as "energy slaves" (Illich, 1973, p. 14) dangerously substituting for or supplementing human energy inputs, introducing unequal power (Illich, 1973, p. 26; Illich, 1974). Machines stem from an earlier desire for a "laboratory-made homunculus [that] could do our labor instead of slaves" (Illich, 1973, p. 20). This fails to overcome the master-slave relation (Illich, 1973, p. 20). Humans must then be educated to work alongside homunculi, and are thus subordinated to tools (Illich, 1973, p. 30).
Illich's followers provide criteria and typologies for convivial technology (Kostakis et al., 2016; Prieur, 2011; Gordon, 2009), generally focused on avoiding ecological harms, encouraging user autonomy and egalitarian and participatory societies, and providing "meaningful" work. Voinea (2018, p. 76) provides criteria of flexibility, transparency, simplicity/usability, sharedness, creativity and sociality. This raises questions about whether advanced, anthropomorphised technologies (e.g. robotics) can be convivial, with some arguing that robots are likely to become dependency-forming and to substitute for human skills in a wider "attack on autonomy" (Anon, 2018, p. 10). Humanist-pessimists rely on many of the same empirical-type claims as humanist-optimists, for example the idea that the "technium" has grown beyond human control. However, the affective investments and normative imperatives are opposed: it is "rearranging itself round us now like a prison" (Kingsnorth, 2015, p. 39), threatening "the abolition of human nature" (Illich, 1973, p. 41). "Human nature" here is axiomatic: Harvey (2015) argues that humans are becoming robotised through media exposure, breaking down complex selves and capacities like concentration and empathy.

Assemblage-optimist

While humanist-optimists value technological development as empowering or augmenting humans, assemblage-optimists value becoming-other and the decentring of supposedly immanent binaries in technological/cybernetic assemblages. Assemblage-optimists tend to scorn modernity, within which "humanism", "the human", "reason" and "the subject" are necessarily enmeshed. Oppressions and violence are rooted in presubjective linguistic binaries which elevate humanity over Derridean Others. A Derridean view of technology as a subordinate/supplementary term in a co-constituted binary with the "human" posits technology as the underprivileged term, used to disrupt the privileged "human". In posthumanism, "[t]he cyborg, the monster, the animal... are... emancipated from the category of pejorative difference" (Braidotti, 2011, p. 68). Optimistic posthumanists celebrate emergent technologies' potential to subvert dominant binaries, and thereby undermine social oppression.

Posthumanists reject "humanism" or "Man"; they are post-anthropocentric and post-dualist (Ferrando, 2013, p. 27). They reconceive humans as entangled with, situated in, and co-constituted by assemblages encompassing nonhuman elements. Posthumanists hold a common belief in "technogenesis" (Ferrando, 2014, p. 28): humans relate to technology as animals to habitat (Braidotti, 2011, p. 62), and are always-already co-evolved with technology (Graham, 2004, p. 25). Thus human-technological hybridity should not be used instrumentally or feared as threatening (e.g. Graham, 2004, p. 27). Posthumanist-optimists often situate themselves as a golden mean between euphoric transhumanists and technophobes (Braidotti, 2011, p. 55; Graham, 2004, pp. 16–17; Ferrando, 2013, p. 28): a Derridean strategy of seeking ambivalence while recognising the interdependence of terms.

New materialism is sometimes treated as a posthumanist subtype and sometimes as a distinct approach. It places greater emphasis on embodiment (rather than textual constructivism), generally conceiving matter as a process of "materialization" (Ferrando, 2013, p. 30), or becoming within a monistic-holistic field.
New materialism encompasses authors like Braidotti; Karen Barad (2003), who argues that reality is produced through a necessary splitting of the holistic field by observers; and Jane Bennett (2010), who believes inorganic entities exhibit vital force and should be ethically valued as agents. Like posthumanists, new materialists place strong emphasis on human embeddedness in the world (Barad, 2007, p. 185). Metahumanism effectively rebrands posthumanism, sharing a focus on "an unquantifiable field of relational bodies", technogenesis, critique of humanism and binaries, and inbetweenness as subversion of modernity/capitalism (del Val and Sorgner, 2011).

Posthumanists value emerging technologies for their disruption of "the human". Humans are effects of assemblages: they become "posthumans" or "cyborgs" when placed in different assemblages (e.g. Haraway, 2016[1985]; Kaloski, 1997). Resistance to this is taken as a reactionary clinging to order and an attempt to keep collapsing binaries intact. Hence, for Carroll (2003), objections to eBooks are reactions to "a threat or disruption" to a "habituated behavior" of reading which is itself socially conditioned. Becoming posthuman may entail renouncing human agency, reducing it to a moment of reflection, or simply making it more situated. When becoming posthumans, we undergo changes perhaps including a more relational worldview, constant awareness of connectedness, an other-centric ethics of accountability, loss of fears of new technologies and social changes, and in some cases a passive rather than active stance towards the world.

Donna Haraway rejects boundaries among humans, animals and machines on an assemblage-theoretic basis. People are always-already cyborgs, enmeshed in "messy" interactions inside nonhuman systems (Haraway, 2016[2003], p. 181). While Haraway now renounces the label "posthumanist" (2015, p. 161), she still writes of people as cyborgs (Haraway, 2016). In her earlier work, she defined the human as an effect of binaries splitting "Man" from nature, resting on the "othering" of animals, robots, nature, women, etc. (Haraway, 2016[1985], pp. 28–30). Cybernetics leads outside such representational hierarchies. Monsters and cyborgs are valued for their transgression of binaries. Haraway's early work is driven by an ethicopolitical duty to cyborgise as part of a Manichean battle against "modernity". In her more recent work, "modernity" is taken to have collapsed in ecological crisis, the main task now being to recreate refuges for biodiversity and "make kin" (2015, p. 162).

Posthumanist-optimists generally endorse or assume cybernetics or its close relatives (e.g. complexity theory and systems theory). Braidotti insists on the "primacy of intelligent and self-organizing matter" over human agency (2019, p. 31). Machines subvert binaries, producing direct relations (2013, p. 57); cybernetics is post-representational (Braidotti, 2013, p. 59). Hayles embraces a vision of complexity as the recursive reapplication of simple rules to simple nodes (1999, p. 285). Humans are to relinquish control to the distributed cognition of automated systems (Hayles, 1999, p. 288). Wolfe seeks a "self-referential autopoiesis" in which the self-referential closure of individuals enables social determination and thus systemic complexity (Wolfe, 2010, p. xxi). Some posthumanists emphasise the distinctness of human embodiment, which is both like and unlike machines (Hayles, 1999, pp. 283–284; Wolfe, 2010, p. xxiii; Ferrando, 2013, p. 32).
However, their "body" tends to be a relational node in cybernetic networks, not an inner self or material substance. Most posthumanists supplement cybernetics with holistic, relational or Derridean ethics. For Braidotti, "affirmative ethics" makes the difference (2019, p. 41) between "good" cybernetic relationality and "bad" capitalist accelerationism. For Graham (2004, p. 10), "choices and values" are central to how (not whether) to use new technologies, from a position "always already immersed in the material conditions of our own creations" (2004, p. 27). The shift from modernity to cybernetics has already happened: we must craft an appropriate ethics, politics and subjectivity for the new reality (Braidotti, 2011, p. 69, 2019, pp. 34, 49; Ferrando, 2013, p. 32). To the extent that ethicisation is transformation, there is always a tactical/strategic element to posthumanism, and posthumanists are typically ethically committed scholars. However, they rarely allow selection among technologies, and are optimistic about the effects of (properly ethically supplemented) technological change as such. Some endorse capitalism, claiming it is impossible and undesirable to leave (Rossini and Toggweiler, 2017, p. 6). Others are anti-capitalist, while endorsing related modes of cybernetic power (Braidotti, 2019, pp. 40–41; Haraway, 2016; Ferrando, 2013). Fantasies of technology overcoming finitude, embodiment or death are generally denounced (Braidotti, 2013, p. 60; Hayles, 1999, p. 5; Wolfe, 2010, p. xv; Graham, 2004, p. 24).

An exception is Land (2011[1993]), who values cybernetic systems because they slip out of human control. Cybernetics began as a control project, but has spun out of control (Plant and Land, 2014, p. 305). Today's political landscape therefore pits viral machinic forces against the "phobic resistance" and immunopolitics of reactionary humans (Land, 2014, p. 256). As viral forces triumph, "Humanity recedes like a loathsome dream" (Land, 2014, p. 261). People become hyper-diverse, fragmentary individuals, exceeding human limits and crossing identity-boundaries in a kind of cyberpunk utopia (Land, 2011[1997], p. 456).

Critics object that even ethicised posthumanist-optimism eliminates human agency (Fuller, 2000, p. 26; Nold, 2020). Similarly to humanist-optimists, assemblage-optimists (including some posthumanists) give technology agency in itself, channelling their desires through it. However, some posthumanists take a more critical and political stance, as outlined below.

Assemblage-tactical

Assemblage-tactical approaches use assemblage models but seek to maintain human agency. Most term themselves tactical rather than strategic, aligning with de Certeau's (1988[1980]) association of strategy with rigid hierarchies and tactics with micropolitical everyday resistance. Tactical media is based on de Certeau's theory and involves bricolage and detournement of technologies outside their usual assemblages. It refers to bottom-up, flexible, hybrid and provisional approaches (Garcia and Lovink, 2008). Assemblages are assessed in terms of how far they provide joyous experiences, empower individuals and collectives, equalise power, produce cooperation, and so on. Technology is not neutral, but its impact depends on the assemblage it is part of. So some technologies can be reclaimed, hacked and repurposed, track-jumping into more emancipatory assemblages. Selection distinguishes assemblage-tacticians from assemblage-optimists.
They usually prefer an active "hacker" over a passive "consumer" role, and emphasise participation (Richardson, 2003, pp. 347–350; Westerkamp, 2003, p. 261). Poststructuralism underpins assemblage-tactical positions. For Deleuze and Foucault, "there is no need to uphold man in order to resist" (Deleuze, 1988, p. 92). Everything is part of assemblages, but resistance is possible based on desires, lines of flight, or forces of life entrapped within assemblages. Certain non-humanist machines are preferable (schizorevolutionary or active power for Deleuze; aesthetic self-constitution and self-care for Foucault). "The question concerns the forces that make up man: with what other forces do they combine, and what is the compound that emerges?" (Deleuze, 1988, p. 73). Cybernetics is treated as a disempowering economic machine fragmenting and recomposing labour (Deleuze, 1988, p. 131) or as channelling flows (Deleuze and Guattari, 1987, pp. 510–512), sometimes leading to machinic enslavement (see below). DeLanda (1991, p. 3) writes as a "robot historian", focusing on machinic agency. He conceives the human-robot relation as symbiotic, with machines using humans for propagation. Preciado both opposes disciplinary control and manipulation of techno-bodies, and celebrates gender-bending, experimentation and the "molecular revolution" generated by the current "pharmacopornographic biocapitalism" (Preciado, 2013[2008], pp. 325, 166).

"Contestational robotics" applies tactical media theory to robot design and HRI, encouraging tinkerers to create simple robots or drones for purposes such as infiltrating securitised sterile zones inaccessible to human protesters, and spreading pamphlets. Its proponents call for "continuous development of tactics to reestablish a means of expression and a space of temporary autonomy within the... social" in the context of "advanced surveillance capabilities" (Critical Art Ensemble and Institute for Applied Autonomy, 2001, p. 115). Hacker accounts suggest empowering affects within certain technosocial assemblages, relative to everyday life. The Hacker Manifesto (The Mentor, 1986) depicts everyday life as a hellhole of status-competition, apathy, authoritarianism and sadism, with computers freeing the hacker. Hacker culture is often portrayed as a subversive counterculture involving a gift economy and an ethos of sharing, non-instrumentality and information freedom (Levy, 1984; Stallman, 2015; Reimens, 2002; Dasgupta, 2003, pp. 335–356), though some suggest it has been recuperated in neoliberal cyberculture (Turner, 2006). Pirate radio, meshnets, FLOSS, anonymity software, hacklabs and hacktivism fit into this model.

STS scholars today are influenced by poststructuralism, and use assemblage approaches bridging the optimist and tactical clusters. Examples include the Anglo-Foucauldian school, Actor Network Theory (ANT) and Object-Oriented Ontology (OOO). ANT explores relations in a network, forgoing explanation (Law, 2004, p. 157; Latour, 2005). It sees knowledge-production and practice as world-changing interventions, and rejects strong social constructivism, technological determinism and essentialism (Law, 1991, p. 8; Nold, 2017, p. 24). Robots, like humans, are "actants". ANT's flat ontology tends towards assemblage-optimism and cybernetic soft power because of its difficulty motivating choices among assemblages, its embrace of manipulative power, and its de-emphasising of human agency (Harman, 2018, pp. 136–139).
However, ANT is critical of rendering the conditions of scientific/technological production invisible ("blackboxing") (Latour, 1999, p. 314). De-blackboxing technology is consistent with assemblage-tactical ideas of hacking and power within networks. OOO (Harman, 2018) emphasises human humility before objects, particularly "hyperobjects" which are too big to know or control (Morton, 2013). While Harman's politics is reformist or quietistic, Bryant's discussion of eruptive rogue objects, including new technologies (Bryant, 2012), is more clearly assemblage-strategic. Anglo-Foucauldians like Rose et al. both argue for an assemblage ontology (2016, pp. 1, 3, 9) and call for humans to be "at the centre" of new technologies, augmented rather than replaced (Rose et al., 2016, p. 3).

There is also a group of critical/radical posthumanists in the assemblage-tactical cluster, combining posthumanist, new materialist and ANT ideas with radical politics, embracing criticisms of mainstream posthumanism while salvaging the name and ontology, and endorsing selection among technologies and assemblages. Nold criticises authors like Latour and Bennett, arguing that some objects should be excluded from social collectives, e.g. nuclear reactors and surveillance systems (Nold, 2020). He calls for a "pragmatic coalition of human and nonhuman agencies" making targeted interventions (Rose et al., 2016). Cudworth and Hobden endorse posthumanism in the sense of assemblage enmeshment and antihumanism (Cudworth and Hobden, 2018, pp. 5, 8), an ethics of responsibility, rejection of purity (Rose et al., 2016, p. 156), and an affinity for complexity theory (Rose et al., 2016, pp. 13–15). However, they criticise its emphasis on entanglement rather than power (Rose et al., 2016, p. 14). They posit selection as an ethical imperative to equalise and reduce harm. This involves lower-impact living, vegetarianism, everyday conviviality, anti-capitalism and bottom-up community (Rose et al., 2016, pp. 137, 146–147, 151–152). Anarchists using cybernetics similarly try to unpack its decentralising and self-organising aspects from the focus on control (Swann, 2018; Beer, 1959; Duda, 2013; Goodman, 2010).

Assemblage-tacticians are more prepared to denounce cybernetic power than assemblage-optimists. Can robots be used within empowering assemblages? Contestational robotics suggests they can, as demonstrated by the robotic distribution of illegal abortion pills at a recent Irish protest (Press Association, 2018). Assemblage-tacticians face similar issues to humanist-pessimists over which technologies are tactically progressive. By their implied criteria, a good use disrupts dominant systems, empowers disempowered people, and feels empowering. One problem in HRI is how to tactically "use" strong AI without being used back, in the manner Galloway (2004) suggests already happens with ostensibly empowering protocols.

Assemblage-pessimist

Assemblage-pessimists are a smaller cluster of poststructuralists and anarchists who do not object to assemblages in principle, but believe present dominant assemblages are broadly disempowering, alienating, immiserating and fragmenting. Assemblage-pessimists worry about elite control of new technologies, heavy military and surveillance use, the emergence of alien machinic perspectives, and a cluster of social, ecological, psychological and physical harms. Assemblage-pessimists focus on "machinic enslavement": humans become cogs within social machines, or subordinate nodes in computer networks.
Hence, "human beings... are constituent pieces of a machine that they compose among themselves and with other things (animals, tools), under the control and direction of a higher unity" (Deleuze and Guattari, 1987, pp. 456–457). Flows of desire are presubjective and not distinctly human. Social assemblages are assessed in terms of whether they subordinate, or are subordinate to, flows of desiring-production (Deleuze and Guattari, 2004, p. 8). The "higher unity" distinguishes enslaving systems from free systems. Machinic enslavement "makes desubjectivized flows and fragments" and then "turns those subjects into component parts of machines (slave units in the cybernetic sense)" (Wark, 2017, p. 51). Personal aspects of life are thus rendered irrelevant. "Whether you are happy, whether you stutter, whether you are afraid of death or of old age – all this counts for nothing [...] On the contrary, it inconveniences. It makes too much 'noise' in the sense of information theory" (Guattari, 1996, p. 137).

Galloway (2004, 2012) understands the Internet through Deleuze's control society theory, with interfaces operating similarly to control. Media are metonymical rather than indexical, with objects reduced to classes, and thereby blackboxed (Galloway, 2012, p. 9). Objects are manipulated to form a world (Galloway, 2012, p. 23). Baudrillard (1994, p. 81) sees the cybernetic order, the "code", as seeking "absolute control". Reality is fragmented into "simple elements" rearranged into binary oppositions and segmented performances, on the model of surveys (Galloway, 2012, pp. 83–84). Feedback systems are closed and tautological, lacking affective force, meaning and reversibility (Galloway, 2012, p. 78), causing human suffering, and taking the denial of symbolic exchange to its implosive conclusion.

Cybernetics is seen as a conservative system, designed to suppress chaotic flows or capture data. The Invisible Committee see cybernetics as a conservative strategy to "impede the spontaneously entropic, chaotic movement of the world" (Invisible Committee, 2014, p. 3). Gorrion (2012) sees apparatuses as domesticating and containing underlying ontological chaos, through mechanisms of capture and machinic enslavement. Apparatuses are used strategically by human controllers, but tend to condition strategisers as well as users, making humans increasingly robot-like. Virilio sees cybernetic systems as a stage in the emergence of a "vision machine" with origins in military control. The machinic gaze is privileged over the human gaze, and machines start to "see in our place" (Virilio, 1994, p. 64). Machinic vision is continuous with modern reason (Virilio, 1994, p. 70), inaccessible to humans, exercises total control, and disrupts other important assemblages (1990, pp. 68–69). Virilio instead values human-scale, civilian social relations and ecological systems (Virilio, 1986, pp. 117–118, 1990, p. 13, 2000, pp. 14–15).

Conclusion

The growing social role of robots and human-technology symbiosis portends unknown, potentially radical, changes, but there is no single human perspective on this shift. Approaches largely cluster in six distinct sets, each with different paradigmatic assumptions. It is important to develop ways to test the claims made by different approaches, instead of remaining within group-specific tautologies, as well as to disembed the axioms masquerading as facts which do much of the work in several clusters.
The purpose of this typology has been to elucidate a diversity of problem-fields, to explore paradigmatic assumptions and explanatory strength, and to signal some of the philosophical and political work these perspectives can do. Humanists are prone to essentialism and human exceptionalism, which can exclude non-ideal humans, non-human beings and nature. Assemblage theories are prone to relativism. In combination with utopian optimism or dystopian pessimism, perspectives can suggest deterministic or nihilistic attitudes to emerging technologies. The strategic and tactical clusters open up more possibilities for political agency. There is also an element of futurology in mapping the field, because epistemologies prefigure technology creation. It seems likely that we are approaching a new reality in which old categories fail, as in an experiment described by Kuhn (1962, pp. 62–64; Bruner and Postman, 1949), where viewers briefly shown playing cards depicting black hearts could not discern them, instead labelling them red hearts or black clubs. Only with familiarity and slower exposure were the black hearts visible. Human-robot relations may be the black hearts of our time, with traits from different models (the benevolent tools of humanist-optimists, the dangerous inhuman systems of humanist-pessimists, the congealed labour of humanist-strategists, and the extimate others of assemblage theory) combined in as yet unknowable ways.

References

Adorno, T. (2002[1947]), Dialectic of Enlightenment: Philosophical Fragments, Stanford University Press, Stanford, CA.
Anon. (2018), "And all the world shall become Google: Google's digital attack and its consequences", Shitstorm: Anarchist Zeitung, Vol. 2, available at: https://theanarchistlibrary.org/library/anonymous-and-the-world-shall-become-google (accessed 6 September 2019).
Barad, K. (2003), "Posthumanist performativity: toward an understanding of how matter comes to matter", Signs, Vol. 28 No. 3, pp. 801-831.
Barad, K. (2007), Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning, Duke University Press, Durham, NC.
Barbrook, R. and Cameron, A. (1995), "The Californian ideology", Science as Culture, Vol. 6 No. 1, pp. 44-72.
Baudrillard, J. (1994[1981]), Simulacra and Simulation, University of Michigan Press, Ann Arbor, MI.
Beer, S. (1959), Cybernetics and Management, English Universities Press, London.
Bennett, J. (2010), Vibrant Matter: A Political Ecology of Things, Duke University Press, Durham, NC.
Berardi, F. (2016), Cognitarians and Semiocapital, Maska, Ljubljana.
Braidotti, R. (2011), "Meta(l)morphoses: women, aliens and machines", in Nomadic Theory: The Portable Rosi Braidotti, Columbia University Press, New York, pp. 55-80.
Braidotti, R. (2013), The Posthuman, Polity, Cambridge.
Braidotti, R. (2019), "A theoretical framework for the critical posthumanities", Theory, Culture & Society, Vol. 36 No. 6, pp. 31-61.
Breazeal, C. (2002), Designing Sociable Robots, MIT Press, Cambridge, MA.
Bruner, J.S. and Postman, L. (1949), "On the perception of incongruity: a paradigm", Journal of Personality, Vol. 18, pp. 206-223.
Bryant, L. (2012), "Five types of objects: gravity and onto-cartography", Larval Subjects, available at: https://larvalsubjects.wordpress.com/2012/06/17/five-types-of-objects-gravity-and-onto-cartography/ (accessed 6 September 2019).
Buterin, V. (2013), "Bootstrapping an autonomous decentralized corporation, Part 2: interacting with the world", Bitcoin Magazine, Vol. 21, available at: https://bitcoinmagazine.com/articles/bootstrapping-an-autonomous-decentralized-corporation-part-2-interacting-with-the-world-1379808279 (accessed 6 September 2019).
Carroll, L. (2003), "Reading technology: curling up with a good information appliance", in Sarai Collective (Eds.), Sarai Reader 03: Shaping Technologies, CSDS/Sarai, Delhi, pp. 205-214.
Chamberlin, S. (2009), The Transition Timeline: For a Local, Resilient Future, Chelsea Green, Hartford, VT.
Critical Art Ensemble and Institute for Applied Autonomy (2001), "Contestational robotics", in Critical Art Ensemble (Eds.), Digital Resistance: Explorations in Tactical Media, Autonomedia, New York, pp. 115-134.
Cudworth, E. and Hobden, S. (2018), The Emancipatory Project of Posthumanism, Routledge, London.
Dale, G. (2019), "Degrowth and the green new deal", Ecologist, 28 October, available at: https://theecologist.org/2019/oct/28/degrowth-and-green-new-deal (accessed 30 October 2019).
Dasgupta, R. (2003), "Beyond the apocalypse: an unfinished meditation on ethics", in Sarai Collective (Eds.), Sarai Reader 03: Shaping Technologies, CSDS/Sarai, Delhi, pp. 236-242.
de Certeau, M. (1988[1980]), The Practice of Everyday Life, University of California Press, Berkeley, CA.
del Val, J. and Sorgner, S.L. (2011), "A metahumanist manifesto", available at: https://metabody.eu/metahumanism/ (accessed 20 December 2019).
DeLanda, M. (1991), War in the Age of Intelligent Machines, Swerve, New York.
Deleuze, G. and Guattari, F. (1987), A Thousand Plateaus, Continuum, London.
Deleuze, G. and Guattari, F. (2004), Anti-Oedipus, Continuum, London.
Deleuze, G. (1988), Foucault, Continuum, London.
Deleuze, G. (1994), Difference and Repetition, Athlone, London.
Dennett, D. (2004), The Atheism Tapes, part 6, BBC TV documentary presented by Jonathan Miller, produced by Richard Denton, recorded 2003, broadcast 2004, available at: https://www.youtube.com/watch?v=fvG-q7VrFPg (accessed 6 August 2019).
Duda, J. (2013), "Cybernetics, anarchism and self-organisation", Anarchist Studies, Vol. 21 No. 1, pp. 52-72.
Dyer-Witheford, N. (1999), Cyber-Marx: Cycles and Circuits of Struggle in High-Technology Capitalism, University of Illinois Press, Champaign, IL.
Ellul, J. (1965[1954]), The Technological Society, Knopf, New York.
Ferrando, F. (2013), "Posthumanism, transhumanism, antihumanism, metahumanism and new materialisms: differences and relations", Existenz, Vol. 8 No. 2, pp. 26-32.
Ferrando, F. (2014), "Is the post-human a post-woman? Cyborgs, robots, artificial intelligence and the futures of gender: a case study", European Journal of Futures Research, Vol. 43, pp. 1-17.
Fuller, S. (2000), "Why science studies has never been critical of science: some recent lessons on how to be a helpful nuisance and a harmless radical", Philosophy of the Social Sciences, Vol. 30 No. 1, pp. 5-32.
Galloway, A.R. (2004), Protocol: How Control Exists after Decentralization, MIT Press, Cambridge, MA.
Galloway, A.R. (2012), The Interface Effect, Polity, Cambridge.
Garcia, D. and Lovink, G. (2008), "The ABC of tactical media", available at: http://www.tacticalmediafiles.net/articles/3160 (accessed 6 August 2019).
Garforth, L. (2009), "No intentions? Utopian theory after the future", Journal for Cultural Research, Vol. 13 No. 1, pp. 5-27.
Goodman, P. (2010), New Reformation: Notes of a Neolithic Conservative, PM Press, Oakland, CA.
Gordon, U. (2009), "Anarchism and the politics of technology", Working USA: Journal of Labor and Society, Vol. 12 No. 3, pp. 489-503.
Gorrion, A. (2012), "Robots of repression", Mute, 27 March, available at: https://www.metamute.org/community/your-posts/robots-repression (accessed 16 April 2020).
Graham, E. (2004), "Post/human conditions", Theology & Sexuality, Vol. 10 No. 2, pp. 10-32.
Griziotti, G. (2019), Neurocapitalism, Minor Compositions, New York.
Guattari, F. (1996), The Guattari Reader, Blackwell, Oxford.
Gupta, P., Chauhan, S. and Jaiswal, M.P. (2019), "Classification of smart city research: a descriptive literature review and future research agenda", Information Systems Frontiers, Vol. 21 No. 3, pp. 661-685.
Haraway, D. (2015), "Anthropocene, capitalocene, plantationocene, chthulucene: making kin", Environmental Humanities, Vol. 6, pp. 159-165.
Haraway, D. (2016[1985]), "The cyborg manifesto: science, technology and socialist feminism in the late twentieth century", in Manifestly Haraway, University of Minnesota Press, Minneapolis, MN, pp. 3-90.
Haraway, D. (2016[2003]), "The companion species manifesto: dogs, people and significant otherness", in Manifestly Haraway, University of Minnesota Press, Minneapolis, MN, pp. 91-198.
Haraway, D. (2016), Staying with the Trouble: Making Kin in the Chthulucene, Duke University Press, Durham, NC.
Harman, G. (2018), Object-Oriented Ontology: A New Theory of Everything, Pelican, Harmondsworth.
Harvey, C. (2015), "Sex robots and solipsism", Philosophy in the Contemporary World, Vol. 22 No. 2, pp. 80-93.
Hayles, N.K. (1999), How We Became Posthuman: Virtual Bodies in Cybernetics, Literature and Informatics, University of Chicago Press, Chicago, IL.
Heidegger, M. (1977[1954]), The Question Concerning Technology and Other Writings, Garland, New York.
Illich, I. (1973), Tools for Conviviality, Harper and Row, New York.
Illich, I. (1974), Energy and Equity, Harper and Row, New York.
Invisible Committee (2014), "Fuck off, Google", in To Our Friends, Semiotext(e), Cambridge, MA, pp. 99-130.
Kallis, G. (2017), Degrowth, Agenda, Newcastle.
Kaloski, A. (1997), "Bisexuals making out with cyborgs: politics, pleasure, con/fusion", International Journal of Sexuality and Gender Studies, Vol. 2 No. 1, pp. 47-64.
Kelly, K. (1994), Out of Control: The New Biology of Machines, Social Systems and the Economic World, self-published, available at: https://kk.org/mt-files/books-mt/ooc-mf.pdf (accessed 6 October 2019).
Kelly, K. (2010), What Technology Wants, Viking, New York.
Kerruish, E. (2016), "Perception, imagination and affect in human-robot relationships", Cultural Studies Review, Vol. 22 No. 2, pp. 4-20.
Kingsnorth, P. (2015), "Planting trees in the anthropocene: a conspiracy theory", Dark Mountain, Vol. 8, pp. 30-42.
Kleiner, D. (2016), "Counterantidisintermediation", in Scholz, T. and Schneider, N. (Eds.), Ours to Hack and to Own: The Rise of Platform Cooperativism, A New Vision for the Future of Work and a Fairer Internet, OR Books, New York, pp. 63-68.
Kostakis, V., Latoufis, K., Liarokapis, M. and Bauwens, M. (2016), "The convergence of digital commons with local manufacturing from a degrowth perspective: two illustrative cases", Journal of Cleaner Production, pp. 1-10, available at: https://www.minasliarokapis.com/CleanerProduction2016_Kostakis_DigitalCommonsLocalManufacturing.pdf (accessed 6 August 2019).
Kuhn, T. (1962), The Structure of Scientific Revolutions, University of Chicago Press, Chicago, IL.
Kurzweil, R. (2005), The Singularity Is Near: When Humans Transcend Biology, Viking, New York.
Land, N. (2011[1993]), "Machinic desire", in Fanged Noumena: Collected Writings 1987-2007, Urbanomic, Falmouth, pp. 319-344.
Land, N. (2011[1997]), "Meltdown", in Fanged Noumena: Collected Writings 1987-2007, Urbanomic, Falmouth, pp. 441-459.
Land, N. (2014), "Circuitries", in Mackay, R. and Avanessian, A. (Eds.), #Accelerate: The Accelerationist Reader, Urbanomic, Falmouth, pp. 251-274.
Latour, B. (1999), Pandora's Hope: Essays on the Reality of Science Studies, Harvard University Press, Cambridge, MA.
Latour, B. (2005), Reassembling the Social: An Introduction to Actor-Network-Theory, Oxford University Press, New York.
Law, J. (1991), "Introduction", in Law, J. (Ed.), A Sociology of Monsters: Essays on Power, Technology, and Domination, Routledge, London, pp. 1-25.
Law, J. (2004), After Method: Mess in Social Science Research, Routledge, London.
Levy, S. (1984), Hackers: Heroes of the Computer Revolution, Nerraw Manijaime/Doubleday, New York.
Licklider, J.C.R. and Taylor, R.W. (1990), "The computer as a communication device", in In Memoriam: J.C.R. Licklider 1915-1990, Systems Research Center, Palo Alto, CA, available at: https://web.stanford.edu/dept/SUL/library/extra4/sloan/mousesite/Secondary/Licklider.pdf (accessed 6 August 2019).
Marcuse, H. (1962[1941]), "Some social implications of modern technology", in Arato, A. and Gebhardt, E. (Eds.), The Essential Frankfurt School Reader, Continuum, New York, pp. 138-162.
Marx, K. (1973[1857-8]), Grundrisse, Penguin, Harmondsworth.
Marx, K. (1990[1867]), Capital, Vol. 1, Penguin, Harmondsworth.
Mason, P. (2015), PostCapitalism: A Guide to Our Future, Allen Lane, Bristol.
Mentor, The (1986), "Hacker's manifesto: the conscience of a hacker", Phrack, Vol. 1 No. 7, available at: http://phrack.org/issues/7/3.html (accessed 6 August 2019).
More, M. (1994), "On becoming posthuman", available at: www.maxmore.com/becoming.htm.
Morton, T. (2013), Hyperobjects: Philosophy and Ecology after the End of the World, University of Minnesota Press, Minneapolis, MN.
Mumford, L. (1986[1966]), "The first megamachine", in Miller, D.L. (Ed.), The Lewis Mumford Reader, Random House, New York, pp. 315-323.
Nold, C. (2017), "Device studies of participatory sensing: ontological politics and design interventions", doctoral thesis, UCL (University College London), London.
Nold, C. (2020), "Insurrection training for post-human politics", International Journal of Sociology and Social Policy (in press).
Ouroboros (1999), "The transtopian principles", Version 2.2, 18 November, available at: http://members.wbs.net/homepages/c/r/y/cryogenic4life/index2.html (link appears dead).
Perlman, F. (1983), Against His-Story, Against Leviathan, Black & Red, Detroit, MI.
Pitts, H. and Dinerstein, A. (2017), "Corbynism's conveyor belt of ideas: postcapitalism and the politics of social reproduction", Capital and Class, Vol. 41, pp. 423-434.
Plant, S. and Land, N. (2014), "Cyberpositive", in Mackay, R. and Avanessian, A. (Eds.), #Accelerate: The Accelerationist Reader, Urbanomic, Falmouth, pp. 303-314.
Plows, A. and Reinsborough, M. (2011), "Encountering 'the politics of technology': public engagement from the bottom up", in Zulsdorf, T.B., Coenen, C., Ferrari, A., Fiedeler, U., Milburn, C. and Wienroth, M. (Eds.), Quantum Engagements, AKA Verlag, Heidelberg, pp. 91-107.
Preciado, P.B. (2013[2008]), Testo Junkie: Sex, Drugs, and Biopolitics in the Pharmacopornographic Era, SUNY Press, New York.
Press Association (2018), "Activists take 'abortion pills' during pro-choice rally in Belfast", The Guardian, 31 May, available at: https://www.theguardian.com/world/2018/may/31/pro-choice-activists-take-abortion-pills-belfast-protest (accessed 6 August 2019). Prieur, R. (2011), "TechJudge", available at: http://www.ranprieur.com/tech.html (accessed 6 August 2019). Reed, P. (2014), "Seven prescriptions for accelerationism", in Mackay, R. and Avanessian, A. (Eds.), #Accelerate: The Accelerationist Reader, Urbanomic, Falmouth, pp. 521-536. Reich, R. (2000), The Future of Success: Working and Living in the New Economy, Random House, New York. Reimens, P. (2002), "Some thoughts on the idea of 'hacker culture'", available at: http://cryptome.org/hacker-idea.htm (accessed 6 August 2019). Richardson, J. (2003), "The language of tactical media", in Sarai Collective (Eds.), Sarai Reader 03: Shaping Technologies, CSDS/Sarai, Delhi, pp. 346-351. Rikowski, G. (2003), "Alien life: Marx and the future of the human", Historical Materialism, Vol. 11 No. 2, pp. 121-164. Romero, D., Stahre, J., Wuest, T., Noran, O., Bernus, P., Fast-Berglund, A. and Gorecky, D. (2016), "Towards an operator 4.0 typology: a human-centric perspective on the fourth industrial revolution technologies", in International Conference on Computers and Industrial Engineering (CIE46) Proceedings, 29-31 October 2016, Tianjin, China, ISSN 2164-8670 CD-ROM, ISSN 2164-8689 online. Rose, N., Aicardi, C. and Reinsborough, M. (2016), "Foresight report on future computing and robotics", An Ethics and Society Deliverable of the Human Brain Project to the European Commission, King's College London, London. Rossini, M. and Toggweiler, M. (2017), "Editorial: posthuman temporalities", New Formations: A Journal of Culture/Theory/Politics, No. 92, pp. 5-15. Rushkoff, D. (2016), "Renaissance now", in Scholz, T. and Schneider, N. (Eds.), Ours to Hack and to Own: The Rise of Platform Cooperativism, A New Vision for the Future of Work and a Fairer Internet, OR Books, New York, pp. 33-37. Schwab, K. (2017), The Fourth Industrial Revolution, World Economic Forum, Davos. Sipp, K. (2016), "Portable reputation in the on-demand economy", in Scholz, T. and Schneider, N. (Eds.), Ours to Hack and to Own: The Rise of Platform Cooperativism, A New Vision for the Future of Work and a Fairer Internet, OR Books, New York, pp. 59-62. Sotala, K. and Yampolskiy, R.V. (2015), "Responses to catastrophic AGI risk: a survey", Physica Scripta, Vol. 90 No. 1, pp. 1-35. Srnicek, N. and Williams, A. (2015), Inventing the Future: Postcapitalism and a World without Work, Verso, London. Stallman, R.M. (2015), Free Software, Free Society: Selected Essays of Richard M. Stallman, GNU Press, Boston, MA. Swann, T. (2018), "Towards an anarchist cybernetics: Stafford Beer, self-organisation and radical social movements", Ephemera, Vol. 18 No. 3, pp. 427-456. Terranova, T. (2004), Network Culture: Politics for the Information Age, Pluto, London. Turner, F. (2006), From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism, University of Chicago Press, Chicago, IL. Virilio, P. (1986[1977]), Speed and Politics, semiotext(e), New York. Virilio, P. (1990[1978]), Popular Defense and Ecological Struggles, semiotext(e), New York. Virilio, P. (1994[1988]), The Vision Machine, British Film Institute, London. Virilio, P. (2000[1998]), The Information Bomb, Verso, London. Voinea, C.
(2018), "Designing for conviviality", Technology in Society, Vol. 52, pp. 70-78. Wang, X.V., Kemeny, Z., Vancza, J. and Wang, L. (2017), "Human–robot collaborative assembly in cyber-physical production: classification framework and implementation", CIRP Annals, Vol. 66 No. 1, pp. 5-8. Wark, M. (2004), A Hacker Manifesto, Harvard University Press, Cambridge, MA. Wark, M. (2017), General Intellects: Twenty-Five Thinkers for the Twenty-First Century, Verso, London. Weise, M.R., Hanson, A.R., Sentz, R. and Saleh, Y. (2018), "Robot ready: human+ skills for the future of work", Strada Institute for the Future of Work, Indianapolis, IN. Wendling, A. (2009), Karl Marx on Technology and Alienation, Palgrave, Basingstoke. Westerkamp, H. (2003), "Colliding soundscapes", interviewed by L. Bhagat, in Sarai Collective (Eds.), Sarai Reader 03: Shaping Technologies, CSDS/Sarai, Delhi, pp. 255-262. Wiener, N. (1948), Cybernetics: Or Control and Communication in the Animal and the Machine, MIT Press, Cambridge, MA. Winner, L. (1986), The Whale and the Reactor: A Search for Limits in an Age of High Technology, University of Chicago Press, Chicago. Wolfe, C. (2010), What Is Posthumanism?, University of Minnesota Press, Minneapolis. Zanella, A., Bui, N., Castellani, A., Vangelista, L. and Zorzi, M. (2014), "Internet of things for smart cities", IEEE Internet of Things Journal, Vol. 1 No. 1, pp. 22-32. Zerzan, J. (1997), "Against technology", available at: https://theanarchistlibrary.org/library/john-zerzan-against-technology-a-talk-by-john-zerzan-april-23-1997 (accessed 6 August 2019). Zerzan, J. (2008), "Second-best life: real virtuality", Green Anarchy, No. 25, available at: https://theanarchistlibrary.org/library/john-zerzan-second-best-life-real-virtuality (accessed 6 August 2019). Zerzan, J. (2012), Future Primitive Revisited, Feral House, Port Townsend, WA.

Corresponding author
Rhiannon Firth can be contacted at: r.firth@essex.ac.uk

For instructions on how to order reprints of this article, please visit our website: www.emeraldgrouppublishing.com/licensing/reprints.htm Or contact us for further details: permissions@emeraldinsight.com

Gorrion, A. (2012), "Robots of repression", Mute, 27 March 2012, available at: https://www.metamute.org/community/your-posts/robots-repression (accessed 16 April 2020).
Graham, E. (2004), "Post/Human conditions", Theology & Sexuality, Vol. 10 No. 2, pp. 10-32.
Griziotti, G. (2019), Neurocapitalism, Minor Compositions, New York.
Guattari, F. (1996), The Guattari Reader, Blackwell, Oxford.
Gupta, P., Chauhan, S. and Jaiswal, M.P. (2019), "Classification of smart city research: a descriptive literature review and future research agenda", Information Systems Frontiers, Vol. 21 No. 3, pp. 661-685.
Haraway, D. (2015), "Anthropocene, capitalocene, plantationocene, chthulucene: making kin", Environmental Humanities, Vol. 6, pp. 159-165.
Haraway, D. (2016[1985]), "The cyborg manifesto: science, technology and socialist feminism in the late twentieth century", Manifestly Haraway, University of Minnesota Press, Minneapolis, pp. 3-90.
Haraway, D. (2016[2003]), "The companion species manifesto: dogs, people and significant otherness", Manifestly Haraway, University of Minnesota Press, Minneapolis, pp. 91-198.
Haraway, D. (2016), Staying with the Trouble: Making Kin in the Chthulucene, Duke University Press, Durham, NC.
Harman, G. (2018), Object-Oriented Ontology: A New Theory of Everything, Pelican, Harmondsworth.
Harvey, C. (2015), "Sex robots and solipsism", Philosophy in the Contemporary World, Vol. 22 No. 2, pp. 80-93.
Hayles, N.K. (1999), How We Became Posthuman: Virtual Bodies in Cybernetics, Literature and Informatics, University of Chicago Press, Chicago.
Heidegger, M. (1977[1954]), The Question Concerning Technology and Other Writings, Garland, New York.
Illich, I. (1973), Tools for Conviviality, Harper and Row, New York.
Illich, I. (1974), Energy and Equity, Harper and Row, New York.
Invisible Committee (2014), "Fuck off, Google", in To Our Friends, semiotext(e), Cambridge, MA, pp. 99-130.
Kallis, G. (2017), Degrowth, Agenda, Newcastle.
Kaloski, A. (1997), "Bisexuals making out with cyborgs: politics, pleasure, con/fusion", International Journal of Sexuality and Gender Studies, Vol. 2 No. 1, pp. 47-64.
Kelly, K. (1994), Out of Control: The New Biology of Machines, Social Systems and the Economic World, self-published, available at: https://kk.org/mt-files/books-mt/ooc-mf.pdf (accessed 6 October 2019).
Kelly, K. (2010), What Technology Wants, Viking, New York.
Kerruish, E. (2016), "Perception, imagination and affect in human-robot relationships", Cultural Studies Review, Vol. 22 No. 2, pp. 4-20.
Kingsnorth, P. (2015), "Planting trees in the anthropocene: a conspiracy theory", Dark Mountain, Vol. 8, pp. 30-42.
Kleiner, D. (2016), "Counterantidisintermediation", in Scholz, T. and Schneider, N. (Eds.), Ours to Hack and to Own: The Rise of Platform Cooperativism, A New Vision for the Future of Work and a Fairer Internet, OR Books, New York, pp. 63-68.
Kostakis, V., Latoufis, K., Liarokapis, M. and Bauwens, M. (2016), "The convergence of digital commons with local manufacturing from a degrowth perspective: two illustrative cases", Journal of Cleaner Production, pp. 1-10, available at: https://www.minasliarokapis.com/CleanerProduction2016_Kostakis_DigitalCommonsLocalManufacturing.pdf (accessed 6 August 2019).
Kuhn, T. (1962), The Structure of Scientific Revolutions, University of Chicago Press, Chicago, IL.
Kurzweil, R. (2005), The Singularity Is Near: When Humans Transcend Biology, Viking, New York.
Land, N. (2011[1993]), "Machinic Desire", Fanged Noumena: Collected Writings 1987-2007, Urbanomic, Falmouth, pp. 319-344.
Land, N. (2011[1997]), "Meltdown", Fanged Noumena: Collected Writings 1987-2007, Urbanomic, Falmouth, pp. 441-459.
Land, N. (2014), "Circuitries", in Mackay, R. and Avanessian, A. (Eds.), #Accelerate: The Accelerationist Reader, Urbanomic, Falmouth, pp. 251-274.
Latour, B. (1999), Pandora's Hope: Essays on the Reality of Science Studies, Harvard University Press, Cambridge, MA.
Latour, B. (2005), Reassembling the Social: An Introduction to Actor-Network-Theory, Oxford University Press, New York.
Law, J. (1991), "Introduction", in Law, J. (Ed.), A Sociology of Monsters: Essays on Power, Technology, and Domination, Routledge, London, pp. 1-25.
Law, J. (2004), After Method: Mess in Social Science Research, Routledge, London.
Levy, S. (1984), Hackers: Heroes of the Computer Revolution, Nerraw Manijaime/Doubleday, New York.
Licklider, J.C.R. and Taylor, R.W. (1990), "The computer as a communication device", in In Memoriam: J.C.R. Licklider 1915-1990, Systems Research Center, Palo Alto, CA, available at: https://web.stanford.edu/dept/SUL/library/extra4/sloan/mousesite/Secondary/Licklider.pdf (accessed 6 August 2019).
Marcuse, H. (1962[1941]), "Some social implications of modern technology", in Arato, A. and Gebhardt, E. (Eds.), The Essential Frankfurt School Reader, The Continuum Publishing Company, New York, pp. 138-162.
Marx, K. (1973[1857-8]), Grundrisse, Penguin, Harmondsworth.
Marx, K. (1990[1867]), Capital, Vol. 1, Penguin, Harmondsworth.
Mason, P. (2015), PostCapitalism: A Guide to Our Future, Allen Lane, Bristol.
Mentor (1986), "Hacker's Manifesto: the conscience of a hacker", Phrack, Vol. 1 No. 7, available at: http://phrack.org/issues/7/3.html (accessed 6 August 2019).
More, M. (1994), "On becoming posthuman", available at: www.maxmore.com/becoming.htm.
Morton, T. (2013), Hyperobjects: Philosophy and Ecology after the End of the World, University of Minnesota Press, Minneapolis.
Mumford, L. (1986[1966]), "The first megamachine", in Miller, D.L. (Ed.), Lewis Mumford Reader, Random House, New York, pp. 315-323.
Nold, C. (2017), "Device studies of participatory sensing: ontological politics and design interventions", Doctoral thesis, UCL (University College London).
Nold, C. (2020), "Insurrection training for post-human politics", International Journal of Sociology and Social Policy (in press).
Ouroboros (1999), "The transtopian principles, version 2.2", 18 November, available at: http://members.wbs.net/homepages/c/r/y/cryogenic4life/index2.html (link appears dead).
Perlman, F. (1983), Against His-Story, Against Leviathan, Black & Red, Detroit.
Pitts, H. and Dinerstein, A. (2017), "Corbynism's conveyor belt of ideas: postcapitalism and the politics of social reproduction", Capital and Class, Vol. 41, pp. 423-434.
Plant, S. and Land, N. (2014), "Cyberpositive", in Mackay, R. and Avanessian, A. (Eds.), #Accelerate: The Accelerationist Reader, Urbanomic, Falmouth, pp. 303-314.
Plows, A. and Reinsborough, M. (2011), "Encountering 'the politics of technology': public engagement from the bottom up", in Zulsdorf, T.B., Coenen, C., Ferrari, A., Fiedeler, U., Milburn, C. and Wienroth, M. (Eds.), Quantum Engagements, AKA Verlag, Heidelberg, pp. 91-107.
Preciado, P.B. (2013[2008]), Testo Junkie: Sex, Drugs, and Biopolitics in the Pharmacopornographic Era, SUNY Press, New York.
Press Association (2018), "Activists take 'abortion pills' during pro-choice rally in Belfast", The Guardian, 31 May, available at: https://www.theguardian.com/world/2018/may/31/pro-choice-activists-take-abortion-pills-belfast-protest (accessed 6 August 2019).
Prieur, R. (2011), "TechJudge", available at: http://www.ranprieur.com/tech.html (accessed 6 August 2019).
Reed, P. (2014), "Seven prescriptions for accelerationism", in Mackay, R. and Avanessian, A. (Eds.), #Accelerate: The Accelerationist Reader, Urbanomic, Falmouth, pp. 521-536.
Reich, R. (2000), The Future of Success: Working and Living in the New Economy, Random House, New York.
Reimens, P. (2002), "Some thoughts on the idea of 'hacker culture'", available at: http://cryptome.org/hacker-idea.htm (accessed 6 August 2019).
Richardson, J. (2003), "The language of tactical media", in Sarai Collective (Eds.), Sarai Reader 03: Shaping Technologies, CSDS/Sarai, Delhi, pp. 346-351.
Rikowski, G. (2003), "Alien life: Marx and the future of the human", Historical Materialism, Vol. 11 No. 2, pp. 121-164.
Romero, D., Stahre, J., Wuest, T., Noran, O., Bernus, P., Fast-Berglund, A. and Gorecky, D. (2016), "Towards an operator 4.0 typology: a human-centric perspective on the fourth industrial revolution technologies", in International Conference on Computers and Industrial Engineering (CIE46) Proceedings, 29-31 October 2016, Tianjin, China, ISSN 2164-8670 CD-ROM, ISSN 2164-8689 ON-LINE.
Rose, N., Aicardi, C. and Reinsborough, M. (2016), "Foresight report on future computing and robotics", An Ethics and Society Deliverable of the Human Brain Project to the European Commission, King's College London, London.
Rossini, M. and Toggweiler, M. (2017), "Editorial: posthuman temporalities", New Formations: A Journal of Culture/Theory/Politics, No. 92 (Autumn), pp. 5-15.
Rushkoff, D. (2016), "Renaissance now", in Scholz, T. and Schneider, N. (Eds.), Ours to Hack and to Own: The Rise of Platform Cooperativism, A New Vision for the Future of Work and a Fairer Internet, OR Books, New York, pp. 33-37.
Schwab, K. (2017), The Fourth Industrial Revolution, World Economic Forum, Davos.
Sipp, K. (2016), "Portable reputation in the on-demand economy", in Scholz, T. and Schneider, N. (Eds.), Ours to Hack and to Own: The Rise of Platform Cooperativism, A New Vision for the Future of Work and a Fairer Internet, OR Books, New York, pp. 59-62.
Sotala, K. and Yampolskiy, R.V. (2015), "Responses to catastrophic AGI risk: a survey", Physica Scripta, Vol. 90 No. 1, pp. 1-35.
Srnicek, N. and Williams, A. (2015), Inventing the Future: Postcapitalism and a World without Work, Verso, London.
Stallman, R.M. (2015), Free Software, Free Society: Selected Essays of Richard M. Stallman, GNU Press, Boston, MA.
Swann, T. (2018), "Towards an anarchist cybernetics: Stafford Beer, self-organisation and radical social movements", Ephemera, Vol. 18 No. 3, pp. 427-456.
Terranova, T. (2004), Network Culture: Politics for the Information Age, Pluto, London.
Turner, F. (2006), From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism, University of Chicago Press, Chicago, IL.
Virilio, P. (1986[1977]), Speed and Politics, semiotext(e), New York.
Virilio, P. (1990[1978]), Popular Defense and Ecological Struggles, semiotext(e), New York.
Virilio, P. (1994[1988]), The Vision Machine, British Film Institute, London.
Virilio, P. (2000[1998]), The Information Bomb, Verso, London.
Voinea, C. (2018), "Designing for conviviality", Technology in Society, Vol. 52, pp. 70-78.
Wang, X.V., Kemeny, Z., Vancza, J. and Wang, L. (2017), "Human-robot collaborative assembly in cyber-physical production: classification framework and implementation", CIRP Annals, Vol. 66 No. 1, pp. 5-8.
Wark, M. (2004), A Hacker Manifesto, Harvard University Press, Cambridge, MA.
Wark, M. (2017), General Intellects: Twenty-Five Thinkers for the Twenty-First Century, Verso, London.
Weise, M.R., Hanson, A.R., Sentz, R. and Saleh, Y. (2018), "Robot ready: human+ skills for the future of work", Strada Institute for the Future of Work, Indianapolis, IN.
Wendling, A. (2009), Karl Marx on Technology and Alienation, Palgrave, Basingstoke.
Westerkamp, H. (2003), "Colliding soundscapes", interviewed by L. Bhagat, in Sarai Collective (Eds.), Sarai Reader 03: Shaping Technologies, CSDS/Sarai, Delhi, pp. 255-262.
Wiener, N. (1948), Cybernetics: Or Control and Communication in the Animal and the Machine, MIT Press, Cambridge, MA.
Winner, L. (1986), The Whale and the Reactor: A Search for Limits in an Age of High Technology, University of Chicago Press, Chicago.
Wolfe, C. (2010), What Is Posthumanism?, University of Minnesota Press, Minneapolis.
Zanella, A., Bui, N., Castellani, A., Vangelista, L. and Zorzi, M. (2014), "Internet of things for smart cities", IEEE Internet of Things Journal, Vol. 1 No. 1, pp. 22-32.
Zerzan, J. (1997), "Against technology", available at: https://theanarchistlibrary.org/library/john-zerzan-against-technology-a-talk-by-john-zerzan-april-23-1997 (accessed 6 August 2019).
Zerzan, J. (2008), "Second-best life: real virtuality", Green Anarchy, No. 25, available at: https://theanarchistlibrary.org/library/john-zerzan-second-best-life-real-virtuality (accessed 6 August 2019).
Zerzan, J. (2012), Future Primitive Revisited, Feral House, Port Townsend, WA.

Corresponding author
Rhiannon Firth can be contacted at: r.firth@essex.ac.uk
