CEP Colloquium: Rob Sparrow, Monash University

How Robots Have Politics

Abstract: In an influential paper, published in 1980, Langdon Winner asked “Do artifacts have politics?” and concluded “yes!” In this presentation, which draws on my research on the ethics of autonomous vehicles, military robotics, sex robots, and aged care robots, I will explore how robots have politics. I will argue that the embodied and interactive nature of robots means that they have more politics than other sorts of artefacts. Robots have more — and more complex — “affordances” than other technologies. Robots will embody and reflect the intentions of their designers in ways that are very unlikely to be transparent to those who use or encounter them. The choices made by engineers will often have consequences for the options available to the users of robots and will in turn shape relationships between users and those around them. The power this grants designers is itself politically significant. Because, increasingly, robots will occupy the same environments as human beings, and play important social and economic roles in those environments, human-robot relations will become crucial sites of political contestation. The social policy choices necessary to realise the benefits of robots in many domains will inevitably also be political choices, with implications for relationships between stakeholders. Humanoid robots, and their behaviour, will have representational content, with implications for the ways in which people understand and treat each other. More generally, to the extent that we anticipate that the introduction of widespread automation will produce a Fourth Industrial Revolution, it is vital that we ask who is making this revolution, as well as who will flourish — and who will suffer — if it occurs.

CEP Colloquium: Paul Scharre, Center for a New American Security

Autonomous Weapons: Ethics and Policy

Abstract: What happens when a Predator drone has as much autonomy as a self-driving car? Should machines be given the power to make life and death decisions in war? Would doing so cross a fundamental moral line? Militaries around the globe are racing to build increasingly autonomous systems, but a growing chorus of voices is raising the alarm about the consequences of delegating lethal force decisions to machines.

Paul Scharre, Senior Fellow at the Center for a New American Security, is the author of the forthcoming book Army of None: Autonomous Weapons and the Future of War. He is a former Pentagon official who led the team that drafted the official Defense Department policy guidance on autonomous weapons, DoD Directive 3000.09. He is also a former Army Ranger who served multiple tours in Iraq and Afghanistan.