Deadly Algorithms: Can Legal Codes Hold Software Accountable for Code That Kills?

Radical Philosophy 187 (2014): 2–8.

Reprinted in Hacking Habitat, 2015–16.

Reprinted in Continent 4.4 (2015): 20–27.

Drawing on my recent research into the increasing algorithmic oversight of the US Disposition Matrix, or kill list, which manages targeted assassination by armed drones in Pakistan, this commentary takes a preliminary look at the relationship between moral responsibility and legal liability in order to ask: i) whether an ethics can emerge out of mathematical axioms, and ii) what kinds of legal frameworks will need to be developed to account for the emergence of new juridical actors derived from software and code.

In an interview titled “The Future of Drone Strikes Could See Execution by Algorithm”, Sarah Knuckey, Director of the Project on Extrajudicial Executions at NYU Law School, emphasises the degree to which drone warfare has strained the limits of international legal conventions and, with them, the protection of civilians. The “rules of warfare” are “already hopelessly outdated”, she says, and will require “new rules of engagement to be drawn up”. Could these new rules of engagement—new legal codes—assume a pre-emptive character similar to that of the software codes and technologies now evolving—what I would characterise as a projective sense of the law? Such a law would take its lead from the spirit of the Geneva Conventions passed after the Second World War to protect the rights of non-combatants, rather than from those protocols (the Hague Conventions of 1899 and 1907) that govern the use of weapons of war and are thus reactive and event-based in their formulation. It would be a set of legal frameworks determined not by precedent—what has happened in the past—but by what is anticipated to take place in the future.


Drone Strikes are...


PDF (French translation)

Situation #18 (Fotomuseum Winterthur)