In this PHAP Expert Legal Briefing, Naz Modirzadeh and Dustin Lewis, from the Harvard Law School Program on International Law and Armed Conflict, explored developments in technology, accountability, and international law pertaining to armed conflict. The background concern is that in war, as in so many other areas, power and authority are increasingly expressed algorithmically. Advances in artificial intelligence and robotics may implicate, and possibly transform, numerous aspects of armed conflict. For instance, increasingly sophisticated forms of technical autonomy may affect the conduct of hostilities (including the development and use of “autonomous weapons”). But they might also bear on other elements of war, such as guarding and transporting detainees, providing medical care, and delivering humanitarian assistance.
The presenters summarized a recent report from the Harvard Law School Program on International Law and Armed Conflict. That report introduces a new concept, “war algorithms,” that aims to make algorithmically derived “choices” and “decisions” a central concern in debates over technical autonomy in war. The report defines a “war algorithm” as any algorithm that is expressed in computer code, that is effectuated through a constructed system, and that is capable of operating in relation to armed conflict. Through the “war algorithms” lens, the presenters linked international law and related accountability architectures to relevant technologies.