Remus Titiriga


While robots are still absent from our homes, they have begun to spread across battlefields. However, today's military robots are mostly remotely controlled platforms with no real autonomy. This paper examines the obstacles to implementing autonomy in such systems by answering a technical question: what level of autonomy is needed in military robots, and how and when might it be achieved? It then turns to a techno-legal one: how can the rules of humanitarian law be implemented within autonomous fighting robots so as to allow their legal deployment? The first chapter scrutinizes the significance of autonomy in robots and the metrics used to quantify it, which were developed by the US Department of Defense. The second chapter focuses on the autonomy of state-of-the-art robots (e.g., Google's self-driving car, DARPA's projects) for navigation, intelligence, surveillance, and reconnaissance (ISR), or lethal missions. Based on public information, we gain insight into the architectures, functioning, thresholds, and technical limitations of such systems. The bottleneck to greater robot autonomy appears to be their poor "perceptive intelligence." The last chapter examines the requirements of humanitarian law (the rules of "jus in bello" and rules of engagement) for the legal deployment of autonomous lethal robots on the battlefield. The legal and moral reasoning of human soldiers complying with humanitarian law is a complex cognitive process that must be emulated by any autonomous robot empowered to make lethal decisions. However, the autonomous completion of such "moral" tasks by artificial agents is far more challenging than the autonomous performance of other tasks, such as navigation, ISR, or kinetic attacks. Given the limits of current Artificial Intelligence, it is highly unlikely that robots will acquire such moral capabilities anytime soon.
Therefore, for the time being, autonomous weapon systems might be legally deployed, but only in very particular circumstances, where the requirements of humanitarian law happen to be irrelevant.