LAWs (Lethal Autonomous Weapons) in the Law
Year of study:
The development and use of lethal autonomous weapons (LAWs), also known as killer robots, is increasing in many countries’ military attack and defence systems. Some argue that LAWs cause fewer casualties in war because of their greater accuracy. However, their development is far from the stage of being able to make decisions that resemble human judgement and compassion. There are also concerns that LAWs may attack with bias and indiscriminately, so their use can be dehumanising and can threaten international security.
In response to these questions, the United Nations gathered government experts to find legal solutions, and they have affirmed that LAWs are not exempt from compliance with international humanitarian law (IHL). However, are existing laws sufficient if these weapons are unable to take humanitarian considerations into account? Despite reaching international consensus on the need to ensure individual responsibility, countries have not established common standards on how much human control is required for each weapon. This has contributed to the rise of the Stop Killer Robots campaign, which emphasises human responsibility in relation to LAWs and urges governments to speed up the passage of new international law governing their use.
This project considers whether LAWs can conform to the fundamental principles of IHL, such as the requirements that attacks be proportionate and not indiscriminate. This research matters because it highlights the human rights and ethical implications of placing human lives in the hands of killer robots.
Speaker bio to follow