
is compliance with the rules of ius in bello. If we start from the assumption that this is an armed conflict, the basic condition is that armed force may be used by combatants against combatants. In connection with the pursuit of terrorists, for example, many questions and ambiguities arise. Along those lines, the US had two drone programs: a) a military one, used, for instance, in the war zones of Afghanistan and Iraq and directed against the enemies of the American troops stationed there; and b) a CIA program targeting suspected terrorists worldwide, including in countries where US troops are not deployed. Especially in the second case, a problem arises because the program using drones in Pakistan, for example, was controlled by the CIA and subject to secrecy. If the operators who control the drones or any other systems do not have the status of members of the armed forces, they are civilians, not combatants in the sense of the law of armed conflict (humanitarian law). It can be stated that CIA operatives are not lawful combatants 29 and therefore do not enjoy the protection of the 1949 Geneva Conventions relative to the protection of war victims.

Likewise, an autonomous system that does not distinguish between civilian persons and objects on the one hand and combatants and military objectives on the other is inconsistent with humanitarian law. The principle of the prohibition of indiscriminate attacks is important here: it must be taken into account whether the autonomous system can distinguish a civilian from a combatant. Even with a remotely controlled asset, however, it is difficult for the operator to have an overview of the battlefield as a whole in order to evaluate the situation in its entirety – this, too, relates to the prohibition of indiscriminate attacks.

Fully autonomous systems in particular raise a whole set of questions where they operate completely independently. First, can computers/robots make life-and-death decisions? Second, robots do not meet the requirements of lawful combatants and cannot independently participate in combat operations. Third, people may make mistakes due to emotions, fatigue, or other factors, yet human judgment is often decisive for certain abstentions, i.e., for not resorting to certain actions in an armed conflict. 30 For example, it is said that at present it is difficult to imagine how robots could be programmed to take account of situations in which the status of the enemy is unclear, in particular whether a person is a combatant or a non-combatant. Another case is a situation requiring a complex decision that must weigh, in terms of proportionality, whether a human life is to be spared. 31 International humanitarian law requires the parties to an armed conflict to exercise judgment and observe restraints.

On the other hand, there are arguments in favour of fully autonomous weapons. The fundamental question is whether these weapons systems are, or will be, capable of operating at the level of a human soldier. Robots have a number of advantages over humans: they do not have emotions such as fear or anger, and they are able to monitor and record unethical behaviour on the battlefield. A robot may also have access to more information relating to the targets and may better evaluate issues such as proportionality. 32

29 STERIO, M. The United States' Use of Drones in the War on Terror: The (Il)legality of Targeted Killings Under International Law. Case Western Reserve Journal of International Law, Vol. 45, 2012, p. 212.
30 VOGEL, R. J. Drone Warfare and the Law of Armed Conflict. Denver Journal of International Law and Policy, Vol. 39, 2010–11, p. 137.
31 Ibid.
32 LEWIS, J. The Case for Regulating Fully Autonomous Weapons. The Yale Law Journal, Vol. 124, No. 4, 2015, p. 1314.
