CYIL vol. 8 (2017)

… other weaponized machines used during warfare that the military commander is fully aware of the capacity of the weaponized equipment. Not every military commander will be able to understand the programming of a weapon sufficiently. The technology develops progressively, in contrast to the training of soldiers. This raises the question of whether the superior always has adequate knowledge to predict the potential risk of a mistake made by the machine. Could the commander prevent such a risk? What is the actual ability of control in such cases? Marc Klamberg rightly points out: “The use of autonomous weapons therefore involves a risk that military personnel will be held responsible for the actions of machines whose decisions they did not control. The more autonomous the systems are, the larger this risk looms.” 39

Much of the debate on responsibility for the acts of machines must, in essence, be a discussion of the degree of control exercised over the machine. The question of control was already broadly developed in the jurisprudence of the ICJ and the ICTY, which addressed the question of control and the doctrine of command responsibility. The ICTY noted that “the legal duties of a superior (and therefore the application of the doctrine of command responsibility) do not depend only on de jure (formal) authority, but can arise also as a result of de facto (informal) command and control, or a combination of both.” 40 The ICJ, in the context of state responsibility, referred to the question of “complete dependence”, 41 but complete dependence obviously cannot be demanded in the situation of a military commander versus a robot, as such a robot could be uniquely unpredictable. It seems that the concepts of de facto and de jure command and control will likewise not fulfil expectations regarding the attribution of responsibility for the wrongdoing of machines.
If the commander had the intent to act, but the machine acted contrary to the given order, how would one define the extent of the control exercised over the machine? Various states and commentators disagree on whether ICL, especially in relation to war crimes, sufficiently addresses the use of autonomous weapon systems and drones and the involvement of the various actors engaged in the design, programming, development and operation of such equipment. 42 Because attribution of responsibility for the actions of drones or other weaponized machines is so complicated, not only in the legal dimension but also because such attribution might in some instances be ethically equivocal, it seems that such machines should not be used or developed at all. 43 It must be mentioned, however, that the lack of individual responsibility of a person in this matter does not affect the responsibility of states under international law for any international damage that might be caused by the organs of that state. International responsibility requires the recognition that states remain legally responsible for the consequences of the use of weaponized machines; it also involves the state’s duty to investigate and prosecute potential violations, including reparations for violations. Such a conclusion cannot, however, be formulated if we take the responsibility of non-state actors as the point of reference.

39 See KLAMBERG, M., International Law in the Age of Asymmetrical Warfare…, op. cit., p. 168.
40 ICTY, Prosecutor v. Delalić, IT-96-21-T, Trial Chamber Judgment, 16. 11. 1998, para. 348.
41 Application of the Convention on the Prevention and Punishment of the Crime of Genocide…, para. 393.
42 See the debate: LEWIS, D. A., BLUM, G., MODIRZADEH, N. K., War-Algorithm Accountability, Research Briefing 2016, pp. 77-78.
43 The question was also developed in: CROOTOF, R., War Torts: Accountability for Autonomous Weapons…, op. cit., p. 1384.

