Emergence of lethal autonomous weapons systems (LAWS) and their necessary apprehension through European human rights law
- Author(s): Parliamentary Assembly
- Origin: Assembly debate on 27 January 2023 (9th sitting) (see Doc. 15683, report of the Committee on Legal Affairs and Human Rights, rapporteur: Mr Damien Cottier). Text adopted by the Assembly on 27 January 2023 (9th sitting).
1. The Parliamentary Assembly notes
that rapid technological progress in the field of artificial intelligence is
paving the way for the emergence, in the near future, of lethal
autonomous weapons systems (LAWS).
2. According to the definition by the International Committee
of the Red Cross (ICRC), the term LAWS encompasses any weapon system
with autonomy in its critical functions. That is, a weapon system
that can select (i.e. search for or detect, identify, track, select)
and attack (i.e. use force against, neutralise, damage or destroy)
targets without human intervention. LAWS, therefore, are neither
remote-controlled systems in which a human retains control throughout,
nor automatic systems in which a particular process has been programmed
in advance so that their action is totally predictable.
3. The emergence of LAWS has prompted concern on the part of
numerous States as well as civil society. Fifty-four non-governmental
organisations have launched a campaign in favour of a preventive
prohibition of research on and development of these emerging technologies
– and even more so of the use of these systems, which they refer
to as “killer robots”. This position of principle was adopted by
the European Parliament in a resolution dated 12 September 2018.
4. The “arms race” logic implied in this field prompts some to
see LAWS as the third military revolution in the history of international
relations, after the invention of gunpowder and that of nuclear
weapons. Military powers which fail to invest in this technology
would therefore risk being left behind.
5. LAWS carry the risk of lowering the threshold for engaging
in conflict, by reducing the risk of a country’s own troop losses.
LAWS also raise a fundamental issue of human dignity: allowing machines
to “decide” to kill a human being.
6. The conformity of LAWS with international humanitarian law
hinges above all on whether it is possible to comply with the
principles of distinction, proportionality and precautions in attack.
6.1 The principle of distinction
between civilian and military targets could be complied with by
LAWS that are well designed and programmed to execute surgical strikes
aimed solely at military targets.
6.2 Judgment calls as to whether an attack satisfies the principle
of proportionality are made on the basis of values and interpretations
of the particular situation rather than on numbers or technical indicators.
Making such judgments, which reflect ethical considerations, requires
uniquely human judgment. For this reason, at least a minimum degree
of human control is indispensable.
6.3 To comply with the principle of precaution, the course
of action taken by LAWS must be predictable. Users must be capable
of adjusting or nullifying the effects of the weapons systems if necessary,
something that is possible only if they can reasonably foresee how
a weapons system will react.
6.4 The conformity of LAWS with international human rights
law, and notably with the European Convention on Human Rights (ETS
No. 5, the Convention), depends on clear regulation of their use. Article
2 of the Convention requires that the right to life be protected
by law. This means that the State must introduce a legal framework
defining the limited circumstances in which the use of these weapons is
authorised. The case law of the European Court of Human Rights relates
to other types of weapons; however, the use of LAWS should not be
subject to any less strict standards.
7. From the viewpoint of international humanitarian law and human
rights law, regulation of the development and above all of the use
of LAWS is therefore indispensable. The crucial point is human control. Respect
for the rules of international humanitarian and human rights law
can only be guaranteed by maintaining human control, to varying
degrees according to the stances taken by States and other actors
of the international community. Several levels of human control
may be envisaged: significant control, effective control or appropriate
levels of human judgment. Human control must be maintained over
lethal weapons systems at all stages of their life cycle.
7.1 Human
control can be exercised at the development stage, including through
technical design and programming of the weapon system (ethics by
design): decisions taken during the development stage must ensure
that the weapon system can be used in the intended or expected circumstances
of use, in accordance with international humanitarian law and other
applicable international norms, in particular the European Convention
on Human Rights.
7.2 Human control may also be exerted at the point of activation,
which involves the decision of the commander or operator to use
a particular weapons system for a particular purpose. This decision
must be based on sufficient knowledge and understanding of the weapon’s
functioning in the given circumstances to ensure that it will operate
as intended and in accordance with international humanitarian law
and other applicable international norms. This knowledge must include
adequate situational awareness of the operational environment, especially
in relation to the potential risks to civilians and civilian property.
7.3 To ensure compliance with international humanitarian law
and other applicable international norms, it may be deemed necessary
to exert additional human control during the operation stage, when the
weapon autonomously selects and attacks targets. Human intervention
may be necessary to comply with the law and to remedy shortcomings
at the development stage and at the point of activation.
8. Unlike humans, machines do not have feelings and are not moral
agents. If a person commits a war crime with an autonomous weapon,
it is the human who commits the crime, using the autonomous weapon
as the tool. Humans must be not only legally accountable but also
morally responsible for the actions of LAWS. Some decisions pertaining
to the use of weapons require legal and moral judgments, such as
weighing likely civilian casualties against military advantages
from conducting attacks. These judgments must be endorsed by humans
since they are also moral judgments and have legal scope.
9. The relevant provisions of international humanitarian law
imply that such weapons systems must not be used if they are likely
to cause superfluous injury or unnecessary suffering, if they are
inherently indiscriminate or if it is not possible to use them in
accordance with the law.
10. On the assumption that future LAWS meet all the legal requirements
of the laws of war when they operate normally, malfunctions of such
systems could nonetheless cause erroneous attacks and thereby raise accountability
issues. It must be possible to establish legal responsibility in
the event of a malfunctioning lethal autonomous weapons system by
analysing compliance with the requirement of adequate human control.
It should be possible to attribute unlawful actions committed
using a lethal autonomous weapons system, resulting in violations
of international humanitarian law and other international norms,
to the individual or group of individuals behind its design, manufacturing,
programming or deployment and, ultimately, to the user State. In
this regard, the user State has a particular responsibility to test
and verify in advance the weapons it intends to use to ensure that
they are predictable and reliable and not likely to entail violations
of international humanitarian law through error, malfunction or
poor design, and to verify the contexts in which their use is possible
in accordance with the law.
11. The Assembly notes that questions concerning the compatibility
of LAWS with international humanitarian law and human rights are
being discussed by States Parties to the Convention on Prohibitions
or Restrictions on the Use of Certain Conventional Weapons Which
May Be Deemed to Be Excessively Injurious or to Have Indiscriminate
Effects (Convention on Certain Conventional Weapons, CCW), which
have set up a Group of Governmental Experts (GGE). Working on the
basis of the “11 Guiding Principles on LAWS” adopted in 2019 and
the Final Declaration of the 6th Review Conference of the States
Parties to the CCW in December 2021, that group continues to seek
a consensus on the future regulation of this emerging technology.
12. At its July 2022 session, the GGE adopted a statement to the
effect that it had reached agreement that the right of parties to
an armed conflict to choose the methods and means of warfare was
not unlimited and that international humanitarian law was also applicable
to LAWS. Any violation of international law, including a violation
involving a lethal autonomous weapons system, incurred the responsibility
under international law of the State concerned. The group further
proposed extending its work into 2023.
13. The Assembly notes that a group of European States has proposed
a two-tier approach to the GGE:
13.1 first,
the States Parties to the CCW should recognise that LAWS which cannot
be used in conformity with international law, including international
humanitarian law, are de facto banned;
and that, consequently, LAWS operating completely outside any human
control and a responsible chain of command are unlawful;
13.2 second, agreement should be reached on the international
regulation of other weapons systems presenting elements of autonomy
in order to guarantee conformity with international humanitarian
law by:
13.2.1 ensuring appropriate human control throughout
the life cycle of the system in question;
13.2.2 maintaining human responsibility and the obligation of
accountability at any time, in all circumstances and throughout
the life cycle, as the basis of the responsibility of the State
and that of the individual. This responsibility and this obligation
of accountability may never be transferred to machines;
13.2.3 implementing suitable measures to mitigate the risks and
appropriate guarantees regarding security and safety.
14. The Assembly supports this two-tier approach and considers
that the emergence of LAWS requires clear regulation of this technology
to ensure respect for international humanitarian law and human rights,
and that the appropriate forum to agree on the future regulation
of LAWS is the Conference of States Parties to the CCW and its GGE.
15. As to the legal form of such regulation, the goal should be
a binding text in the form of a protocol to the CCW or even a specific
international convention.
16. Pending the emergence of the broad consensus needed to draw
up such an instrument, a non-binding instrument should be prepared
in the form of a code of conduct. This instrument, which might be
updated on a regular basis, could codify the guiding principles
that are already broadly recognised and highlight the good practices
adopted by given States Parties to the CCW.
17. The Assembly therefore calls on Council of Europe member States
as well as observer States and States whose parliaments enjoy observer
or partner for democracy status with the Assembly to take a constructive
role in the work in progress within the CCW and its GGE with a view
to regulating the emergence of LAWS and to support the two-tier
approach mentioned above.
18. Should no consensus emerge within a reasonable period of time
for the drafting of a code of conduct and subsequently for the preparation
and negotiation of an international agreement within the meaning
of paragraphs 14 and 15, or should such steps appear to have no
chance of success, the Assembly invites Council of Europe member
States as well as observer States and States whose parliaments enjoy
observer or partner for democracy status with the Assembly to consider
initiating such work at the Council of Europe level.