The brain-computer interface: new rights or new threats to fundamental freedoms?
Resolution 2344 (2020) | Provisional version
- Author(s): Parliamentary Assembly
- Origin: Text adopted by the Standing Committee, acting on behalf of
the Assembly, on 22 October 2020 (see Doc. 15147, report of the Committee on Legal Affairs and Human
Rights, rapporteur: Mr Olivier Becht). See also Recommendation 2184 (2020).
1 The Parliamentary
Assembly notes the rapid progress made in neurotechnology in recent
years, including the ability to record and directly stimulate neural
activity, with the potential to create increasingly effective brain-computer
interfaces (BCI). This progress has been driven by a combination
of improved understanding of the functioning of the brain, technical
developments and the growing power of artificial intelligence systems.
The ability to create a fully symbiotic connection between the human
brain and digital computing systems, including the internet and
artificial intelligence systems, remains a distant aspiration. Nevertheless,
it is a goal that researchers and entrepreneurs are already pursuing
and which many believe may eventually be achieved.
2 Neurotechnology, including BCI, is currently being developed
and applied with a range of uses in mind. Amongst other things,
huge sums are being invested in research to create new medical treatments
for neurological and psychiatric disorders, such as direct control
of robotic limbs, synthetic speech production, or the treatment
of intractable mood disorders or post-traumatic stress disorder.
Military and security establishments are researching neurotechnology
for use in intelligence, propaganda, interrogation, surveillance and
combatants’ performance enhancement. Private companies are researching
the possible use of consumer devices to transform thoughts directly
into typing; providing commercial lie-detection services based on
brain scans; and selling direct-to-consumer neurotechnology devices,
for example as computer gaming or wellness products. Researchers
are exploring the development of ‘neuromarketing’ campaigns that
would exploit subconscious preferences, and examining whether patterns
of neural activity may be predictive of criminal recidivism.
3 Access to the neural processes that underlie conscious thought
implies access to a level of the self that by definition cannot
be consciously concealed or filtered. This risks profound violation
of individual privacy and dignity, with the potential to subvert
free will and breach the ultimate refuge of human freedom – the
mind. Cognitive and sensory enhancement through BCI could create
separate categories of human beings, the enhanced and the unenhanced,
with enhancement available only to those with the necessary wealth
and privilege, or used for repressive purposes. Individual identity,
agency and moral responsibility may be diminished through the merger
of neurological and digital sensory experience and decision-making
processes. Such outcomes could change the very nature of humanity
and of human societies.
4 Even if the more spectacular applications of
BCI remain speculative, the advances already made and the resources
being devoted to further research imply an urgent need for anticipation
and precautionary regulation now. Democratic societies should ensure
that basic ethical principles are respected. The huge potential
benefits of neurotechnology, especially in the medical field, are
such that progress and innovation should not be stifled. Nevertheless,
research should be steered away from foreseeably harmful or dangerous
areas and towards positive applications that do not threaten individual
dignity, equality and liberty, which are the foundations also of
democracy.
5 The Assembly considers that a sensitive, calibrated approach
to regulation of emerging neurotechnology, including BCI technology,
is needed, encompassing both ethical frameworks and binding legal
regulation. It notes the similarities and connections between ‘neuroethics’
and bioethics, and the significance of artificial intelligence to
the operation of BCI technology. It therefore welcomes the work
already underway within the Council of Europe by the Committee on
Bioethics (DH-BIO) and the Ad hoc Committee on Artificial Intelligence
(CAHAI). It further welcomes the work of other international organisations,
notably the Organisation for Economic Co-operation and Development
(OECD), which recently adopted a Recommendation on Responsible Innovation
in Neurotechnology. The Assembly notes with interest developments
such as those in Chile, where consideration is being given to constitutional
amendment, legislation and other measures intended to protect human
society from possible adverse consequences of neurotechnology.
6 The Assembly considers that the following ethical principles
must be applied to the development and application of neurotechnology
in general and BCI technology in particular:
6.1 Beneficence and prevention of malign use. This technology
should be developed and applied only for purposes that are consistent
with respect for human rights and dignity. Research aimed at incompatible
purposes should be prohibited. Special attention should be given
to dual-use technology and technology developed for military or
security purposes. New neurotechnology should be subjected to a
human rights impact assessment before being put into use.
6.2 Safety and precaution. This technology should be safe
both for the user and, in its intended or unintended consequences,
for society in general. Safety must be ensured before any new applications
are put into use.
6.3 Privacy and confidentiality. At a minimum, information
gathered by neurotechnological and BCI devices must be protected
according to general principles of data protection. Consideration
should also be given to protecting ‘neurodata’ as a special category,
for example by analogy to prohibitions on commerce in human organs.
6.4 Capacity and autonomy. This technology should not be used
against a subject’s will or in a way that prevents the subject from
freely taking further decisions about its continued use. Special
care will be needed where such technology is used to treat chronic
pain, drug dependency or other conditions where interruption of
treatment could lead to discomfort or distress.
6.5 Human agency and responsibility. This technology should
not prevent an individual from acting freely and being responsible
for their actions. Human beings, acting freely according to their
natural (as opposed to enhanced or symbiotic) consciousness, must
remain the only decision-makers and the primary actors in society,
especially in matters that may impact human rights and democratic
processes.
6.6 Equity, integrity and inclusiveness. This technology should
not create any form of privileged or superior status for its users;
it should be implemented with respect for human equality and dignity, including
that of members of marginalised or vulnerable groups; and it should be
made available as widely as possible, especially insofar as it
is applied for medical purposes.
6.7 Ensuring public trust through transparency, consultation
and education/awareness-raising. The implementation of new technologies,
such as neurotechnology intended for use by individuals, will be most
readily accepted if it takes place with the confidence of the
public, in awareness of the benefits as well as the potential dangers.
7 The potential of BCI technology to change fundamentally
the relationship between the individual’s internal and subconscious
self and the outside world implies unique
and unprecedented threats to fundamental values of human rights
and dignity. The Assembly notes with particular interest proposals
to establish and provide legal protection for new human rights,
sometimes referred to as ‘neurorights’. These proposals are intended
to fill the gaps in the existing human rights framework through which
BCI technology might threaten enjoyment of currently protected rights
and, beyond that, respect for fundamental human dignity. The rights
in question have been expressed as cognitive liberty, mental privacy, mental
integrity and psychological continuity.
8 The Assembly therefore calls on Council of Europe member States
to:
8.1 establish ethical frameworks
for research, development and application of neurotechnology, including
BCI technology, taking into account the principles set out in paragraph
6 of the present resolution;
8.2 clearly define the limits of research, development and
application of neurotechnology, including BCI technology, through
specific legal frameworks that ensure effective respect and protection
of human rights;
8.3 ensure that appropriate bodies exist for the oversight
and regulation of research, development and application of neurotechnology,
including BCI technology, so as to ensure effective implementation of
the applicable ethical and legal frameworks;
8.4 consider the establishment and legal protection of new
‘neurorights’ as a particularly effective protection against possible
risks posed by BCI technology.
9 As regards relevant work already underway within the Council
of Europe, the Assembly:
9.1 encourages
the DH-BIO to take an open and constructive approach to the question
of new ‘neurorights’, including the possibility of assuring their
protection under international law through an additional protocol
to the Convention for the Protection of Human Rights and Fundamental
Freedoms (ETS No. 5);
9.2 encourages the CAHAI to take account of the potential
risks and opportunities arising from the application of artificial
intelligence in the context of BCI systems and its particularly
serious impact on human rights.