C. Explanatory memorandum by Mr Stefan Schennach, rapporteur for opinion
1. As is well documented in Mr Houbron’s report, it is far too easy to access adult-only content because age-verification and regulation systems are insufficient, especially in a largely unregulated Internet environment in which extreme violence, including sexual violence, can be perpetrated anonymously via online media, targeting fictional characters or human beings. This can be especially damaging and traumatising for children and young adults.
2. I would like to recall Resolution 2001 (2014) and Recommendation 2048 (2014) “Violence in and through the Media”, as well as the corresponding report, which remains relevant to the present analysis. Insufficient regulation of the online environment enables the easy and rapid dissemination of pornography, and the amount of new content uploaded every minute makes it virtually impossible to moderate extreme sexual violence online.
3. Interactive computer games, social networks, chat rooms, search engines, online shopping and the universal accessibility of these media via smartphones have created virtually unlimited exposure of children to adult-only and often extremely violent content. Violence may also be insidiously conveyed through mainstream media, for example in depictions of the hyper-sexualisation of children. This is the rationale for amendment A.
4. A growing number of children today own smartphones or other
internet-connected devices and have personal social media accounts,
including in the developing world. Pop-up advertisements are the
most common source of pornographic content that reaches children,
especially teenagers. Other common routes for accessing pornographic content include video- and photo-sharing sites and the new social media platforms that are created every day.
5. We must pay special attention to virtual reality environments, such as the metaverse, where users interact online in a three-dimensional, multi-sensory way that can affect children’s minds and bodies through potentially violent and pornographic content and sounds. Unwanted images and contacts can become even more intrusive and difficult for parents to control. Some metaverse applications even allow children to enter virtual strip clubs where 3D avatars simulate sex, with little or no moderation. Children can also experience grooming behaviours, racist insults and rape threats.
6. According to the latest Global Threat Assessment report by the WeProtect Global Alliance, 1 in 3 (34%) respondents to its global survey reported having been asked, during childhood, to do something sexually explicit online that they were uncomfortable with. In addition, the Internet Watch Foundation recorded a 77% rise in child ‘self-generated’ sexual material from 2019 to 2020, a phenomenon further exacerbated by the pandemic. Research indicates that abuse in virtual and augmented reality can be far more traumatic than in other digital environments due to the multisensory nature of the medium in which it is propagated.
7. To address safety comprehensively as the metaverse emerges, public authorities need to partner with internet operators, the gaming industry, designers and tech service providers, as well as with academia and civil society. All stakeholders must carefully consider the safety of children’s minds and bodies, rather than looking only at profits or sales. Digital platforms need specific terms of service for immersive environments, based on how this technology interacts with children’s brains; they cannot simply apply rules drawn from existing social media. Governments should also consider ways to incentivise safe behaviour on these platforms as part of a safer and healthier digital future for the younger generations.
8. Cybersecurity must also be considered, as some offenders may push children, in particular teenagers, to visit pornographic websites, lure them into believing that they will become “influencers” or “celebrities” and induce them to share their personal data, leaving them vulnerable to blackmail and extortion.
9. While artificial intelligence and software-based verification tools can assist law enforcement, as stressed in the report, trained human content moderators could identify additional non-consensual imagery or explicit sexual violence. Member States could consider supporting certain forms of content moderation, where appropriate, in particular in relation to virtual reality environments, as purely automated solutions might be unable to identify the risks for children.
10. Member States should also consider supporting the development of anonymous complaint and reporting mechanisms, and self-regulation in the form of stringent codes of conduct aimed at preventing children's exposure to adult content. They should also seek to reinforce co-operation between law-enforcement authorities and the private sector, to effectively fight the dissemination of illegal content. This is the rationale for the recommendations put forward in amendment B.
11. Finally, through its General Rapporteur on Science and Technology Impact Assessment, the Committee on Culture, Science, Education and Media intends to investigate the risks, challenges and human rights implications of emerging technologies such as virtual reality, augmented reality and other immersive technologies. The committee would like to involve the Committee on Social Affairs, Health and Sustainable Development in this endeavour, with a special focus on the psychological and physiological effects of immersive technology on children and youth.