On 12 October 2021, EIF organised the virtual debate “Protecting and empowering children and young people online”, co-hosted by MEPs Ivan Štefanec and Liesje Schreinemacher.
Five experts joined the debate, moderated by EIF Director General Maria Rosa Gibellini, to discuss how to make the online world fitter for children and which legal instruments are in place or needed in order to ensure that children are protected and empowered online:
- June Lowery-Kingston, Head of Unit, Accessibility, Multilingualism & Safer Internet, and Deputy to the Director for Data at DG CONNECT, European Commission
- Christopher Payne, Director of Digital Responsibility, Government and Public Affairs, at the LEGO Group
- Charlotte Niklasson, Director of European Affairs at Nordvision
- Alexandra Evans, Head of Safety Public Policy, Europe at TikTok
- Leanda Barrington-Leach, Head of EU Affairs at 5Rights
MEP Ivan Štefanec kicked off the debate by underlining children's growing daily presence on the Internet, which in developed countries now exceeds 95%. This strong growth also carries potential threats: child protection must continue to be included in all new European digital strategies and remain a priority for the private sector and civil society as well.
According to MEP Štefanec, there is a need to align offline and online protection and to follow all threats closely; new technologies, including AI, can help, but they must be used wisely.
He then underlined the different recent activities on the subject at institutional level, like the EPP group’s recently adopted position paper on the Rights of the Child and the Better Internet for Kids Strategy being updated by the European Commission.
MEP Liesje Schreinemacher seconded the urgency of the topic: one out of three Internet users is under 18, a number that continues to grow, especially because of the pandemic. The Internet offers endless opportunities, but it was not designed with the child in mind.
MEP Schreinemacher stressed the need for the right legislation and enforcement to shape the Internet for its youngest users, including by improving digital literacy among children and their parents.
Clear lines should be drawn on what we can and cannot accept in our online environment: besides calling for an impact assessment and risk-mitigation measures, MEP Schreinemacher believes that European policymakers should end the commercial tracking of users under 18 and tackle these threats through new legislation such as the Digital Services Act and the legislation to counter online sexual abuse, as well as the ongoing Better Internet for Kids Strategy.
June Lowery-Kingston gave an overview of the different EU initiatives for the protection and empowerment of young people online. The theme is covered by the recent Digital Compass, but the EU institutions have been tackling it since the late 90s; in 2012, the European Commission's European Strategy for a Better Internet for Children set the global benchmark.
All these actions have been complemented by relevant EU legislation because, since 2012, the digital environment, especially for children, has changed dramatically. The update of the BIK Strategy requested by the European Parliament is now underway, Ms. Lowery-Kingston assured, and will act as the digital arm of the Comprehensive Strategy on the Rights of the Child, calling on the tech industry to raise its standards towards its youngest users.
And in this context, Ms. Lowery-Kingston thanked the European Parliament for their initiative and collaboration.
Christopher Payne explained to the audience the great importance of this topic to The LEGO Group and what the company is doing concretely to promote and engage in a digital future that is fit for children. This work goes beyond the physical LEGO brick: the company has worked with digital consumer technology for decades and today offers a range of digital products and services for children of all ages.
The LEGO Group invests in new technologies and believes in the potential of tech and data to enrich the physical play experience and to stimulate creativity, collaboration, and confidence, while remaining aware of the risks. "As a company," Mr. Payne assured, "we have a lot of experience when it comes to embedding safety into the design of our play experiences and we recognize children as vulnerable citizens, engaging with them to protect their rights and foster their wellbeing."
According to Mr. Payne, the DSA represents a unique opportunity to stimulate and incentivize investment in a type of innovation that will empower children online; he also called for consistency with online safety legislation across the world.
Charlotte Niklasson from Nordvision reiterated that protecting and empowering children and young audiences online is at the heart of public service media companies' work. Equally important as safety is always putting the empowerment of children and youth at the top of the agenda: using media and digital literacy to assist children in making informed choices, cooperating closely with them, and ensuring that they feel reflected in all forms of diversity in the productions. This is one of the most important parts of content production within Nordvision's companies at the moment, said Ms. Niklasson: always involving children in the production and launch of new services.
Nordvision, together with the European Broadcasting Union and other media stakeholders, advocates for a safeguard within the DSA to prohibit platforms from interfering with or taking additional control of what is considered editorial content. Media companies fall strictly under national legislation and take full responsibility for their content and services, including a very strong responsibility for the online safety and empowerment of younger audiences.
Alexandra Evans outlined the different measures that TikTok puts in place to protect young people online: users cannot send unsolicited messages; videos and images cannot be sent as private messages; and users under 16 benefit from further enhanced protection measures on the platform, such as a private account by default and the disabling of direct messaging and livestream hosting. For early teens, push notifications stop at 9 PM; for late teens, at 10 PM. Users under 13 are prevented from accessing the platform.
Safety is critical, a foundational pillar when it comes to empowered self-expression, said Ms. Evans. TikTok has established three Trust and Safety hubs and a European Safety Advisory Council with external experts. The company is actively following policy initiatives in Brussels, such as the DSA, the European Commission's Child Safety Strategy and the upcoming legislation on child sexual abuse material. TikTok has welcomed the European Commission's DSA proposal and appreciates the emphasis on transparency as a means of demonstrating accountability.
Leanda Barrington-Leach from 5Rights introduced the audience to the specific vulnerabilities and needs of children online: the digital world is not optional for children, and yet it is a place where their rights are systematically overlooked, ignored, undermined, and trampled. Children are routinely served harmful content, and the impact is real and severe. The goal of the 5Rights Foundation is to ensure that the digital world is designed, or redesigned, to reflect, uphold, and promote children's existing rights. Children's specific needs must be reflected in the design choices of tech companies, and all services likely to impact children must reflect their rights, by design and by default.
In practical terms, we need to (1) lay down the norm, through horizontal legislation such as the DSA, specifying that a child is anyone under the age of 18, (2) set minimum standards, such as the IEEE Draft Standard on an Age Appropriate Digital Services Framework, the Age Appropriate Design Code in the UK, or the euConsent project, and (3) ensure transparency, oversight, and effective, proportional enforcement.