Can the EU’s Digital Services Act Redefine Children’s Right to Privacy?

How can platforms be made responsible for putting children at risk when they don’t recognise that minors use their services?

Sep 7, 2023

Tech’s rapid evolution has brought concerns about its most vulnerable participants, children, to the forefront. Startling data reveals that one in three internet users is a child under 18, and young people in the EU spend almost twice as much time online as they did a decade ago.

Digital Transformation ≠ Digital Equality

Many online platforms fail to prioritise children’s data protection because they were not designed with younger users in mind. Although age restrictions are generally set at 13, they are frequently ignored, and a recent study pointed out the inadequacy of age checks and parental consent mechanisms on many digital services.

This raises the question: how can platforms be held responsible for putting children at risk when they do not acknowledge that minors use their services?

“Technology (or its absence) is now a defining feature of childhood. Children live in a digital world designed by adults, for adults, and driven by commercial interests.” — 5Rights Foundation

While it’s universally recognised that everyone is entitled to the right to privacy as protected by international and national human rights frameworks, the term ‘everyone’ frequently defaults to adults.

Negligence Has a Price

In the past few years, regulatory bodies across the world have stepped up their efforts to ensure the online safety of minors. Various entities have faced penalties and restrictions for practices that put the privacy of young users at risk.

Last year, the Irish Data Protection Commission fined tech giant Meta €405 million for violating GDPR rules on the processing of children’s data on Instagram. The decision demonstrated the consequences of lax oversight at tech giants and highlighted the need for effective monitoring and regulation to prevent such violations.

TikTok also came under the ICO’s spotlight. Although the platform sets an age limit, it allowed an estimated 1.4 million UK children under 13 to access its content in 2020 without parental consent. The ICO initially proposed a £27 million fine for this failure but ultimately settled on a £12.7 million penalty. The ICO has also introduced the Children’s Code, which sets 15 standards for a safer online environment for minors.

The takeaway? Safeguarding children’s online data is not just a moral imperative; it’s a mandated one.

Advancements in Children’s Digital Rights and Policy

International and EU authorities have recently prioritised children’s digital rights, recognising that children should enjoy the same rights online as they do offline. Consulting children directly helps policymakers better understand how digital tools affect their rights, what they prefer online, and which regulations they would like to see implemented.

Both public and private entities are urged to engage children in conversations when drafting policies or designing products that directly impact them. Digital tools are also being promoted as an effective way to encourage children to participate more actively in policymaking.

  • The European Union has long been a proponent of safeguarding children’s rights. The EU Strategy on the Rights of the Child, launched in 2021, aims to make digital technology a safe and empowering space for children. The strategy calls on businesses to prioritise children’s rights, including privacy, personal data protection, and access to age-appropriate content.

  • The EU Artificial Intelligence Act is the first regulatory framework for assessing AI systems by risk level. Systems deemed to pose an unacceptable risk to individuals and society include those that manipulate people’s behaviour, particularly that of vulnerable groups; voice-activated toys that encourage hazardous behaviour in children are one example.

  • The OECD’s Recommendation on Children in the Digital Environment proposes a balanced approach to the digital environment for children. It outlines principles applicable to both the public and private sectors and emphasises children’s best interests, digital literacy, and evidence-based action.

  • The UK Age Appropriate Design Code (the ICO’s Children’s Code) sets 15 standards for children’s data protection online. It mandates high default privacy settings, minimises data collection, restricts data sharing, and curtails nudging techniques. The Irish Data Protection Commission has developed a similar GDPR-based code of practice, as have data protection authorities in Sweden and the Netherlands.

  • On July 28, 2023, the US Senate Committee on Commerce approved two children’s online privacy bills — the Kids Online Safety Act (KOSA) and the Children and Teens’ Online Privacy Protection Act (COPPA 2.0). KOSA provides parental tools and places new responsibilities on tech firms, while COPPA 2.0 raises the age of protection and bans targeted advertising to minors.

  • The UK’s Online Safety Bill will hold tech firms accountable for harmful content on their services, with a particular focus on protecting children. Ofcom will oversee enforcement, with substantial fines for non-compliance. The Bill is currently under review in the House of Lords.

Children’s Rights in the DSA — A Beacon of Hope?

This promising policy landscape finds a potential stronghold in the EU’s Digital Services Act (DSA), whose obligations for the largest platforms began to apply on August 25, 2023. The Act’s basic premise is that what is illegal in the real world must also be illegal in the online world.

The new rules are cognisant of the challenges younger users face and seek to reinforce children’s rights in the digital space. Jim Steyer, CEO of Common Sense Media, stated:

‘The EU has an opportunity to rewrite the rules with the Digital Services Act and it should seize that opportunity and require that companies take a serious look at how their product design is affecting young people through child risk and impact assessments.’

The Act addresses several distinct aspects of online safety for minors:

  • Upholding Children’s Rights — The DSA references the Convention on the Rights of the Child and General Comment No. 25, recognising children’s rights in the digital environment. Because the Convention defines a child as anyone under 18, this protection extends to minors of all ages.

  • Swift Action Against Harmful Content — The Act mandates the prompt removal of illegal content that victimises children, such as child sexual abuse material and the non-consensual sharing of private images. The requirement for immediate action is particularly significant given research showing how widespread image-based sexual abuse of children has become.

  • Mandatory Risk Assessments — The DSA requires Very Large Online Platforms (VLOPs), those with more than 45 million monthly active users in the EU, to conduct annual risk assessments. The objective is to identify potential threats to children’s privacy, freedom of expression, and other fundamental rights. The mandatory assessment acknowledges that VLOPs bear greater responsibility for safeguarding minors’ physical and mental well-being, given the influence of their algorithms.

  • Restricting Advertising and Data Practices Targeting Minors — The DSA bans targeted advertising based on the profiling of minors. Profiling that relies on sensitive categories of personal data, such as sexual orientation or religious beliefs, is also prohibited. These rules offer young users additional protection from manipulative digital marketing tactics.

  • Understandable Terms for Minors — Platforms mainly used by, or targeted at, minors must make their terms of service easy to understand. The aim is to ensure that children are not just passive digital consumers but are equipped with the knowledge they need to navigate the online world safely.

EU Sets the Global Standard

If left unregulated, digital service providers can expose children to inappropriate content such as pornography, dieting, and self-harm material. They may encourage children to spend money and gamble, allow strangers to track their location and contact them directly, and, in some cases, even let strangers see into their bedrooms through livestreams.

Implementing the Digital Services Act is a significant step towards more robust safeguards for risk management, disinformation, transparency, and content moderation on online platforms. It makes clear that the safety of children is a non-negotiable aspect of doing business in Europe.

Children are looking to political leaders to take action and create regulations that guarantee their safety online. The EU must not miss this opportunity.
