Online Safety

Age assurance is the focus of European regulatory action for the online protection of minors

Published on 13th Nov 2023

What are the requirements and developments around age assurance in Europe and, specifically, in France, Germany and the UK?


One of the main areas of concern at the heart of the public debate about the protection of minors online is age assurance: to give children greater protection, content and platform providers need to check the age of their users. This is particularly important where regulations prohibit or restrict children's access to certain content or services online (for example, pornographic content, the purchase of alcohol, online gambling and betting, and some banking services).

These checks also raise concerns around privacy and cybersecurity – and the law on what is required or permitted is not entirely harmonised across the EU and the UK.

The European Union

Within the EU legal and policy framework for strengthening the protection of minors online, several provisions require providers to check the age of internet users.

In 2018, the revised Audiovisual Media Services Directive introduced an obligation for Member States to "take appropriate measures to ensure that audiovisual media services provided by media service providers [...] which may impair the physical, mental or moral development of minors are only made available in such a way as to ensure that minors will not normally hear or see them" (Article 6a).

The General Data Protection Regulation (GDPR) identifies children's personal data as requiring special protection. For a child to use information society services, the consent of the holder of parental responsibility is required below an age threshold of between 13 and 16 years, depending on the Member State. In the absence of specific national legislation, the age under which parental consent is necessary is set by default at 16.

More recently, the Digital Services Act (DSA) aims to create a safer digital environment and includes provisions to reinforce the protection of minors online. All providers of online platforms accessible to minors in the EU must put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors on their service.

They are also prohibited from showing ads based on the personal data of recipients of the service when they are aware with reasonable certainty that the recipient is a minor.

Very large online platforms and search engines must consider any systemic risks concerning their services, including any actual or foreseeable negative effects on the rights of children and their protection. In terms of protection, this implies the need to consider "how easy it is for minors to understand the design and functioning of the service, as well as how minors can be exposed through their service to content that may impair [their] health, physical, mental and moral development" and to take "targeted measures to protect the rights of the child, including age verification and parental control tools, tools aimed at helping minors signal abuse or obtain support, as appropriate".

The European Commission has also proposed a regulation to prevent and combat child sexual abuse online, which would require online platforms to detect child sexual abuse material in hosted content.

France

Age verification: theory and practice

In France, the law of 30 July 2020 aimed at protecting victims of domestic violence introduced special procedures for blocking websites that do not comply with restrictions on access by minors. French law makes it a criminal offence to produce, disseminate or trade a message that is likely to be seen by a child and is of a violent nature, incites terrorism, is pornographic, or is of a nature that seriously undermines human dignity or incites children to engage in games that physically endanger them. The offence is made out even where the child's access to the message results from a mere self-declaration that the child is at least 18 years old.

Since the adoption of this law, most pornographic websites have continued simply to ask their users to declare that they are over 18. France's Regulatory Authority for Audiovisual and Digital Communication (ARCOM) has started several legal procedures as a result of this continued practice, which are strongly opposed by the website providers. According to these providers, in the absence of a "tried and tested" solution on which there is a consensus, they cannot be asked to do more than a declarative verification of age.

Standards in development

Faced with these challenges, a recent draft bill, known as the bill "to secure and regulate the digital space" (SREN), intends to entrust ARCOM with setting up a binding set of standards to be met by the age verification systems used by pornographic websites, while increasing its enforcement powers.

France's digital regulator will be able to order internet service providers to block non-compliant websites without going before a judge and to impose penalties. The amount of these fines will depend on whether an age verification system exists but has been implemented improperly (up to €150,000 or 2% of pre-tax annual global turnover, whichever is greater) or does not exist at all (up to €250,000 or 4% of pre-tax annual global turnover, whichever is greater).
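
To make the tiered structure concrete, here is a minimal Python sketch of how each tier combines a fixed ceiling with a turnover-based ceiling; the function name and shape are illustrative, not part of the bill:

def sren_fine_cap(turnover_eur: float, has_av_system: bool) -> float:
    """Maximum fine under the tiers described above (illustrative only)."""
    if has_av_system:
        # AV system exists but is implemented improperly:
        # up to EUR 150,000 or 2% of pre-tax annual global turnover.
        return max(150_000, 0.02 * turnover_eur)
    # No AV system at all: up to EUR 250,000 or 4% of turnover.
    return max(250_000, 0.04 * turnover_eur)

# A provider with EUR 50m turnover and no AV system faces a cap of
# max(250,000, 4% of 50,000,000) = EUR 2,000,000.
print(sren_fine_cap(50_000_000, has_av_system=False))  # 2000000.0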

The bill is still being debated before the French Parliament: MPs and senators must meet in a joint committee to agree on a definitive version. Given the fierce criticism the bill has received, however, a referral to the Constitutional Council is not out of the question.

Obligations to be extended

In parallel, the law of 7 July 2023, which aims to create a special age of majority for the "digital space", imposes obligations to protect minors on providers of online social networking services as defined in the DMA: platforms that enable users to connect, share, discover and communicate with each other across multiple devices and that operate in France.

The non-exhaustive catalogue of measures includes obligations to refuse the registration of minors under the age of 15 unless express authorisation is given by one of the holders of parental authority over the minor, and to collect that authorisation for accounts already created by minors under 15.

Echoing the SREN bill, this law specifies that the providers of online social networking services shall use technical solutions that comply with a reference framework drawn up by ARCOM to check the age of users and the authorisation of holders of parental authority. It also gives ARCOM greater enforcement powers, ranging from sending formal notice letters to blocking websites and imposing administrative fines up to 1% of the global turnover.

Online platforms will be subject to this legal framework one year after the law comes into force and will be required to verify the age of existing registrants within two years.

Commission expresses opposition

The French government notified the legislation to the European Commission so that its compliance with EU law could be checked. On 25 October 2023, the Commission issued an opinion objecting to the text. In particular, according to the Commission, requiring age verification on entry to online platforms is contrary to the DSA insofar as this requirement would be applied to foreign platforms.

Unless the Commission's objections derail it, the law will come into force on a date set by decree, which may not be more than three months after the date of receipt of the Commission's response.

Privacy regulator chips in

It is, therefore, now up to ARCOM to devise a robust reference framework in consultation with the French data protection authority (CNIL).

The CNIL's guidelines on age verification systems for pornographic websites should be taken into consideration. The CNIL considered the pros and cons of several existing solutions, including credit card validation, estimation of age through facial analysis, offline verification, verification by analysis of ID documents, and the use of state tools.

The CNIL then defined the "main principles" that should be followed to reconcile data privacy with the implementation of age verification systems: no direct collection of identity documents, no estimation of age based on the user's browsing history, and no processing of biometric data to uniquely identify or authenticate a natural person (for example, comparing, via facial recognition technology, a photograph on an ID document with a selfie). Finally, for the CNIL, the use of independent and trusted third parties to prevent the direct transmission of identifying data about the user to the service provider is of fundamental importance.
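
The trusted third-party principle lends itself to a token-based design: the verifier attests an age attribute without learning which site the token will be presented to, and the site validates the attestation without ever seeing identifying data. The following Python sketch is a hypothetical illustration of that flow, not a description of any actual French scheme; a real deployment would use asymmetric signatures rather than the shared HMAC key used here to keep the example stdlib-only:

import hmac, hashlib, json, time

VERIFIER_KEY = b"demo-key-not-for-production"

def issue_age_token() -> str:
    # The verifier checks the user's age by its own means (not shown), then
    # signs a claim carrying no identifying data: only the attribute
    # "over 18" and a short expiry to limit replay.
    claim = json.dumps({"over18": True, "exp": int(time.time()) + 300})
    tag = hmac.new(VERIFIER_KEY, claim.encode(), hashlib.sha256).hexdigest()
    return f"{claim}|{tag}"

def site_accepts(token: str) -> bool:
    # The website validates the signature and expiry; it never sees an ID
    # document, a name, or any other identifying attribute.
    claim, tag = token.rsplit("|", 1)
    expected = hmac.new(VERIFIER_KEY, claim.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(tag, expected):
        return False
    data = json.loads(claim)
    return data.get("over18") is True and data["exp"] > time.time()

print(site_accepts(issue_age_token()))  # True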

The task is, therefore, far from easy: ARCOM has already highlighted in parliamentary discussions that none of the existing technical solutions can currently be applied in a satisfactory manner, and a public consultation on the issue is expected to be launched this month.

Germany

Contracting: no separate majority rules

In Germany, there is currently no legislative intention to create a special age of majority for the "digital space" that is comparable to the French regulations. The regular rules on legal capacity apply to joining platforms and social networks and otherwise entering into agreements: children under the age of seven do not have legal capacity. 

Children and adolescents from the age of seven up to majority can generally conclude contracts only with the consent of a parent. As such consent is also valid if declared only to the minor, and can moreover be given implicitly, there is generally no obligation to check that consent exists.

Age assurance requirements

For content that is accessible online, the Interstate Treaty on the protection of minors and the Youth Protection Act require age assurance where there is a real potential for danger or impairment, but not generally for access to particular services. German law takes a graduated approach that differentiates according to the content concerned and the degree of possible danger or impairment to young people.

The content listed in section 4(1) of the Interstate Treaty on the protection of minors is absolutely inadmissible and may generally not be distributed either in traditional broadcasting or in online media.

Pornographic content, certain content declared harmful by a federal regulator, and content that is obviously harmful to minors may be distributed on the internet, but only if the provider uses age verification (AV) systems to ensure, through closed user groups, that only adults have access to it.

While the law does not specify any requirements or a binding approval procedure for closed user groups or AV systems, the Commission for the Protection of Minors in the Media (KJM) has developed criteria and a voluntary procedure for the positive evaluation of age verification concepts.

The authority evaluates concepts at the request of companies or providers and publishes a list of positively rated concepts on its website. A positive evaluation gives providers at least "relative" legal and planning certainty.

According to the KJM criteria, an AV system should have a two-level structure. The first level of protection is the one-time identification of the user through personal contact and comparison with official identification documents (identity card or passport), known as "face-to-face control" – for example, by way of the "PostIdent" procedure (verification by presenting the identity card or passport in person at a post office). The second level of protection ensures, through authentication of the user at each individual access to relevant content, that only the identified and age-verified person is granted access to the closed user group.
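
As a hypothetical illustration of that two-level structure (the names and data model below are invented for the sketch and do not correspond to any KJM-approved product), a provider-side flow in Python might look like this:

from dataclasses import dataclass
from datetime import date

@dataclass
class VerifiedUser:
    user_id: str
    adult: bool        # result of the one-time face-to-face check
    credential: str    # e.g. a password or token issued at level 1

REGISTRY: dict[str, VerifiedUser] = {}

def level1_register(user_id: str, birth_year: int, credential: str) -> None:
    # Level 1: one-time identification. In a real system the birth date
    # would come from an in-person check of an identity card or passport.
    adult = date.today().year - birth_year >= 18  # crude, for illustration
    REGISTRY[user_id] = VerifiedUser(user_id, adult, credential)

def level2_access(user_id: str, credential: str) -> bool:
    # Level 2: per-access authentication, so that only the identified,
    # age-verified person enters the closed user group.
    user = REGISTRY.get(user_id)
    return user is not None and user.adult and user.credential == credential

level1_register("alice", birth_year=1990, credential="s3cret")
print(level2_access("alice", "s3cret"))  # True
print(level2_access("alice", "guess"))   # False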

Requirements for unsuitable content

For content that is not outright harmful but is inappropriate for minors, providers have a duty to ensure that it is not usually seen by children or young people of the age groups concerned. The measures do not have to make access impossible: it is sufficient if access is made significantly more difficult.

The law leaves providers with a number of options for complying with this duty. One is age labelling that can be read by an officially approved (user-side) youth protection programme. Outside of closed systems (for example, on websites), an age-de.xml label is typically used for this purpose, which can be read by the officially approved youth protection programme JusProg. If JusProg is installed on the device in use, inappropriate websites can be detected and accordingly not displayed.
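
A minimal Python sketch of the user-side idea: a label declares the minimum age for a site, and the filtering programme compares it against the age configured on the device. The label format below is a simplified stand-in for illustration, not the real age-de.xml schema:

DEVICE_USER_AGE = 12  # configured by a parent on the child's device

# Hypothetical parsed labels: host -> minimum age declared by the provider.
SITE_LABELS = {
    "example-news.de": 0,
    "example-action-game.de": 16,
}

def may_display(host: str) -> bool:
    # Unlabelled sites are handled per the programme's policy; this sketch
    # defaults them to 18 (block-by-default for young age groups).
    min_age = SITE_LABELS.get(host, 18)
    return DEVICE_USER_AGE >= min_age

print(may_display("example-news.de"))         # True
print(may_display("example-action-game.de"))  # False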

Providers can also use (provider-side) technical or other means, such as age verification systems. However, the requirements for these AV systems are lower than for those used to restrict content to closed user groups: only plausible evidence of the stated age must be obtained. An identity card number check, for example, will regularly suffice.
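
By way of illustration, the machine-readable zone of a German identity card encodes the date of birth as YYMMDD followed by a check digit computed under ICAO 9303 (repeating weights 7, 3, 1, sum modulo 10). A plausibility check along these lines, sketched here in Python, validates that check digit and derives an age; it yields plausibility, not proof of identity:

from datetime import date

def icao9303_check_digit(field: str) -> int:
    # ICAO 9303 check digit for a numeric field: weights 7, 3, 1 repeating.
    weights = (7, 3, 1)
    return sum(int(ch) * weights[i % 3] for i, ch in enumerate(field)) % 10

def plausible_age(birth_field: str, min_age: int = 16) -> bool:
    """birth_field: 7 characters, YYMMDD plus its check digit."""
    yymmdd, check = birth_field[:6], int(birth_field[6])
    if icao9303_check_digit(yymmdd) != check:
        return False  # not even a well-formed birth-date field
    yy, mm, dd = int(yymmdd[:2]), int(yymmdd[2:4]), int(yymmdd[4:6])
    today = date.today()
    century = 1900 if yy > today.year % 100 else 2000  # naive century pivot
    birthday = date(century + yy, mm, dd)
    age = today.year - birthday.year - (
        (today.month, today.day) < (birthday.month, birthday.day))
    return age >= min_age

print(plausible_age("9001011"))  # True: born 1990-01-01, check digit valid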

Finally, providers can limit the broadcasting time; for example, by making content suitable only for users aged 16 or over available solely between 10pm and 6am.
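
A short Python sketch of such a watershed check; note that the naive comparison "start <= t <= end" would be wrong here because the window wraps around midnight:

from datetime import time

START, END = time(22, 0), time(6, 0)  # 10pm to 6am, per the example above

def within_watershed(t: time) -> bool:
    # The window crosses midnight, so the two bounds are OR-ed, not AND-ed.
    return t >= START or t < END

print(within_watershed(time(23, 30)))  # True
print(within_watershed(time(12, 0)))   # False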

Rules for platform providers

Two relatively new rules in the German Youth Protection Act also impose obligations on platform providers. First, providers of film and game platforms are not allowed to offer any films or games without sufficient age ratings.

Secondly, host providers, social networks, video-sharing services and communication services that store users' communication content must implement effective structural precautionary measures to shield minors from inappropriate content.

The non-exhaustive catalogue of measures includes providing a reporting and redress mechanism for users to submit complaints about content that is inappropriate or harmful to the development of minors, providing a rating system for user-generated audiovisual content, and providing technical means of age verification for user-generated audiovisual content.

Data privacy

When creating youth protection concepts, attention needs to be paid to data privacy and to aligning the protection of minors with data protection. Under the German Telecommunications-Telemedia Data Protection Act, data collected from minors in order to protect minors (for example, by means of age verification) must not be processed for commercial purposes.

The collection and use of personal data from ID cards, or with the aid of an ID card, may only be carried out in accordance with the special provisions of the Act on Identity Cards and Electronic Identification. The transmission of a photocopy of an ID card for the purposes of identification and legitimation is now permissible with the consent of the card holder. This allows age verification by sending copies of the minor's or legal guardian's ID card in paper form, by email or via video chat. The serial numbers of ID cards may not, however, be used in automated procedures to retrieve or link personal data.

International enforcement

Even providers based outside Germany are not beyond the reach of the German authorities when it comes to effective age verification to German standards. A German supervisory authority issued an order against a Cyprus-based provider of pornographic websites to block access to the websites since, in the authority's view, they were aimed at the German market. The authority's position has been confirmed on appeal at second instance, and further supervisory and enforcement measures against other providers based outside Germany may follow in the near future.

UK

Age assurance and data protection

Following Brexit, the UK has its own version of the GDPR implemented into local law, which contains the same protections for children as the EU version (with the UK opting to set the age at which a minor can consent to data processing at 13).

In addition, the UK's Information Commissioner's Office (ICO) has published a code to protect children's privacy, which sets out how the data protection principles should apply to children and outlines how to assess risks when using age assurance techniques.

The ICO has recognised that age assurance may itself require the use of personal data and that age assurance techniques must take account of the principles set out in the children's code. In addition, the use of age assurance must be proportionate to the identified risks to children (for example, risks such as discrimination).

While outlining several age assurance (and age estimation) techniques, the ICO does not recommend one approach over another, since their appropriateness depends on the information required and the risks to the children in question.

The Online Safety Act

The UK has recently brought the Online Safety Act into law, which aims to protect users (and especially children) from harmful content. It does so by effectively placing a duty of care on "in scope" service providers: those providing user-to-user services, search services and certain other services (such as those that publish or display pornographic content).

Regulated providers will need to carry out assessments of the risk of illegal content appearing on their services. They will then need to take steps to ensure that users do not encounter specified types of illegal content and that, when the content does appear on the platform, it is quickly identified and removed.

The Online Safety Act specifically requires providers to use age verification or age estimation (or both) to prevent children of any age from encountering harmful content. While no exact tools are mandated, the legislation requires these age verification or estimation tools to be "highly effective".

The Online Safety Act also includes provisions for the regulator to issue large fines for non-compliance. Ofcom will act as the online safety regulator and will be able to fine companies up to £18 million or 10% of qualifying worldwide revenue, whichever is greater, if they fail to meet their new duties of care.

While similar in some ways to the EU's DSA – which also has the effect of protecting minors from harmful content – the mechanisms the law uses and the regulator's powers differ. Either way, the Online Safety Act represents a seismic regulatory shift for service providers in the UK.

Osborne Clarke comment

What then are the prospects for improving age assurance in Europe? Despite EU and national legal frameworks establishing, expressly or implicitly, the need for age assurance for access to online services, whether platforms or social networking services, the situation remains unsatisfactory.

Various technological age assurance mechanisms exist, including age estimation (for instance, language analysis or the use of artificial intelligence), self-declaration, and age verification (such as checks of identity documents, credit card checks and facial recognition technology).

However, the more accurate and robust the mechanism, the more likely it is to be intrusive and to infringe users' fundamental rights and freedoms, including the rights to privacy and the protection of personal data and the freedom of speech.

There is also a lack of harmonisation of the legal frameworks across Member States on parental consent verification, which raises compliance challenges for online service providers.

Action is expected at a European level. In May 2022, the Commission updated its "Better Internet for Kids" (BIK+) strategy, which aims to complement and support the implementation of existing measures to protect children online, develop children's skills and empower them to safely enjoy and shape their life online.

Actions stemming from the BIK+ strategy include the drafting of a comprehensive EU code of conduct on age-appropriate design. The strategy also supports the development of methods to prove age in a privacy-preserving and secure manner that are recognised EU-wide and build on the euCONSENT project.


* This article is current as of the date of its publication and does not necessarily reflect the present state of the law or relevant regulation.
