Social media companies and websites across the UK now have a legal duty to protect children online, as the latest stage of the Online Safety Act 2023 comes into force.
As of Friday 25th July, sites featuring harmful or adult content must introduce “highly effective” age assurance techniques.
This “primary priority content” – that which children must be prevented from accessing – includes pornography, and any content encouraging self-harm, suicide, or eating disorders.
If companies fail in this legal duty, they could face fines of up to £18 million or 10 per cent of their qualifying worldwide revenue, whichever is greater.
Ofcom, the UK’s communications regulator, has also said that the most serious cases could see court-imposed sanctions, with sites blocked or restricted in the UK.
If tech companies repeatedly breach their duty of care to children and ignore enforcement notices from Ofcom, their senior managers could be held criminally liable and face up to two years in prison.
Eleven companies are already under investigation for breaching aspects of the Online Safety Act, and Ofcom expects to announce further investigations into sites that host pornography but fail to meet the new age-check rules.
Research from Ofcom found that eight per cent of UK children aged eight to 14 had visited a porn site or app within a month, including three per cent of eight- to nine-year-olds. Porn sites will now have to enforce rigorous age checks in line with the latest regulations.
Suggested methods include facial age estimation, open banking, digital identity services, credit card checks, email-based age checks, mobile network operator checks, and photo-ID matching.
Over 6,000 websites containing adult content have introduced age checks, including major pornography provider PornHub. X and Telegram say they are using facial scans to check users’ ages, while Discord and Bluesky claim they are giving users a variety of age verification options. Reddit introduced age checks for forums and threads containing mature content two weeks ago.
Social media platforms are also now responsible for creating age-appropriate experiences for children online. They must assess the risks their services pose to children and introduce age-appropriate restrictions.
Online sites must have clear, accessible ways for children and adults to report content, alongside procedures for quickly taking down anything dangerous. They must also identify an individual responsible for children’s safety, and conduct annual reviews of how they are managing risks to young users.
Both Meta and X have outlined their methods for ensuring that children only view age-appropriate content. Meta says its teen account feature, on by default for anyone under eighteen, gives these users an “age appropriate” experience. Meanwhile, X defaults users to sensitive content settings if it cannot verify that they are over eighteen, preventing them from seeing adult content.
Technology Secretary Peter Kyle stated that these latest regulations mark “the biggest step forward” for children’s online experience since the internet’s creation, with Ofcom CEO Melanie Dawes describing their introduction as “a really big moment”. However, she also noted that it would be a “challenging path ahead”.
Some of the new age verification technologies have faced a backlash from users concerned about their privacy. While Ofcom has assured users that “age checks can be done effectively, safely, and in a way that protects your privacy”, adult users have raised concerns about data privacy, particularly data leaks.
According to antivirus provider McAfee, age verification systems have built-in protections so that adult websites do not receive users’ personal information, and they are compliant with data protection laws, keeping individuals anonymous. Nevertheless, McAfee recommends choosing facial age estimation rather than photo-ID checks where possible to reduce data sharing.
Others have suggested that individuals will try to use VPNs to circumvent the age verification process. Since the new regulations came into force last Friday, VPN apps have topped the download charts on Apple’s App Store in the UK.
However, Ofcom has said that platforms must not host, share, or permit content that encourages the use of VPNs to get around age checks, with the government stating that it would be illegal for platforms to do so.
The government has said that “[p]rotecting children is at the heart of the Online Safety Act”, with Kyle saying he has “high expectations” for the new regulations.
Some campaigners suggest the new rules do not go far enough, with the Molly Rose Foundation calling for additional changes, particularly to how content is regulated.
To track the success or otherwise of these regulations, Ofcom has launched a monitoring and impact programme. The programme focuses on sites where children spend the most time, including TikTok, Instagram, YouTube, Roblox, and Facebook.
Sites included in the programme must submit a review of how they have assessed risks to children on their services by 7th August this year.
Featured image via Primakov / Shutterstock.