
UK Tech Firms Face Fines If They Fail to Overhaul Algorithms and Age Checks

April 24, 2025 7:31 PM

New Regulations to Protect Children Online

In a bid to safeguard children from harmful content online, UK tech firms have been instructed to overhaul their algorithms and strengthen age verification processes. The UK media regulator, Ofcom, has unveiled its final “Children’s Codes,” which impose stricter rules on platforms that host harmful content or target young users.

Platforms hosting pornography or content promoting self-harm, suicide, or eating disorders must take swift action to prevent children from accessing such material. Failure to comply with these regulations could lead to substantial fines, Ofcom warned.

The Importance of Age Checks

Ofcom’s Chief Executive, Dame Melanie Dawes, described the new rules as a “gamechanger,” emphasizing the importance of accurate age verification. She stated that without proper age checks, platforms cannot offer age-appropriate experiences to users, especially children. While some critics argue the codes don’t go far enough, Dawes remains confident the measures represent a major step forward.

Call for Stronger Action

Although the rules are being hailed as a crucial advancement for children’s safety, some, like Ian Russell of the Molly Rose Foundation, have expressed dissatisfaction. Russell, who founded the organization after his daughter, Molly, took her own life at the age of 14, criticized the codes for lacking ambition.

Despite the concerns, Dame Melanie stressed that the regulations have legal power. “Companies will need to change how they operate if they want to continue offering services to children in the UK,” she said.

Tackling Harmful Algorithms

A key aspect of the new rules focuses on altering the algorithms that determine what children see online. Technology Secretary Peter Kyle highlighted how harmful content often ends up in children’s feeds, even if they do not actively seek it. He stressed the importance of addressing these algorithms to protect young users.

Kyle also mentioned the potential for a social media curfew for children under 16, though he stated further research was necessary before implementing such measures.

Key Measures in the Children’s Codes

The new regulations, which are still subject to parliamentary approval under the Online Safety Act, include over 40 specific actions tech firms must take:

  • Algorithm adjustments to prevent harmful content from appearing in children’s feeds.

  • Robust age verification systems for accessing age-restricted content.

  • Quick removal of harmful material once detected.

  • Clear and easily understandable terms of service for young users.

  • Options for children to decline invitations to group chats that may expose them to harmful content.

  • Support for children who encounter distressing content.

  • A designated person at each company responsible for children’s safety.

  • Annual risk assessments regarding children’s safety.

Ofcom also stated it has the authority to impose fines and, in extreme cases, block apps or sites from being available in the UK if companies fail to comply.

Reactions and Calls for Further Action

The NSPCC has largely welcomed the new rules, recognizing them as a significant step for online safety. However, the charity has urged Ofcom to address private messaging apps, many of which use encryption that prevents platforms from monitoring potentially harmful content shared in private conversations.
