A prominent advocate for online safety has called on the UK government to address the addictive features of social media platforms, as technology companies prepare to enforce new protections for children.
Beeban Kidron, a crossbench peer, urged Technology Secretary Peter Kyle to use the Online Safety Act to establish new codes of conduct covering misinformation and the design features that drive excessive online engagement among young users.
"The Secretary of State has the authority under the Online Safety Act to introduce new conduct codes," Kidron stated. "We have urgently requested action, but so far, the response has been dismissive."
Kidron emphasized that curbing the influence of platforms designed to maximize user engagement—particularly among minors—was not an overreach. "Ministers have the power to intervene and mitigate these effects—why not act now?"
Research by 5Rights, an organization led by Kidron, identified tactics that encourage compulsive platform use, such as displaying like counts, push notifications, and time-sensitive content formats like Instagram Stories.
Kidron spoke to *CuriosityNews* ahead of a 25 July deadline requiring online platforms, including Facebook, Instagram, TikTok, YouTube, X, and Google, to implement child protection measures. Pornography websites must also adopt strict age verification.
According to a study from England’s Children’s Commissioner, Dame Rachel de Souza, X is the most common source of adult content for young people. On Thursday, X announced that users who cannot confirm they are over 18 will have restricted access to sensitive material.
Dame Melanie Dawes, CEO of Ofcom, stated, "Platforms can no longer prioritize engagement over child safety. Companies must comply with age verification and protective measures—failure to do so will result in enforcement action."
Under the new rules, social media firms must prevent children from encountering pornography and harmful content promoting self-harm, suicide, or eating disorders. They must also curb violent, abusive, or bullying material.
Violations could lead to fines of up to 10% of global revenue—potentially billions for companies like Meta. In extreme cases, platforms may be blocked in the UK, and executives could face legal consequences for non-compliance.
Ofcom has detailed measures aligned with child safety requirements, including user reporting systems, enhanced content moderation, and restrictions on high-risk algorithms.