Ofcom to tell social media sites to protect children from adult content

Ofcom will this week unveil codes of practice to prevent children from accessing adult content on platforms such as X and Meta under the Online Safety Act, which has emerged as a potential flashpoint in UK-US trade talks.

Britain’s media regulator will tell social media, search and gaming services that they must either remove or “age-gate” adult content such as pornography, or find other ways to protect children from certain “legal but harmful” content, said people familiar with the matter.

The act, which is being introduced in stages after passing into law in 2023, marks one of the largest ever shake-ups to how Britons access social media, including mainstream platforms such as Instagram, X and Facebook.

Melanie Dawes, Ofcom chief executive, last year told the Financial Times that the industry faced a “big change” in how it operated. 

In practice, the codes mean far-reaching changes to how algorithms serve up adult content, the removal of some content entirely, and tough new age checks to stop under-18s accessing sites and apps that carry any adult content.

Social media sites may need to use strict age verification tools, such as requiring credit card details for the first time, or adopt technology such as facial age estimation.

Tech groups will have a number of other ways to stop children from seeing adult material, including offering “clean” areas stripped of the sort of pornography that is often commonplace even on social media sites whose age limits are below 18.

Alongside pornography, under-18s should no longer encounter posts about suicide, self-harm and eating disorders, and should be protected from misogynistic, violent, hateful or abusive material, Ofcom will say.

The codes suggest practical measures that platforms can take to meet their duties, including configuring algorithms to filter out harmful content from children’s social media feeds and internet searches.
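By way of illustration only, the sketch below shows the kind of feed-level filtering the codes describe, assuming a platform already labels posts by category and knows whether a viewer is under 18. The category names, the Post class and the filter_feed function are hypothetical, not terms drawn from Ofcom's codes.

```python
# Hypothetical sketch: filtering age-restricted content out of a feed
# for users identified as under 18. Category labels are illustrative only.

from dataclasses import dataclass

# Categories the codes expect platforms to keep away from children
# (illustrative labels, not Ofcom's terminology).
HARMFUL_FOR_MINORS = {
    "pornography",
    "suicide",
    "self_harm",
    "eating_disorder",
    "violent",
    "hateful",
    "misogynistic",
}

@dataclass
class Post:
    post_id: str
    labels: set[str]  # content labels assigned by upstream moderation

def filter_feed(posts: list[Post], is_minor: bool) -> list[Post]:
    """Drop posts carrying harmful labels when the viewer is under 18."""
    if not is_minor:
        return posts
    return [p for p in posts if not (p.labels & HARMFUL_FOR_MINORS)]

# Example: a 15-year-old's feed with one flagged post removed.
feed = [
    Post("a1", {"sports"}),
    Post("a2", {"self_harm"}),
]
print([p.post_id for p in filter_feed(feed, is_minor=True)])  # ['a1']
```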

Tech companies had until last week to carry out a so-called children’s access assessment to establish if their service — or part of their service — was likely to be accessed by children. Facebook, Instagram, Snap, X and TikTok all allow users from the age of 13.

Tech groups have until the end of July to complete a separate assessment of the risk their service poses to children and then start applying the measures to mitigate risks. Companies that breach the act face fines of up to £18mn or 10 per cent of worldwide revenue.

Ofcom is also set to launch an additional consultation on further measures, including the use of artificial intelligence to tackle illegal content and the use of hash matching to prevent the sharing of non-consensual intimate imagery and terrorist content. 

Hash matching, or hash scanning, converts pieces of content such as videos, pictures or text into digital fingerprints and compares them against a database of fingerprints of known illegal content.
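As a rough sketch only: the snippet below shows exact hash matching with SHA-256, fingerprinting a file and checking it against a set of known hashes. Production systems of the kind Ofcom is consulting on typically use perceptual hashing so that near-duplicates also match; the database entry here is a placeholder.

```python
# Minimal sketch of hash matching: fingerprint a file and check it against
# a database of hashes of known illegal content. SHA-256 only catches exact
# copies and is used purely for illustration.

import hashlib

# Hypothetical database of hex digests of known prohibited material.
KNOWN_ILLEGAL_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of_file(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def is_known_illegal(path: str) -> bool:
    """True if the file's fingerprint appears in the hash database."""
    return sha256_of_file(path) in KNOWN_ILLEGAL_HASHES
```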

The watchdog will also propose crisis response protocols for emergency events such as last summer’s riots.

Parts of the Online Safety Act, such as the duties requiring social media companies, search engines and messaging apps to remove illegal material quickly and reduce the risk of it appearing in the first place, have already come into force.

But online safety campaigners are concerned the US will demand the legislation be watered down as part of any trade negotiations, given the new obligations it places on largely US-based social media sites.

US officials asked about the act in a meeting with Ofcom last month, while vice-president JD Vance raised free speech concerns relating to American tech companies when UK Prime Minister Sir Keir Starmer visited the White House in February.

“I can’t imagine a scenario in which Keir Starmer’s government would offer up child safety for a trade deal, because to do so would be in effect to make it unfit to serve,” said Baroness Beeban Kidron, a crossbench peer in the UK House of Lords and digital rights campaigner.

Snap said it was “supportive of the goals of the Online Safety Act, and continues to work with Ofcom on its implementation”. 

Meta said all UK teenagers using its platforms, including Instagram and Facebook, had been moved to new “Teen Accounts” to help comply with the new regulations, although people aged 17 and 18 can disable the restrictions.  

X said it was “taking all required steps to ensure compliance with UK law”, while TikTok said it would also comply with provisions.
