
Charities warn Ofcom too soft on Online Safety Act violators

(2025/09/19)


As UK ministers continue to quiz stakeholders over the effectiveness of the Online Safety Act, one charity chief raised concerns over the robustness of Ofcom's enforcement of the controversial legislation.

Asked about how well the communications regulator has enforced penalties on organizations that violate the OSA, or fail to implement the required safeguards, Andy Burrows, CEO of the Molly Rose Foundation, said: "I do not get the impression that the companies are quaking in their boots at Ofcom's enforcement approach."

The Molly Rose Foundation was established in 2017 by the family of 14-year-old Molly Russell, who took her own life. Her father discovered she had viewed thousands of images online promoting suicide and self-harm.

Burrows went on to say that some enforcement action may already be taking place under Ofcom's supervision process, but that this is opaque, making it difficult to say whether it will be "sufficiently industrious."

Supervisory action following OSA violations sees Ofcom working with an organization to improve their compliance with the legislation, either before or instead of issuing harsher penalties, such as fines.

The maximum fine for violating the OSA is £18 million ($24.3 million) or 10 percent of annual worldwide revenues, whichever is greater.

Baroness Kidron, founder of the 5Rights Foundation, agreed with Burrows, adding that if Ofcom doesn't truly test its regulatory powers, then it will not be able to provide the right evidence to ministers if it ever wants part of the OSA amended.

"I would absolutely say in defense of Ofcom that the act is wrong in certain places and does leave certain gaps, and will need some more work," Kidron said.

"The frustration is that, actually, where it is clear and where it is mandated, we don't want to see [Ofcom] stroking [platforms] and saying, you know, 'come on guys, do it, do it, do it.'

"We want to see them taking action, being robust, and I'm very sympathetic to Ofcom in anywhere where they feel they need a power and it has not been provided by Parliament."

Burrows' comments followed a question from the Communications and Digital Committee about mandatory age assurance and whether Ofcom has learned any lessons from the rollout that could be applied if the rules are extended to other organizations, such as [5]VPN providers.

Baroness Kidron focused on the regulator needing to ensure age assurance mechanisms are privacy-preserving, and not demanding more data than is necessary to verify a user's age.

"I think that what I would now like to see is a recommitment on privacy, and I would also like to see Ofcom use its powers to say that where it is not privacy-preserving, age assurance has not met the bar of being highly effective because it's not highly effective in a cultural sense, even if it actually determines whether you are 18 or not."

As part of the OSA - which is a set of UK laws "that protects children and adults online" and puts "new duties" on social media and search services to remove illegal content - Ofcom stipulates that in-scope platforms must deploy highly effective age assurance (HEAA) systems.

The regulator published [6]guidance [PDF] on what constitutes an HEAA system in January, including a non-exhaustive list of what it thinks should meet the criteria.

These include: open banking; photo-ID matching; facial age estimation; mobile network operator age checks; credit card checks; email-based age verification; and digital identity services.
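To make the data-minimization point concrete – and this is purely an illustrative sketch, not Ofcom's specification or any provider's real API – a privacy-preserving check of the kind Kidron describes would hand the platform only a yes/no answer, never the document, name, or birth date behind it:

from datetime import date
from typing import Optional

# Hypothetical sketch only. Whichever method supplies the date of birth
# (photo-ID matching, open banking, a mobile operator check, etc.), the
# relying platform receives a bare over-18 attestation and nothing else.

def is_over_18(date_of_birth: date, today: Optional[date] = None) -> bool:
    """Return True if the person is at least 18 years old on `today`."""
    today = today or date.today()
    years = today.year - date_of_birth.year
    # Knock a year off if this year's birthday hasn't happened yet
    if (today.month, today.day) < (date_of_birth.month, date_of_birth.day):
        years -= 1
    return years >= 18

def age_assurance_token(date_of_birth: date) -> dict:
    """Issue a minimal attestation: no name, no birth date, just the claim."""
    return {"over_18": is_over_18(date_of_birth)}

# The platform sees only {'over_18': True}
print(age_assurance_token(date(2000, 1, 1)))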

Kidron said that making these solutions privacy-preserving is key to building the public's trust in them – trust that quickly waned after the new rules came into force in July.

Burrows highlighted the early days of the latest OSA measures coming into force, when people quickly found crude workarounds for age verification – some using [7]video game avatars to pass as adults – as an example of why public trust has diminished.

"That would raise questions to me about whether that is a highly effective measure that is being deployed," he said. "So, I would like to see Ofcom act quickly because public trust here is precious.

"Clearly, there do need to be privacy-preserving mechanisms, and we know that they exist. And I think enforcement is now the best way of being able to demonstrate to the public A) that this can be done, and B) that where we have seen high-profile examples which have generated public concern, that that is a reflection as to whether or not we have seen compliance rather than as it has been framed, whether or not this can be done."

According to Lord Vaizey, speaking in the House of Lords on Monday, Ofcom has begun investigating 47 websites and apps suspected of non-compliance since it mandated HEAA systems for in-scope platforms.

Safe harbor

Another hotly debated topic was the safe harbor provision in the OSA – if in-scope platforms implement every measure Ofcom recommends, then they can rest assured they won't be punished under the act, even if something goes wrong.

The idea is to give platforms certainty: even if one of Ofcom's recommendations turns out to be flawed, whatever happens as a result of that failure can't bite them until Ofcom changes its tune.

However, that safety net may come at a cost: it gives platforms an incentive not to innovate or go beyond what's required of them.

"The way that it's worked in the Online Safety Act is that if you do Ofcom's 44 measures, or whichever number it is now, then you're fine," said Kidron.

"Now you can choose to do something different, but if you do something different, you don't have safe harbor, and that's a problem because if what you could do is different, quicker, better, more advanced, more thoughtful, more nuanced, you could actually not have safe harbor for doing the better thing, and I think that's why we're talking about it as a negative incentive in this instance."

[9]

Rani Govender speaks at a Communications and Digital Committee hearing alongside Andy Burrows and Baroness Kidron

The committee asked why the safe harbor provision is removed when companies aim to do better. Rani Govender, policy and influencing manager at the NSPCC, said that larger platforms are likely a step ahead of others in identifying trends in online harms, because they can collect more data than the regulator.

[10]UK Lords take aim at Ofcom's 'child-protection' upgrades to Online Safety Act

[11]Experts scrutinized Ofcom's Online Safety Act governance. They're concerned

[12]UK toughens Online Safety Act with ban on self-harm content

[13]Good morning, Brit Xbox fans – ready to prove your age?

Govender pointed out that these platforms may have access to information that could improve the act – by compelling all platforms to act against new harms – but there is little benefit to them in doing so.

"Now, if they're spotting new trends, new ways that harms are developing on their platform, but there isn't anything in the codes of practice that addresses that, then there is no obligation on them to address those harms," Govender said.

"So, we're thinking about how do we stay on top of emerging harms. Well, there has to be something that forces companies, once they've identified them, to immediately take action and look at what they could do to mitigate them, and at the minute there is not that incentive there."

Evolving online harms

The resulting discussion also led to concerns being raised about Ofcom's ability to keep pace with the latest online threats, and regulate accordingly.

Burrows acknowledged that Ofcom was doing a great job of understanding and articulating the risks related to matters such as child sexual abuse, and regulating these widely known issues.

However, he said there are newer harms that worry the Molly Rose Foundation "tremendously," namely the threat posed by [14]Com groups.

The National Crime Agency (NCA) issued an [15]alert earlier this year warning of the dangers associated with com groups, composed largely of teenage boys.

Reports of these groups increased sixfold between 2022 and 2024, and the agency said they cause harm via a broad spectrum of criminality.

Regular Reg readers will be familiar with the idea of Com networks and their involvement in cybercrime – from data breaches to fraud to ransomware. Com groups are sometimes made up of a [16]new generation of English-speaking cybercriminal.

Also of special concern are the Com groups that share misogynistic content, sexual abuse material, and in Burrows' experience, those that groom even younger people into carrying out acts of self-harm.

He told the committee: "We are seeing them commit a whole range of truly appalling harms, including, essentially, a new type of grooming focused on suicide and self-harm driven by sadistic behaviors.

On something like child sexual abuse, I think Ofcom is doing a very good job of understanding and articulating the risks, but on these newer harms, they are not where they need to be – Andy Burrows, CEO of the Molly Rose Foundation.

"Now as someone who's worked in this space for decades, I have to say this is probably one of the threat types that I find more disturbing and chilling than anything else that I have seen," he added.

"There are law enforcement agencies who are queuing up to say that this is a huge concern that we are starting to see children, and particularly girls, being groomed for purposes of self-harm and suicide.

"The most appalling egregious acts of harm, stories of girls being coerced into self-harm acts relating to the groups and, you know, I've heard from parents here in the UK who are desperate to see the regulator take action, and some of those risks are not being recognized."

An Ofcom spokesperson told The Register: "Online safety rules came into force recently and change is already happening. The majority of the top 100 most popular adult sites in the UK have now deployed an age check, accounting for nearly two thirds of daily visits to adult sites in the UK. We've also seen popular social media, dating, gaming, and messaging apps introduce age assurance to prevent children accessing harmful content.

"We're holding platforms to account and launching swift enforcement action where we have concerns. We've already launched investigations into 69 sites and apps, and expect to announce more in the coming weeks and months.

"Technology and harms are constantly evolving, and we're always looking at how we can make life safer online. We've already put forward proposals for more protections that we want to see tech firms roll out."

Ofcom chief executive Melanie Dawes will field questions from the committee during a session next week, as part of a continuing effort to gather views on the regulator's latest proposals. ®




[5] https://www.theregister.com/2025/07/28/uk_vpn_demand_soars/

[6] https://www.ofcom.org.uk/siteassets/resources/documents/consultations/category-1-10-weeks/statement-age-assurance-and-childrens-access/part-3-guidance-on-highly-effective-age-assurance.pdf?v=395680

[7] https://www.theregister.com/2025/07/31/banning_vpns_to_protect_kids/

[9] https://regmedia.co.uk/2025/09/19/osa_comms_committee_hearing_16_09_2025_parliamenttv.png

[10] https://www.theregister.com/2025/09/15/uk_lords_take_aim_at/

[11] https://www.theregister.com/2025/09/11/concern_and_sympathy_as_experts/

[12] https://www.theregister.com/2025/09/09/selfharm_online_safety_act/

[13] https://www.theregister.com/2025/08/28/xbox_online_safety_act/

[14] https://www.theregister.com/2025/07/23/irl_com_recruits_teens/

[15] https://www.nationalcrimeagency.gov.uk/news/sadistic-online-harm-groups-putting-people-at-unprecedented-risk-warns-the-nca

[16] https://www.theregister.com/2025/09/18/two_teens_charged_in_tfl_case/




zimzam

This all seems like parents not wanting to deal with the fact that they didn't notice their children were thinking about harming themselves, so they've convinced themselves that "the internet" tricked them into it.

Yet Another Anonymous coward

And the solution is to have history sites, talking about VE Day, self-censoring the fact that Hitler "self harmed" in the bunker to effectively end the war in Europe.

elsergiovolador

Drinking alcohol is self-harm. Every movie that promotes alcohol use - for example, by showing an iconic actor happily sipping a drink - should be censored.

Yet Another Anonymous coward

Now listen to me Bond. We have to think of the children, no drinking, no swearing, no touching ladies

Can I still shoot people in the face, M?

Of course - it's a free country !

Anonymous Coward

You wouldn't even hear about these parents if the government wasn't hell-bent on using them as useful tools to force through their censorship agenda.

Anonymous Coward

It pains me to say it, but there seems to be a fashion for eponymous laws as a method of expiating guilt by blaming "the other" for one's own neglect (even if that neglect is only perceived); which is replacing any fundamental and critical examination of the problem at hand. Teen suicides were a thing long before the internet, and I'm not sure that coroner's reports and evidence from parents are giving us a complete picture of the internal anguish of the deceased, let alone the ways to divert them from self-destruction.

Wang Cores

Again there is no more wealth for the rich to steal so they're working their way to making us tenant-farmers again.

This is just setting up the mechanism for disrupting collective action against it.

Seems fair enough

JimmyPage

After all, data breaches aren't treated with any seriousness either.

Just make sure you don't misgender people and you'll be OK.

Re: Seems fair enough

Anonymous Coward

Is it so hard to be a decent human being?

Government

elsergiovolador

Government websites still have not implemented age checks. You can see the bare c*nts without any verification.

Lead by example, eh?

Hah....Another Talking Shop About "Enforcement"

Anonymous Coward

Yup.....millions are using VPN technology....so the Online Safety Act is actually a joke.

Yup.....no one is enforcing the 30mph speed limit in streets in London.

Yup.....GDPR is a joke.

Yup.....Ofwat can't stop our rivers overflowing with sewage.

Yup....."PREVENT" doesn't prevent anything.

Oh....and about the SW1 chatter about enforcement.........just window dressing!

Oh...and another thing.....The Plod don't have adequate IT resources for "enforcement"....see:

- Link: https://www.theregister.com/2025/09/09/nca_legacy_tech/

ALL PURE THEATRE!!!

Tubz

Google is one of the biggest rule breakers: if you use their image search, you can easily see stuff that falls foul of the OSA, yet Google has not been fined 1 penny. A case of if you're too big and American, you can get away with breaking laws?

Helcat

Some of the sites are using some common sense (I know: Doesn't sound likely, but apparently it's happening).

For example: the age of the account. If it's over x years old, the holder has to be an adult. Similarly, if the account has records of payments via a credit card that matches the user, that's proof the person is an adult. Things like that.

With Google, you have to turn off safe search to see the adult stuff in images. And it's only pointing people to the websites that provide that material, and those are the sites that have to do the age verification (or just outright block users from the UK, but you still get the thumbnails. Yes, that even happens when you're looking for 'SFW' pictures and try clicking on a site that also supplies 'NSFW' stuff).

Jamie Jones

YouTube, owned by Google, is constantly broadcasting scam ads. I reported two in the last few months, and in both cases got the reply "we've determined that the ads don't go against our policy, so no further action will be taken".

The ASA says to let them know, and they'll try and help the ad companies know which adverts are fake, and need removing. Nothing about suing their arses for allowing such ads to be broadcast.

If I was the owner of a terrestrial TV channel whose ad revenue has been decimated by online ads, I'd be pushing this issue all the way.

Can you imagine if ITV showed adverts for:

"The energy companies tried to ban us from selling this. This amazing small device will heat your home for pennies" [1]link

"The electricity companies hate us for this device - it reduces your electricity power by up to 20%"

"This torch is so powerful, the military tried to get it banned"

"The internet providers want to ban this product. It doubles the speed of your internet

"Don't let the cable companies know. For only (something pounds - I.e. translated to a UK audience), this special device plugs into your TV and gives you free channels" [it's a portable TV aerial]

"This media stick gives you all the premium streaming channels, and pay-tv channels for no monthly cost"

[1] https://adstransparency.google.com/advertiser/AR07615603342298841089?origin=ata&region=GB

The same process is not universally appropriate.

Tron

The rules for the internet cannot match the rules for newspapers and broadcast TV without killing it.

How late would your train or bus be if you had to go through airline-style security before boarding? Is it OK to be less safe on a train or bus? Yes. Because it isn't viable to treat them like airlines.

F***wits

Anonymous Coward

[Caution: May contain traces of rant.]

The worst way you could do this is throw it to every individual site and service, with a long and vague list of requirements, and say "Comply!". That a Baroness now wants to be even harsher (because it's apparently not working) shows just how ridiculously out of touch the "ruling" class is with the online world (and the real world, tbh). Enforce, provide safe harbour and innovate at the same time? They're pulling in different directions!

Let's just legislate for the existence of unicorns, shall we? What? There aren't any unicorns yet? Obviously "Ofunicorn" needs more powers (because it couldn't possibly be that the legislation was totally f***ed in the first place).

People using VPNs? Quick, shut that down! (Or could it be that the idea was terrible from the get go?)

Without endpoint control, kids will find workarounds. If individual sites and services collect IDs, privacy will never be achieved. This is an architectural problem that band-aid fixes won't resolve. And it's frustratingly plain that privacy was never seriously considered when this harebrained scheme was cooked up.

The only reasonable way to protect kids online is to gate their access at the endpoint (device, laptop, desktop). Enterprises enforce endpoint security for good reason. This has been covered in detail, and it's a rather long discussion, so I won't repeat it all here but, for a quick example, give the kids a school laptop and a dumb phone, prevent them from using anything other than those or a school/library computer, and white-list their access. That'd also be cheaper for parents.
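A minimal sketch of that device-side allow-list idea, with hypothetical domain names – in practice it would live in an MDM profile, a filtering DNS resolver, or proxy rules rather than application code:

from urllib.parse import urlparse

# Illustrative only: the child's device (not every website on the internet)
# decides what can be reached, against a parent- or school-managed allow-list.
ALLOWED_HOSTS = {
    "www.bbc.co.uk",
    "en.wikipedia.org",
    "vle.school.example",   # placeholder for the school's own services
}

def may_visit(url: str) -> bool:
    host = (urlparse(url).hostname or "").lower()
    return host in ALLOWED_HOSTS or host.endswith(".school.example")

print(may_visit("https://en.wikipedia.org/wiki/History"))  # True
print(may_visit("https://randomsite.example/"))            # False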

That leaves the adults alone to do whatever the heck they want in peace, privacy preserved, and even with the prospect of being able to enhance user privacy with zero impact to kids safety.

But, no. Throw vague requirements and poorly-written legislation to the winds and jump up and down when it dismally fails, because that's how they think. Or, rather, don't.

The country deserves better than to be "ruled" by a bunch of f***wits regardless of whatever good intentions they may have had at the beginning.

Re: F***wits

Anonymous Coward

Well said. Completely agree.

Similar thinking:

Instead of stopping sewage from entering the sea, let's instead just scoop it up from the sea itself.

Stop it at source!

OSA is impossible to enforce

Fonant

The OSA is all-encompassing, vague, and impossible to enforce. But it appears to "do something" about "bad things", so the law must be "good".

The sooner government, and the population, realise that regulating international internet services is impossible (until we get rid of country borders and have a single global legal system for everyone) the better.

The Bad People are not going to stop what they're doing just because Ofcom asked them nicely, or started an investigation into them.

Allowing "dodgy websites" to avoid the OSA by geoblocking "UK" IP addresses is a classic symptom of the problem. Ofcom cannot enforce the OSA in foreign countries, so to avoid losing face ("Oh, look, none of those popular global websites have implemented Highly Effective Age Assurance[*], the OSA is pointless") they allowed geoblocking as a quick-and-easy solution for foreign websites who could be bothered to do something, but didn't want to apply HEAA to all their visitors. So we end up with a flimsy "UK firewall" that is implemented by only a handful of foreign websites, is full of holes, and easy to avoid with a VPN or TOR browser.

There are certainly problems that need to be solved, but the OSA cannot be a solution to any of them.

[*] A whole new can-of-worms, with real privacy dangers.

Lame Chatter....Misdirection....Act Not Fit For Purpose?

Anonymous Coward

Quote One: "Baroness Kidron.......'....the act is wrong in certain places...' Kidron said...."

Quote Two: "....I'm very sympathetic to Ofcom ... where they feel they need a power and it has not been provided by Parliament."

Hah...."wrong in certain places"

Hah...."power...not...provided by Parliament"

So, Baroness Kidron.....where were you when the Act was being drafted?

I think we deserve an explanation!!

....and not lame chatter after the fact!!

....or maybe this is all coded talk meaning the Act is not fit for purpose!!

Doctor Syntax

Will the stakeholders include those who have given personal data to the age verifiers and may have concerns about security? As the likes of Kidron push to have this extended to more sources of information, such as Wikipedia*, it's likely to become an issue for all of us.

* If the kids are using it to research their English history homework they're likely to come across a lot of distressing material.

Anonymous Coward

Maybe the plan is to whitewash English history like Republicans do to American history.

Helcat

Okay, there's often a reason why enforcement is a threat rather than an action. Just remember the extreme p**n law from some years ago.

One case got taken to court. First one. Enforcement in action! A cartoon tiger with a Middle Eastern woman engaging in some adult interactions. Clearly a breach of the law! Slam dunk case! Until, in court, the defendant asked for the court to turn the sound up... and the court heard the tiger say 'It beats selling cornflakes'.

So not a realistic depiction of man and beast. Case dismissed and the law lost a hell of a lot of its bite.

That's why the OSA isn't being enforced: It would get to go before a Judge, and when that happens, the legality of the law gets tested and it's that point when the courts can limit or even overturn the law as unworkable, unenforceable or excessive. But threaten to enforce it: Get people used to it: Keep it there in the background... that's how the law establishes itself before it can be challenged and so is more likely to survive that first encounter with a Judge.

So perhaps there's a silver lining from all these people screaming it's not doing what they'd hoped it would: More chance a Pss poor case gets to court and it all gets thrown out.

Amusing.

Tron

The activists want to close off, block and shut down even more of the net. They believe they have the backing of the public. When the election comes around, they will find out that they do not, because neither the party that cooked up this censorship nor the party that passed it will stand a snowball's chance in hell of winning.

You censored my net and you want my vote. Nah. Not happening.

o5ky

How do you regulate websites hosted in other countries using TLDs from other countries too? Except for blocking them and them creating new sites just like TPB did a few years back...

Anonymous Coward

The people who are responsible for this have a view of UK law internationally that somewhat resembles what Trump thinks about American law.

Though Trump has clout by bullying the payment providers.. I don't know if the UK thinks it could follow that avenue?
