The UK Online Safety Act starts to make itself felt.

The Online Safety Act (OSA) was passed in October 2023 with the aim of making the Internet safer for all of us, especially our children. Ofcom, which has been handed the job of implementing the Act, now has the unenviable task of laying down the ground rules that will put it into effect.

One of the first areas to be addressed is the set of rules intended to protect children and young people under the age of 18. These guidelines aim to clarify the new responsibilities and the technical compliance requirements that service providers must meet.

The children’s safety codes published by Ofcom on 8 April 2024 set out around 40 measures designed to help social media firms and other service providers comply with their new legal obligations.

Ofcom to be given teeth to go with new offences


The measures are currently under consultation and will come into force once approved by Parliament. If companies fail to comply with the new rules, Ofcom has the power to enforce: fines of up to £18 million, or 10% of the company’s annual global turnover, whichever is greater; criminal action against companies and/or senior managers who fail to comply with requirements or fail to follow requests from Ofcom; and business disruption measures, including preventing companies from being accessed or generating income in the UK.

Technology Secretary Michelle Donelan said the Online Safety Act “ensures the online safety of British society not only now, but for decades to come.”

https://www.computerweekly.com/news/366583794/Ofcom-publishes-draft-online-child-safety-rules-for-tech-firms

Ongoing Concerns Regarding the Online Safety Bill

Predictably, social media companies agree that “something must be done,” though they are not happy with the thought that they will have to do it and that it will affect their bottom line. Online safety is a key objective of the Online Safety Bill, which aims to create a safer life online by regulating harmful and illegal content.

Most people who see the tech companies’ profits will probably not be sympathetic.

Tech giants make enormous profits from supplying media and messaging services that create significant dangers and issues for society in general and, more specifically, for their audiences. Social media companies, in particular, must understand that internet safety and operating responsibly come with a cost, and they must pay the bill.

Big Tech firms and social media companies are concerned about the implications of tighter regulations.


Inevitably, the OSA has caused concern among tech companies, who default to claiming it is too complicated or invoke the freedom-of-expression argument to thwart any attempt to regulate their operations. Predictably, many online media companies are kicking back, lobbying, and raising objections to the measures.

Privacy campaigners at the Open Rights Group (ORG), who presumably don’t own cars or buy alcohol, claim that “the implementation of age assurance systems – including photo-ID matching, facial age estimation, and reusable digital identity services – to restrict children’s access could inadvertently curtail individuals’ freedom of expression while simultaneously exposing them to heightened cyber security risks”.
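To make the mechanism concrete, here is a minimal sketch of how an age-assurance gate combining the three methods ORG names might work. It is illustrative only: the class, field names, and the estimation buffer are assumptions invented for this example, not any real vendor’s API.

from dataclasses import dataclass
from typing import Optional

MIN_AGE = 18            # legal threshold for age-restricted content
ESTIMATION_BUFFER = 7   # facial estimation is approximate, so gate well above 18

@dataclass
class User:
    id_document_age: Optional[int] = None        # from photo-ID matching
    digital_identity_age: Optional[int] = None   # from a reusable digital identity
    estimated_facial_age: Optional[int] = None   # from an age-estimation model

def is_adult(user: User) -> bool:
    # Prefer strong verification: photo-ID matching or a reusable digital identity.
    for verified_age in (user.id_document_age, user.digital_identity_age):
        if verified_age is not None:
            return verified_age >= MIN_AGE
    # Fall back to facial age estimation, with a buffer to absorb model error.
    if user.estimated_facial_age is not None:
        return user.estimated_facial_age >= MIN_AGE + ESTIMATION_BUFFER
    # No age signal at all: fail closed and treat the user as a child.
    return False

print(is_adult(User(id_document_age=19)))       # True: verified adult
print(is_adult(User(estimated_facial_age=22)))  # False: within the error buffer
print(is_adult(User()))                         # False: fail closed

The design choice ORG objects to is visible in the sketch: the strongest paths require handing over identity data, and the fallback fails closed, which is exactly the trade-off between privacy and child protection that the rest of this piece discusses.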

We believe that enforcing age limits through these systems is crucial

Regulating these services is vital to protect children from harmful and illegal content and online abuse, aligning with the goals of the Online Safety Act.

Freedom of speech campaigners assert that this is a step too far. “Adults will be faced with a choice: limit their freedom of expression by not accessing content or expose themselves to increased security risks arising from data breaches and phishing sites,” said ORG executive director Jim Killock.

Well, Jim, that is called being an adult.

We adults face choices every day, and some of those choices involve modifying or curtailing our activities to protect our children.

In addition, if you trust these “free” online platforms to provide you with services and information, part of that deal is giving them information about yourself that lets them target you with news and advertising.

Supporters of the bill would say this highlights the importance of providing registered adult users with tools to navigate online spaces safely, emphasising the need for tech companies to conduct risk assessments to mitigate these online risks.

We need to trust social media companies, and they need to earn that trust.

So yes, you, as an adult, will need to trust these platforms with additional information to enable them to implement systems to safeguard children. As a result, the platforms will need to take greater care of your data, and if you don’t trust them to keep that information safe, you can choose not to use their services.

Some overseas providers may block access to their platforms from the UK

Another objection is that “Some overseas providers may block access to their platforms from the UK rather than comply with these stringent measures”.

These are presumably the same overseas providers that profit from access to the UK market but choose to pay their tax overseas. This sounds like an empty threat, but even if it were carried out, it would be a win-win if more responsible providers replaced them.

Tellingly, the arguments against the OSA’s provisions focus on educating children to self-censor rather than on using technology to filter out harmful content.

The Online Safety Act serves as the legislative framework guiding these discussions, emphasising the role of the online safety regulator, Ofcom, in enforcing the Act and ensuring companies comply with its provisions to combat illegal content, including child sexual abuse material and illegal drugs.

The Open Rights Group claims that, despite the OSA, “Risks to children will continue with these new rules. Regulators must shift their approach to one that empowers children to understand the risks they may face, especially where young people may look for content, whether it is meant to be available to them or not.”

This attitude from groups such as the ORG ignores the fact that, at a minimum, the regulation will reduce the risk of harm to children from content that is currently all too accessible. Each child exposed to images of self-harm or other harmful or age-inappropriate content is a victim of a system that is failing them, and any reduction equates to a real number of actual children saved from harm.

Looked at this way, the argument underscores the necessity of the safety bill’s regulation of content promoting self-harm, among other harmful online behaviours, to keep young people safe online.

The approach backed by the ORG inevitably puts the costs of dealing with the fallout from tech companies’ commercial activities back on governments and societies. Presumably, empowering children through education means making children responsible for the content they consume.

Education is important, but existing technologies and artificial intelligence can help suppliers meet their obligations under the new rules

The problems caused by the companies that dominate content on the Internet are technical, but the solution must in part be regulation.
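As a rough illustration of the technology side of the heading above, here is a deliberately simple sketch of the kind of automated screening a provider might layer in front of content shown to children. The term list and the fail-closed rule are invented placeholders; production systems use trained classifiers rather than keyword matching.

HARMFUL_TERMS = {"self-harm", "suicide method"}  # hypothetical placeholder list

def may_show(text: str, viewer_is_child: bool) -> bool:
    # Adults pass through here; age-restricted gating happens elsewhere.
    if not viewer_is_child:
        return True
    # Fail closed for children if any flagged term appears in the content.
    lowered = text.lower()
    return not any(term in lowered for term in HARMFUL_TERMS)

print(may_show("gardening tips", viewer_is_child=True))     # True
print(may_show("self-harm imagery", viewer_is_child=True))  # False: blocked

Even this crude version shows the point of the argument: the filtering burden can sit with the supplier’s systems rather than with the child.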

Organisations like banks, the media, and other industries that profit from society expect to be regulated where their services may cause harm. The Internet giants are no different, and they must expect to bear the financial burden.

Robin Tombs, CEO of biometrics firm Yoti, argued that while there is “no one silver bullet when it comes to child safety”, age-checking technology will be an essential part of protecting children from accessing harmful content online.
