Government issues strategic priorities for online safety regulator Ofcom

The UK government has issued a draft of its strategic priorities for Ofcom under the Online Safety Act (OSA), which the regulator will have to consider when it starts enforcing the new rules from spring 2025.

Passed in October 2023, the OSA places obligations on digital platforms to prevent children from accessing harmful or age-inappropriate content (such as pornography or posts promoting self-harm); enforce age limits and implement age-checking measures; and conduct risk assessments of the dangers posed to children by their services. The act also obliges platforms to remove illegal content quickly or to prevent it from appearing in the first place.

While the Statement of Strategic Priorities (SSP) is set to be finalised in early 2025, the current version – published 20 November 2024 – contains five focus areas: safety by design; transparency and accountability; agile regulation; inclusivity and resilience; and innovation in online safety technologies.

Under the OSA, Ofcom will have to report back to the secretary of state on what actions it has taken against these priorities to ensure the laws are delivering safer spaces online, which will then be used to inform next steps.

“Keeping children safe online is a priority for this government. That is why today I will be the first secretary of state to exercise the power to set out my strategic priorities,” said technology secretary Peter Kyle.

“From baking safety into social media sites from the outset, to increasing platform transparency, these priorities will allow us to monitor progress, collate evidence, innovate and act where laws are coming up short. While the Online Safety Act sets the foundation of creating better experiences online, we must keep pace with technology as it evolves to create a safer internet, especially for children.”

Under its safety-by-design priority, the government said that regulated providers will have to look at all areas of their services and business models, including the role of algorithms, when considering how to protect their users online, and focus on embedding safety outcomes throughout the design and development of new features and functionalities.

“The government believes the goal should be to prevent harm from occurring in the first place, wherever possible,” it said in the SSP. “While this is clearly a material challenge, Ofcom has significant powers at its disposal – including information gathering, audit, enforcement and penalty powers – to ensure providers comply with their statutory duties to protect users online.”

The government added that it is crucial to continue building an evidence base on online harms, including the prevalence and types of harmful content, as well as how children interact with them, to effectively promote safety by design.

To foster trust and accountability, the government said it wants Ofcom to create a “culture of candour” via the regulator’s transparency reporting regime, which should help expose the greatest harms and address any systemic issues they uncover.

“The transparency reporting regime should highlight best practice and provide clear examples of what platforms can do to keep their users safe,” it said. “To ensure that this information can be meaningfully used by the public, transparency reports should be clear, easy to use and accessible.

“They should have sufficient detail and depth to ensure that they can be used by researchers to improve understanding of online harms and understand why they are occurring, and by law enforcement to keep the public safe.”

Any transparency and accountability efforts should include creating clear and consistently applied terms of service; making platforms directly accountable to their users; and providing coroners with access to data via Ofcom following the death of a child, so they can understand how online activity may have contributed.

To ensure regulatory agility, Ofcom will be required to monitor, assess and mitigate the negative impacts of new technologies – particularly artificial intelligence (AI) – that can be used to proliferate online harms.

“Ofcom should identify and mitigate risks to users emerging from the sharing of AI-generated content on regulated services, and the deployment of AI by regulated services on their platforms, such as AI-driven content recommendation systems and embedded AI tools for users,” it said.

The SSP further dictates that Ofcom must encourage innovation in safety technology: “Under the act, Ofcom can recommend the adoption of technologies to services in its codes of practice and guidance to outline a clear path for industry to take to comply with the duties in the act.

“The government will be a partner in helping Ofcom understand the effectiveness of different technologies and approaches and encourages Ofcom to be ambitious in its recommendations and ensure they maintain pace with technology as it develops.”

The government specifically highlighted the need for Ofcom to support the development of more effective age-assurance technologies. It added that these technologies would need to preserve a high standard of privacy while ensuring that children are protected online.  

Each of the strategic priorities also outlines the need for further research in a range of areas to help improve the evidence base for online safety action.

The government has therefore announced the launch of new research to explore the effects of smartphone and social media use on children, which is intended to improve policymakers’ understanding of the relationship between children’s wellbeing and smartphone use, as well as help direct future government action.

“We’re firing the starting gun on research which will help build the evidence base we need to keep children safe online,” said Kyle. “I am committed to using all the tools at our disposal from monitoring the impact of new laws, creating more and better evidence, and working with online safety campaigners and charities to achieve this goal.”

In April 2024, Ofcom published its draft online child safety rules for technology firms, which called on them to use “robust” age-checking and content moderation systems to keep harmful material away from children online.

The draft codes include measures to ensure tech firms’ compliance, including by having a named senior person accountable for compliance with the children’s safety duties; an annual senior-body review of all risk management activities relating to children’s safety; and an employee code of conduct that sets standards for employees around protecting children.
