The UK's Online Safety Act – where are we a year on?

2 December 2024 | Applicable law: England and Wales | 5 minute read

On 26 October 2023, the much-anticipated Online Safety Act ("the OSA") received Royal Assent, marking new legislation designed to identify and manage the potential risks of harm caused to the public, and in particular to children, arising from the use of the internet and social media platforms.

Many will recall that campaigning for the introduction of online safety laws intensified in November 2017, following the death of Molly Russell, a 14-year-old girl who took her own life after viewing online content related to suicide and self-harm. Her father, Ian Russell, hailed the OSA as a tool to help shield children from the dangers of social media, and his positivity was echoed by a number of children's charities, including the NSPCC, which welcomed the Act as making the internet safer for children and young people.

Prior to the introduction of the Act, the lack of digital safety legislation had become the subject of public debate following a steady increase in rates of online harm. The UK Parliament's Thirteenth Report of Session 2023–24 found that 'of internet users in the UK, 68% of child users (aged 13–17), and 62% of adult users (aged 18+), indicated in 2022 that they had experienced at least one potential online harm in the previous four weeks.' The NSPCC further highlighted that online sex crimes against children were increasing, with almost 14 times as many crimes of this type recorded in 2021–22 as ten years earlier.

What does the Online Safety Act do?

The OSA set out a number of requirements for companies to tackle illegal content, such as child sexual abuse material, content facilitating illegal immigration, the sale of illegal weapons or drugs, and the promotion or facilitation of suicide, as well as many other priority offences, making it clear to platforms that this material must be identified and removed to reduce the risks to young users.

Additional responsibilities were placed on social media companies, requiring them to enforce age limits and putting the onus on them to assess and monitor their platforms for potential risks, setting appropriate age restrictions to shield children from harmful content.

The OSA further set out the consequences where a company fails to comply with its obligations under the Act: senior managers can be held criminally liable where there has been a failure to comply with information and audit notices, or where a company commits an offence under the Act with the consent, connivance or neglect of a company officer.

The Act created a number of new criminal offences designed to protect online users and extended existing legislation in force under the Malicious Communications Act 1988 and the Communications Act 2003. These new offences include encouraging or assisting serious self-harm, cyberflashing, sending false information intended to cause non-trivial harm, threatening communications, intimate image abuse and epilepsy trolling.

What has been the response to the Act?

The OSA was welcomed by Ofcom, the communications and now online safety regulator, which was tasked with preparing codes of practice and guidelines to support service providers in fulfilling their obligations under the Act. Over the course of the last 12 months, Ofcom has consulted the public, drafted guidance and published proposals for how service providers should approach their new duties in relation to harmful online content, particularly in relation to children.

On 19 March 2024, practitioners witnessed the first conviction for cyberflashing under the Act: Nicholas Hawkes was sentenced to 66 weeks in prison for sending unsolicited images of his genitals to two victims, one of whom was a 15-year-old girl. Despite this case, conviction rates under the new legislation over the last 12 months have been surprisingly low, although it should be noted that the regulatory regime under the Act will only be fully implemented in 2026.

Despite the attention the legislation received in October last year, the wider general public appears to remain unaware of the provisions and obligations it introduced. A survey carried out by the UK Safer Internet Centre in 2024 found that only 32% of children and 42% of parents knew of the Act's existence at all. Unfortunately, the lack of continued publicity around the Act will undoubtedly mean that those affected by offences under the OSA are unlikely to be aware of their rights or to report such offences to Ofcom or the police.

Whilst the digital and political landscape continues to develop, so do criticisms that the OSA may already be out of date. We saw this in the UK following the anti-immigration riots in summer 2024, which were widely acknowledged as having been sparked by the spread of online disinformation and inflammatory social media posts, in particular posts which incorrectly claimed that the perpetrator of the murder of three young girls in Southport had arrived in the UK as an immigrant on a 'small boat'. Commentary after the riots highlighted how algorithms accelerated the spread of these claims, and many have suggested that the riots occurred as a direct consequence of the rapid circulation of social media posts of this type.

Following the riots, only a small number of individuals were arrested and charged under section 179 of the OSA, which created the new false communications offence, making it a criminal offence for a person to send a message conveying information they know to be false, intending it to cause non-trivial psychological or physical harm to a likely audience. Interestingly, the majority of individuals prosecuted for their behaviour during the riots were prosecuted under older legislation, in particular the Public Order Act 1986.

So what's next for the Online Safety Act?

The perceived failure of the OSA to deliver justice in the wake of the riots led to calls from MPs and other leaders for a speedy re-examination of the newly minted Act. Nevertheless, the government has since made clear that it will not seek to amend the Act and that its full implementation remains a priority.

From December 2024, Ofcom will publish Codes of Practice and guidance on how companies can comply with their duties, beginning with illegal content. Companies will need to be alive to their obligations under the Codes of Practice, and under the legislation more generally, and where necessary take steps to ensure that they have effective measures in place to protect users. A senior Ofcom enforcement official told GIR in an interview that the regulator is ready to drag uncooperative companies "kicking and screaming into compliance", and it is clear that enforcement will be a priority for the regulator over the next few years.

This document (and any information accessed through links in this document) is provided for information purposes only and does not constitute legal advice. Professional legal advice should be obtained before taking or refraining from any action as a result of the contents of this document.
