
Freshfields Risk & Compliance

Reposted from Freshfields Technology Quotient

The UK’s new online safety laws – what businesses need to know

This article was originally published on 30 October 2023 on 'Thomson Reuters Regulatory Intelligence' by Thomson Reuters. © Thomson Reuters.

The UK’s Online Safety Act (OSA) received royal assent on 26 October 2023. The OSA takes a systemic approach to tackling illegal content, content that is harmful to children (but not necessarily illegal) and, in respect of high-risk, high-reach services, fraudulent advertising online, by imposing duties that require regulated online services to reduce the risk of users encountering that content.

The OSA is often compared to the EU’s Digital Services Act (DSA); and whilst there are broad areas of similarity, there are also key differences. For example, the OSA is much more detailed than the DSA in its requirement for in-scope services to conduct risk assessments in respect of content that is harmful to children. The differences in online safety regulatory regimes mean that businesses with a global reach may be faced with cross-cutting compliance requirements. 

In this article we outline the key implications of the OSA, what businesses should do to prepare and likely next steps. 

Which businesses are in scope of the OSA?

The OSA generally applies to the following services with a ‘link’ to the UK:

  • user-to-user services (U2U services); and
  • search services (meaning an internet service that is, or includes, a search engine (as further defined in the OSA)). 

Key definitions in the OSA are very broad. For example:

  • U2U services include any internet service by means of which ‘content that is generated directly on the service by a user of the service, or uploaded to or shared on the service by a user of the service, may be encountered by another user, or other users, of the service.’ 
  • ‘Content’ is also broadly defined as ‘anything communicated by means of an internet service, whether publicly or privately’ (this will therefore include not only messages, oral communications, music, photos and videos but any data of any description).

‘Search service’ is also widely defined and will encompass many internet services that include a search engine enabling users to search more than one website or database. For example, the scope is likely to catch travel metasearch sites that allow users to search across a variety of websites for airlines, hotels and car rental companies.

Therefore, the OSA does not just target larger technology businesses. Based on Ofcom’s initial analysis, it seems that over 100,000 online services may be subject to the OSA.

Some exceptions apply in various circumstances, including in respect of certain U2U content consisting of emails and SMS messages.

Additional obligations relating to pornography apply to a wider group of internet services, but those are not a focus of this article.

Does the OSA have extra-territorial reach?

Yes, as noted above it applies to in-scope services with a ‘link’ to the UK. 

A U2U or search service may have a UK ‘link’ if:

  • it has a significant number of UK users;
  • UK users form one of its target markets; or 
  • the service is capable of use in the UK and there are reasonable grounds to believe there is a significant risk of material harm to individuals in the UK by user-generated content or search content on the service (as applicable). 

What are the key implications of the OSA for businesses? 

Key aspects of the OSA include the following: 

A duty of care

Many new duties will apply to in-scope services. These duties apply on a sliding scale, with some duties applying to all regulated U2U and search services, additional duties applying to U2U and search services that are likely to be accessed by children and further duties applying to the highest reach U2U and search services. 

Search engines and regulated U2U services are subject to sets of duties that share many of the same features and aims. 

For regulated U2U services the duties will include: 

  • taking various steps to manage and reduce the risks of content causing various harms to individuals or offences (such as illegal content risk assessments, and taking proportionate measures in the design and operation of the service to prevent individuals from encountering certain illegal content);
  • operating a service using systems and processes that allow users and affected persons to easily report illegal content, and content that may be harmful to children (on parts of the service that may be accessed by children);
  • handling complaints by users and affected persons;
  • having particular regard to the importance of protecting users’ rights to freedom of expression within the law and compliance with privacy law when deciding on, and implementing, safety measures and policies;
  • reporting child sexual exploitation and abuse materials (replacing a current voluntary regime); and
  • various related record keeping and review duties.

In addition, specific duties will be imposed relating to:

  • protecting child users, including obligations that impose safeguarding duties in relation to legal but harmful content accessible by children and duties to assess whether services are likely to be accessed by children and undertake risk assessments relating to children; and
  • providers that meet certain thresholds, including in connection with empowering adult users (eg duties to give users greater control over the legal but potentially harmful content they may encounter), use of proportionate systems and processes to protect freedom of expression, and production of annual transparency reports. By way of further example, the highest reach services meeting certain thresholds will be obliged to offer all adult users the option of verifying their identity and the right to block anyone who has not verified their identity. 

A duty to protect against fraudulent advertising

Certain online services with the highest reach (the thresholds for which are yet to be confirmed by secondary legislation) will be subject to duties to help prevent paid-for fraudulent advertising (with the extent of those obligations varying depending on the category of the service). For example, certain ‘Category 1’ services will be required (among other things) to implement proportionate systems and processes designed to prevent users from encountering such fraudulent advertising, minimise the time it is present on the service and swiftly take down such content when notified of it.  

In July 2023, the UK government announced it would also be developing an online fraud charter for the tech sector and is planning a wider overhaul of the laws relating to online advertising. Therefore, businesses will need to keep an eye on emerging developments from those related initiatives. 

Enforcement, fines, and offences

The OSA will be regulated by the existing communications regulator commonly known as Ofcom. Ofcom will issue codes of practice in relation to some of the duties. Certain providers will be required to pay fees to Ofcom.

Ofcom will have a wide range of investigatory powers, including rights to enter a company’s premises and access data and equipment and request interviews with employees. 

Ofcom may impose fines of up to £18m or 10% of global annual turnover (whichever is higher) for breaches of the OSA. It will also have a range of other enforcement powers, including the power to require companies to publish details of enforcement actions against them.

New criminal offences will be created, including for companies, senior executives and employees that fail to meet various obligations. The OSA will also create new offences for individuals and companies in connection with sending various harmful content.

When will the OSA apply?

The majority of the OSA’s provisions will commence within two months of royal assent, although some of Ofcom’s powers have already commenced to enable it to get set up as the online safety regulator. 

However, whilst the OSA will be in force, much is still to be finalised in secondary legislation and guidance and so for the core duties Ofcom are planning a phased implementation. The initial focus will be on the illegal content duties, with Ofcom expected to publish draft codes of practice and guidance on 9 November 2023 which will kick off a consultation period. It is not expected that regulated services will need to comply with those duties until towards the end of next year. 

Organisations that may be in scope of the OSA therefore need to undertake appropriate assessments and planning in good time before the OSA becomes operational.

How should businesses prepare?

The government anticipates that businesses will spend billions of pounds preparing for OSA compliance. The extensive obligations under the OSA will require many in-scope organisations to invest in new compliance systems and processes. 

Businesses should promptly consider their online activities and whether they are likely to fall within the scope of the OSA and, if so, the implications of the OSA for their duties and operations. Businesses may also want to consider engaging with Ofcom, whether through consultations on guidance and codes of practice or more informally.

Regulated entities will, for example, need to:

  • perform a risk assessment of illegal content;
  • assess the likelihood of their services being accessed by children and of content that may be harmful to children; and 
  • consider how they will comply with other duties under the OSA. 

In many cases businesses will also need to consider how to comply with the laws of other jurisdictions covering similar matters, such as the EU’s DSA. This may create conflict-of-laws or other additional compliance challenges. Businesses subject to both the UK’s OSA and the EU’s DSA should ensure their compliance work in relation to the two regimes is aligned.

Certain key details (such as thresholds, registers of risks, and risk assessment guidance) will not be confirmed until secondary legislation is in place. The regulator, Ofcom, will also need to develop codes of practice and guidance to help define a number of obligations, and has outlined its regulatory approach.

Importantly, the OSA provides that regulated services are to be treated as complying with a relevant duty if the provider takes or uses the measures described in a code of practice that are recommended for the purpose of compliance with the duty in question. Businesses should therefore keep an eye on Ofcom’s codes of practice as they emerge, as well as forthcoming Ofcom consultations on its approach.

Tags

platforms, regulatory, tech media and telecoms, social media, media