It’s no secret that parliament has been repeatedly rocked over the last 12 months by a series of government upheavals and reorganisations, leaving many high-profile bills in limbo as they were passed between shifting government ministers. Despite this, the highly anticipated, and often controversial, Online Safety Bill has managed to survive four Prime Ministers and seven departmental secretaries to make its return to parliamentary discussion. The Bill has reached the House of Lords and is due for its second reading on 1st February, where it will be open for debate by members of the Lords.
So, what is the Online Safety Bill?
The general idea of the bill is to bring the law up to date with rapidly evolving technology by making social media companies legally responsible for keeping children and young people safe when online. As a result, those companies will be expected to remove illegal content quickly or prevent it from appearing in the first place.
In early drafts, focus was placed on regulating ‘legal but harmful’ content, particularly concerning vulnerable adults and children. This came in the wake of the inquest into the tragic death of Molly Russell in 2017, where it was found that the algorithms used by apps such as Instagram and Pinterest effectively bombarded her with harmful content on suicide and self-harm. The proposed regulations would mean that platforms such as Instagram would be required to monitor and regulate potentially harmful content effectively, and a failure to do so could lead the platform to incur financial penalties.
Critics of the bill, however, have suggested that this translates to the censorship of free speech, with MPs calling it an attempt to ban “hurt feelings”. As a result, the latest draft has axed the concept of ‘legal but harmful’ in favour of a ‘triple shield of protection’ where businesses using online platforms will be legally required to remove illegal content, monitor for material that breaches their own terms of service and create options for users to have greater control over the content they engage with. There is also greater emphasis placed on ensuring that the age verification processes in place are effective, with tech companies being asked to demonstrate the methods they use to ensure that minimum age limits are met by users.
What does this mean for UK businesses?
Aspects of the Online Safety Bill could apply to the vast majority of businesses; however, the biggest impact will be felt by companies providing services that host user-generated content, meaning those that allow users to post their own content online or interact with other users. This will include everything from large social media platforms down to small companies that operate forums, online gaming or cloud storage. The introduction of this bill could, therefore, have substantial financial and legal implications for small tech businesses in the UK, as set out further below.
The bill makes online service providers directly legally responsible for the actions of their users and covers all content on sites accessible in the UK. This means that businesses providing these online services can be held directly responsible for harmful content distributed by users on their site or app. There is also an imposed duty to proactively protect children, a major political driver behind the introduction of the bill, which adds an extra layer of compliance to an increasingly long checklist.
Businesses that fail to comply with the new regulations can face consequences ranging from fines to imprisonment of senior management. Potential financial penalties imposed by Ofcom have been compared to those given under the GDPR. Furthermore, the bill suggests that senior managers who fail to take ‘all reasonable steps’ to ensure compliance could face up to two years’ imprisonment, emphasising the need for businesses to act quickly.
What should businesses be doing?
While the bill is still going through parliament and, as such, may be subject to further alterations, it is a good idea for businesses to start to consider and/or take steps to address any potential liabilities, rather than taking the risk of being non-compliant later.
Companies should begin by checking whether they will be impacted by the provisions of the bill, as not all businesses operating online will fall within the scope of the new legislation. For example, companies whose online services do not allow any user-generated content or user interaction will not be impacted by the bill. For those businesses that are affected, it is a good idea to take proactive action, such as preparing risk assessments and making the necessary changes to ensure compliance. These changes will likely involve ensuring that any standard terms and conditions of website use employed by the company are up to date and offer effective protections.
When introduced, the Bill will require companies to have rigorous governance processes to identify and manage risks, as well as efficient systems for users to report harmful content, and, consequently, it would be prudent for businesses to undertake some preparation for this.
The Commercial & IP Team at Berry Smith can assist in drafting and amending terms and conditions to ensure compliance with the Online Safety Bill, as well as providing general commercial and business advice.
Please contact us if you would like more information about the issues raised in this article or any other aspect of Commercial law at 029 2034 5511 or email@example.com