The Online Safety Act recently received royal assent. It is likely to be seen as an important milestone in regulating the online environment for both adults and children and in preventing the dissemination of malicious or harmful content online.
The act seeks to place greater responsibility on certain platforms to prevent and remove illegal content hosted on their services. This marks a departure from the previous position, under which platform providers were treated as intermediaries only.
In practical terms, it will place an onus on platforms to implement controls to verify their users.
The government press release states as follows:
“The act places legal responsibility on tech companies to prevent and rapidly remove illegal content, like terrorism and revenge pornography. They will also have to stop children seeing material that is harmful to them such as bullying, content promoting self-harm and eating disorders, and pornography.”
The act will be enforced by Ofcom. Failure to comply with its rules can result in significant fines of up to £18m or 10% of global annual revenue, whichever is higher. For the largest technology companies, fines could therefore run to billions of pounds.
The Online Safety Act will be brought into force in phases, but most provisions will be enforceable within two months of royal assent.
For more information, please see the government’s press release here.