The UK’s Online Safety Act (OSA) has taken a significant step forward as the country’s internet watchdog, Ofcom, officially launched an enforcement program targeting online storage and file-sharing platforms. The move, which came into effect on Monday, aims to tackle illegal content, with a particular focus on child sexual abuse material (CSAM).
The crackdown is part of a broader effort by UK regulators to hold digital platforms accountable for the content shared on their networks, especially in spaces where offenders could exploit file-sharing systems.
According to Ofcom’s latest findings, file-sharing and cloud storage services are especially vulnerable to being used for the distribution of CSAM. Unlike social media platforms, where content is more visible and moderated, private file-sharing services provide a degree of anonymity that can be exploited by offenders.
To address these risks, Ofcom has launched a full-scale investigation into these services, examining the effectiveness of their existing safety measures. The regulator is now requiring companies operating file-storage platforms to prove they have proactive systems in place to prevent, detect, and report illegal content.
As part of the enforcement program, Ofcom has written to multiple services, though it has not named specific companies. These platforms are being placed on notice, and formal information requests will soon be issued. Companies must provide:
- Illegal harm risk assessments outlining their CSAM mitigation strategies
- Details of any existing or planned security measures
- Evidence of efforts to monitor and prevent CSAM within their networks
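The "proactive systems" Ofcom expects typically include automated matching of uploads against vetted hash lists of known illegal material (real deployments use perceptual hashes such as PhotoDNA supplied by bodies like the IWF or NCMEC). As a simplified, purely illustrative sketch of the exact-hash variant of that idea — every hash and filename here is hypothetical:

```python
import hashlib

# Hypothetical known-content list. In practice these digests come from a
# vetted hash list maintained by a child-safety body, not hard-coded values.
KNOWN_HASHES: set[str] = set()

def sha256_of(data: bytes) -> str:
    """Return the hex SHA-256 digest of an uploaded file's bytes."""
    return hashlib.sha256(data).hexdigest()

def matches_known_hash(data: bytes) -> bool:
    """Flag an upload whose exact hash appears on the known-content list."""
    return sha256_of(data) in KNOWN_HASHES

# Illustrative usage: register one known item, then screen two uploads.
KNOWN_HASHES.add(sha256_of(b"example-known-file"))
print(matches_known_hash(b"example-known-file"))  # flagged
print(matches_known_hash(b"benign-upload"))       # not flagged
```

Exact hashing only catches byte-identical copies; production systems rely on perceptual hashing precisely because it survives re-encoding and cropping, which is why regulators scrutinise the choice of matching technology.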
This marks a significant shift in the regulatory landscape, as the UK government pushes for stricter oversight of online services that handle user-generated content.
Major Penalties for Non-Compliance
One of the most pressing concerns for file-sharing companies is the severe penalties they could face if they fail to comply with the Online Safety Act. Ofcom has the authority to impose fines of up to 10% of a company’s qualifying worldwide revenue or £18 million, whichever is greater — a penalty that could run to billions of pounds for major tech firms.
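To make the scale concrete, the statutory maximum is the greater of £18 million or 10% of qualifying worldwide revenue. The turnover figures below are hypothetical:

```python
def max_osa_fine(annual_turnover_gbp: float) -> float:
    """Maximum OSA penalty: the greater of £18m or 10% of global turnover."""
    return max(18_000_000, 0.10 * annual_turnover_gbp)

# A firm with £10bn turnover faces a cap of £1bn;
# for a £50m firm the £18m floor applies instead.
print(max_osa_fine(10_000_000_000))  # 1000000000.0
print(max_osa_fine(50_000_000))      # 18000000
```

The £18 million floor is what makes the regime bite even for smaller providers whose 10% figure would otherwise be modest.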
For smaller cloud storage providers, failing to comply could mean crippling financial consequences, making it critical for businesses to immediately assess their compliance status.
The Online Safety Act is designed to ensure digital safety across all online platforms, including:
- Social media networks (Facebook, Twitter, TikTok, Instagram)
- Messaging apps (WhatsApp, Telegram, Discord)
- Cloud storage services (Google Drive, Dropbox, OneDrive, iCloud)
- Peer-to-peer file-sharing platforms
These companies must take proactive steps to detect and remove harmful content or risk significant legal and financial repercussions.
UK’s Broader Push for Online Safety
The UK government has been ramping up internet safety regulations, aiming to make the country one of the safest places in the world to use digital services. The Online Safety Act grants Ofcom extensive regulatory powers, allowing it to monitor and enforce compliance across a broad range of online platforms.
While the law has received strong support from child safety advocates, tech companies have expressed concerns over the potential impact on user privacy, encryption, and free speech. However, Ofcom maintains that the primary goal is to protect users—particularly children—from online harm.
Companies affected by the OSA must now act swiftly, ensuring they implement robust content moderation, security, and monitoring technologies to stay compliant with the new legal framework.