On 25 July 2025, the child safety duties of the Online Safety Act came into force, establishing new rules to protect children under 18 from harmful online content. Research from Ofcom indicates that children as young as eight can access adult-oriented content online (GOV.UK, 2025). NSPCC Chief Executive Chris Sherwood has highlighted that children frequently encounter sexual, emotional, or otherwise harmful material online (GOV.UK, 2025).
The Act is designed to strengthen existing safeguarding measures in schools and ensure a safer online environment for students.
The legislation requires platforms to limit access to harmful content, particularly for underage users. Key measures, and the challenges they raise, include:
• Age verification systems – Some platforms use technologies such as facial age estimation, though accuracy can vary.
• Risk of circumvention – Increasingly tech-savvy students may use VPNs or other methods to bypass restrictions (Briggs, 2025; McCahon, 2025).
• Privacy considerations – Collecting sensitive data, especially from minors, raises important ethical and legal concerns (Boult, 2025).
Schools are expected to assess risks, implement safeguarding policies, and ensure transparency in content management.
Under the Act, schools continue to uphold existing safeguarding duties and now have additional obligations, including:
• Conducting risk assessments on online content.
• Maintaining transparent content management practices.
• Monitoring and reporting harmful content incidents.
Implementing these measures requires a balance between proactive protection and respecting student privacy.
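As a rough illustration of what the monitoring and reporting duty can look like in practice (not a statutory template; all field names here are hypothetical), a school's system might capture each incident as a structured record routed to the designated safeguarding lead:

```python
# Illustrative sketch only: a minimal record for logging harmful-content
# incidents as part of a school's monitoring and reporting practices.
# Field names and defaults are hypothetical, not from any official schema.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class IncidentReport:
    category: str        # e.g. "harmful content", "contact risk"
    device_id: str       # identifier of the school device involved
    summary: str         # brief, non-graphic description of what happened
    reported_to: str = "designated safeguarding lead"
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )


# Example: recording that a blocked site was repeatedly requested.
report = IncidentReport(
    category="harmful content",
    device_id="lab-pc-07",
    summary="Blocked site repeatedly requested during lesson",
)
```

Keeping records in a consistent structure like this makes the transparency and reporting expectations above easier to evidence during a review.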
While the legislation sets the framework, schools can consider tools and strategies to support compliance and safeguard students. For example, digital solutions exist that allow schools to:
• Monitor devices in real time for unsafe or inappropriate activity.
• Provide human and AI-assisted alerts to identify potential safeguarding concerns.
• Apply content filtering policies tailored to educational standards.
• Share lessons and resources efficiently across devices without compromising safety or privacy.
These types of systems can support schools in meeting the requirements of the Online Safety Act, providing a proactive approach to online safety.
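To make the filtering idea above concrete (this is a simplified sketch, not how any particular product works; the blocked-term list and function name are hypothetical, and real tools use far more sophisticated classification), a minimal keyword-based policy check might look like this:

```python
# Illustrative sketch of a keyword-based content filter of the kind a
# school device-management tool might apply. Hypothetical policy list.
BLOCKED_TERMS = {"gambling", "violence"}


def check_page(text: str, blocked_terms: set[str] = BLOCKED_TERMS) -> bool:
    """Return True if the page text is allowed under the policy."""
    # Normalise: split into words, strip basic punctuation, lowercase.
    words = {w.strip(".,!?;:").lower() for w in text.split()}
    # Allow the page only if no blocked term appears.
    return words.isdisjoint(blocked_terms)
```

In practice a school would tailor the policy per age group and pair it with human review, since simple keyword matching produces both false positives and false negatives.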
(For context, some schools choose platforms such as S4S Sentinel, which integrates device monitoring, safeguarding alerts, and web filtering; it is mentioned here only as one example of the many tools available to assist with compliance.)
S4S Sentinel is an all-in-one digital tool designed to help educators manage their classrooms and keep students safe, combining the monitoring, alerting, and filtering capabilities outlined above in a single platform.
The Online Safety Act gives schools a legal framework to reduce children's exposure to harmful online content. Compliance is more than policy: it involves practical steps to create safe, responsible digital environments. By combining clear policies, staff training, and appropriate digital tools, schools can protect students while fostering trust and engagement in the online classroom.
References
GOV.UK. (2025). What’s changing for children on social media from 25 July 2025. Retrieved from https://www.gov.uk/government/news/whats-changing-for-children-on-social-media-from-25-july-2025
GOV.UK. (2025). Keeping children safe online: Changes to the Online Safety Act explained. Retrieved from https://www.gov.uk/government/news/keeping-children-safe-online-changes-to-the-online-safety-act-explained
Government Events. (2024). Online safety in education: Protecting students in the digital age. Retrieved from https://www.governmentevents.co.uk/ge-insights/online-safety-in-education-protecting-students-in-the-digital-age/
Briggs, M. (2025). Age verification and the challenges of digital identity. [Industry commentary].
McCahon, L. (2025). The rise of VPN usage among young people: Circumventing online safety. [Industry analysis].
Boult, L. O. (2025). Privacy concerns in age verification technologies for minors. [Research article].