Safeguarding our youth from online weapon-related harms
By Michael Railton, Weapons and Associated Violence Policy Lead, Ofcom
Ofcom is the UK’s online safety regulator. Our job is to help make the internet safer for people in the UK, particularly children.
As parents, carers, and professionals working with young people, we all share a common goal: to help children grow up safe, supported, and free from harm. In today's digital world, the risks they face are not only in physical spaces but online too. The Online Safety Act gives Ofcom new powers to tackle these harms. We're using them to protect UK users, especially young people, from illegal and harmful online content that fuels real-world harm – such as the promotion or sale of weapons, and the glorification of carrying weapons for self-defence.
Repeated exposure to violent content online, especially videos that glamorise knife use, can create a climate of fear. This fear can distort young people's view of reality and lead them to feel they need to carry weapons for self-defence. Sellers often exploit this fear by marketing illegal weapons online as solutions to safety concerns. It's a dangerous cycle, and we are determined to break it.
Under the Online Safety Act, Ofcom now regulates platforms like social media and search services to ensure they are tackling both the normalisation of weapon use, and the availability of illegal weapons online. Our work supports the Government’s ambition to halve knife crime by 2030.
Online harms lead to offline consequences
The link between online content and offline behaviour is clear. Harmful content does not stay on the screen. It can influence attitudes, decisions, and actions. That is why our approach is survivor-informed and trauma-aware. We work closely with individuals who have lived experience of weapons-related violence, including victims, survivors, and frontline professionals. Their insights shape our strategy and help us reflect the realities of harm and trauma.
Understanding illegal and legal but harmful content
One of the most important distinctions we make under the Online Safety Act is between illegal content and legal but harmful content.
Everyone must be protected from illegal content, such as the online sale of illicit weapons. Platforms are required to remove this swiftly when they become aware of it and ensure systems are in place to prevent its spread.
Young people under 18 must also be protected from content that may not break the law but can still cause harm – often referred to as “legal but harmful content”. This includes videos that glorify knife use.
Platforms are legally bound to assess the risk of weapons-related content and put proportionate measures in place to reduce risks to users. Ofcom has issued guidance to help platforms identify illegal knife-related content and understand how to respond.
If platforms fail to comply, Ofcom can take action – including issuing fines of up to 10% of a company’s qualifying worldwide revenue.
Reframing reporting as safeguarding
Reporting harmful or illegal content is a vital safeguarding step. It helps protect our children, our communities, and ourselves. It also holds platforms accountable. A single report can trigger platform action. When one user speaks up, it can help protect thousands.
We would not ignore someone selling weapons to young people in our community – so we should not ignore it online either. Every ignored illegal weapon sale could put lives at risk, including those of young people who may be harmed or face criminal consequences.
We are working to normalise reporting by co-designing messaging with trusted individuals and organisations – people who already have credibility in the communities we want to reach.
Age-appropriate online experiences matter
Another key part of our strategy is making sure young people have age-appropriate online experiences. That starts with something simple but powerful: making sure their accounts reflect their actual date of birth.
We urge parents, carers, and professionals to check that young people’s accounts are set up correctly. It is a small step that can make a big difference.
Why prevalence matters
To tackle harmful content effectively, we need to understand how widespread it is. That is why we are focused on identifying the prevalence of harmful and illegal content across in-scope services, such as social media and search platforms.
This data helps us target our efforts, hold platforms accountable, and measure progress.
A shared responsibility
Ofcom’s role is to drive systemic change. We do not manage individual user complaints about content or remove specific content, but we do expect platforms to have the right systems in place to do so.
We are pushing for a culture shift in tech firms, one where safety is built in by design, and platforms proactively assess and manage the risks of harmful and illegal content, not just react after harm has occurred.
Our greatest asset is our youth. That includes those who have experienced the challenges we aim to prevent and those currently at risk. My team and I are committed to engaging with these individuals every step of the way.
Together, we can create a safer online environment – where harmful and illegal content is challenged, not ignored. Where reporting is seen as protection, not taboo. And where every young person can explore the digital world without fear or harm.
Let us make that world a reality, together!
Learn more about our Fearless campaign: It takes a village | Crimestoppers
