How to Report Online Harm: A Parent's Guide to Protecting Children
When children encounter harmful content or behaviour online, knowing how to report it effectively can protect your child and others. This guide covers how to report to platforms, authorities, and specialist organisations.
Why Reporting Matters
Every report of online harm serves two purposes: protecting the child involved in the immediate incident, and contributing to the broader effort to remove harmful content and hold perpetrators accountable. Platforms that receive reports can remove content, suspend accounts, and in serious cases share information with law enforcement. Specialist organisations receive reports that inform national and international efforts to combat child sexual exploitation online.
Yet many parents either do not know how to report online harm, are unsure whether a situation is serious enough to report, or feel that reporting is unlikely to result in action. This guide addresses all of these concerns and provides practical steps for every type of online harm you might encounter.
What Counts as Reportable Online Harm?
A broad range of online incidents involving children can and should be reported. These include:
- Child sexual abuse material (CSAM) or grooming behaviour
- Online predators making contact with a child
- Cyberbullying, including harassment, threatening messages, or sustained social exclusion
- Sextortion or image-based abuse (sharing, or threatening to share, intimate images without consent)
- Online fraud or scams targeting children
- Radicalisation or extremist content targeting young people
- Self-harm or suicide promotion material, particularly in groups or communities
- Hate speech or discriminatory content targeting a child
- Account hacking or impersonation
You do not need to be certain that a crime has been committed to report a concern. It is always better to report and have professionals assess the situation than to fail to act when a child is at risk.
Reporting to Social Media Platforms and Apps
Every major platform has reporting mechanisms built into the app or website. The process varies by platform, but typically involves:
- Navigating to the post, message, account, or content you want to report
- Tapping or clicking a menu icon (usually three dots or a flag icon)
- Selecting the relevant category: harassment, sexual content, exploitation, spam, etc.
- Adding any additional context if prompted
- Submitting the report
Before reporting, take screenshots of the content, including the username, the date, and the full message or post. This evidence may be needed if you subsequently report to the police or a specialist agency, and content can be removed or accounts deleted before you have time to gather it.
Platform-Specific Guidance
Most platforms have dedicated processes for the most serious types of harm:
- Instagram, Facebook, WhatsApp (Meta): The Meta Transparency Center provides detailed reporting guidance. For child safety concerns, Meta has a dedicated child safety reporting pathway accessible through the standard reporting flow.
- TikTok: Reports can be submitted via the in-app report function on any post or account. TikTok has a dedicated Safety Centre with specific guidance on reporting CSAM and grooming.
- Snapchat: Reports can be submitted in-app. Snapchat's Trust and Safety team can be contacted directly for urgent child safety concerns.
- YouTube: Flag any content using the three-dot menu. YouTube has a dedicated CSAM reporting pathway through the standard flag menu.
- Discord: Use the in-app report function or email Discord's Trust and Safety team directly for serious safety concerns.
- Gaming platforms: Major gaming platforms including PlayStation Network, Xbox Live, and Steam all have in-game and online reporting mechanisms for player behaviour and content.
Major platforms are legally required in many jurisdictions to report child sexual abuse material to national authorities when it is discovered on their platforms. Your report accelerates this process.
Reporting to Specialist Organisations
Internet Watch Foundation (IWF)
The IWF is a UK-based organisation that works internationally to remove child sexual abuse material from the internet. If you encounter CSAM online, report it directly to the IWF at www.iwf.org.uk. The IWF operates independently of the police and can act quickly to have content removed. It also shares reports with law enforcement.
CEOP Safety Centre
The Child Exploitation and Online Protection Command (CEOP) is part of the UK's National Crime Agency. If you are in the UK and believe a child is being groomed, sexually exploited, or is in contact with a predator online, you can submit a report to CEOP directly. CEOP also accepts reports from outside the UK involving UK-based offenders or UK-related content.
NCMEC CyberTipline
The National Center for Missing and Exploited Children (NCMEC) operates the CyberTipline in the United States. Reports can be made at www.missingkids.org and cover CSAM, online enticement of children, and related offences. NCMEC shares reports with law enforcement agencies globally through its partnership networks.
Other National Reporting Mechanisms
Most countries have a designated national body for reporting online child exploitation. These include:
- Australia: The Australian Centre to Counter Child Exploitation (ACCCE) at www.accce.gov.au
- Canada: Cybertip.ca, operated by the Canadian Centre for Child Protection
- European Union: Each EU member state has designated reporting points; a directory is maintained by Europol
- India: The National Cybercrime Reporting Portal at www.cybercrime.gov.in, operated by the Ministry of Home Affairs
- New Zealand: Netsafe (www.netsafe.org.nz) and the New Zealand Police
Reporting to the Police
Contact the police if:
- A child has been sexually assaulted or abused, including online sexual exploitation
- A child is in immediate danger
- You have information about someone who is producing or distributing CSAM
- A child is being blackmailed or threatened online
- Criminal fraud or theft has been committed against a child
When contacting police about online harm, provide:
- Screenshots and records of all relevant communications
- Usernames, account links, and any identifying information about the person involved
- A clear account of what happened, when, and how
- Any context about how the contact began
Do not delete evidence before reporting to police. Even content that seems minor may be part of a broader pattern that police are investigating.
Reporting Cyberbullying
Cyberbullying between children who attend the same school should be reported to the school as well as to the relevant platform. When the perpetrators and victim are members of the school community, schools have a responsibility to address bullying that occurs online, even if it happens outside school hours.
If cyberbullying involves threats of violence, sexual content, or material that constitutes harassment under the law, it should also be reported to the police.
Many countries have introduced specific legislation covering cyberbullying and online harassment. If you are unsure whether the behaviour you have witnessed crosses a legal threshold, a call to your local police non-emergency line or citizens advice service can help you understand your options.
Reporting Self-Harm and Suicide Content
If your child has encountered content promoting or depicting self-harm or suicide, this can be reported to the platform through the standard reporting mechanism, typically under the category of self-harm or eating disorders. Major platforms including Instagram, TikTok, and Facebook have specific policies on this type of content and dedicated review pathways.
If you believe your child or another young person is in immediate danger of harming themselves, contact emergency services. Many countries also have crisis lines for young people, and if you are concerned about another young person you encounter online, you can often report to the platform's crisis team who can attempt to make welfare contact.
After You Report
Most platform reports do not result in direct feedback to you about what action was taken. This can feel frustrating, particularly in urgent situations, but it does not mean nothing has happened. Platforms review reports and act where content or accounts violate their community standards or legal obligations.
For police reports and reports to specialist organisations, you will typically receive a case reference or acknowledgement. Follow up if you do not hear anything within a reasonable timeframe, particularly if the situation is ongoing or urgent.
Keep copies of all reports you make, including screenshots of the reporting interface, any reference numbers provided, and records of communications with platforms or authorities. This documentation is useful if you need to escalate the matter or make a formal complaint about the platform's response.
Talking to Your Child After an Incident
After you have taken action to report, your child needs support. Reassure them that:
- They did the right thing in telling you or in coming forward
- What happened was not their fault
- You have taken action to address it
- They are safe
Depending on the severity of the incident, professional support from a counsellor or therapist may be appropriate. Children who have encountered CSAM, experienced grooming, or been victims of sextortion often experience significant distress, and professional support can help them process what happened in a safe and structured way.