Mental Health · 8 min read · April 2026

Safe Messaging Around Self-Harm and Suicide Online: A Guide for Teenagers and Parents

The internet is full of content about self-harm and suicide, ranging from genuinely helpful to actively harmful. This guide helps teenagers and families understand safe messaging principles, how to find trustworthy support, and what to do when online content feels dangerous.

Why This Matters

The internet contains an enormous range of content related to self-harm and suicide: some created by mental health professionals following safe messaging principles, some created by well-meaning peers sharing their own experiences, and some that is actively harmful. The difference between these categories matters, because research has demonstrated that certain types of content about self-harm and suicide can increase risk in vulnerable individuals, while other types of content can provide genuine support and save lives.

This guide is for teenagers who encounter content about these topics online, and for parents who want to understand the landscape and support their children navigating it.

What Safe Messaging Means

Safe messaging guidelines are evidence-based recommendations developed by mental health researchers and organisations for how to communicate about suicide and self-harm in ways that reduce rather than increase risk. They were originally developed for journalists and media outlets, but the principles apply equally to online content.

The core principles of safe messaging around suicide include:

  • Not describing methods in detail, as method exposure increases risk in vulnerable individuals
  • Not presenting suicide as a solution, a glamorous act, or a form of revenge
  • Not focusing on the details of a specific person's suicide in ways that can lead to identification with them
  • Framing suicide as a public health issue with prevention possibilities, not an inevitable outcome
  • Including crisis resource information alongside content that discusses suicide
  • Saying "died by suicide" rather than "committed suicide", which implies criminality

For self-harm, safe messaging principles similarly include not providing detailed descriptions of methods, not presenting self-harm in ways that could be instructional or aspirational, and ensuring that content includes pathways to support.

Harmful Content Online

Several types of online content related to self-harm and suicide have been identified as potentially harmful:

Pro-self-harm communities: Online communities that present self-harm as a coping mechanism, share images of self-harm, provide method information, or normalise the behaviour. These have existed on various platforms and continue to emerge despite platform removal efforts. Research has found that participation in such communities is associated with increased self-harm frequency and severity.

Suicide method content: Detailed content about suicide methods, whether presented as information, discussion, or fictional narrative, can increase risk for vulnerable individuals through the contagion effect. This is the reason that major news organisations follow strict guidelines about method reporting.

Romanticised or glamorised portrayals: Content that portrays suicide or self-harm as romantic, powerful, or meaningful without balancing this with realistic consequences and support pathways can influence vulnerable young people's thinking.

Triggering content without warning: Content that contains graphic descriptions or imagery related to self-harm without any content warning can ambush vulnerable individuals in ways that increase distress.

Helpful Content Online

Not all online content about these topics is harmful. Genuinely helpful content includes:

  • Content from established mental health organisations that follows safe messaging guidelines
  • Peer support communities moderated to follow safe messaging principles, where people share experiences in ways that emphasise recovery and connection rather than method sharing
  • Personal stories of recovery and help-seeking that model positive pathways
  • Crisis resource information that makes it easy to access professional help
  • Psychoeducation about why self-harm happens and what helps, provided in a non-triggering way

The key distinguishing features of helpful content are: it does not describe methods; it frames these experiences as deserving of support and treatment rather than as permanent states; and it includes pathways to professional help.

What to Do When You Encounter Harmful Content

If you encounter content online that feels harmful, several responses are appropriate:

Report it to the platform. All major platforms have reporting categories for content that promotes or glorifies self-harm or suicide. Use these. Platform reporting is imperfect but does result in content removal.

Use content filtering tools. Some platforms offer settings to reduce exposure to potentially sensitive content. Instagram and TikTok both have sensitivity settings that reduce but do not eliminate exposure to potentially triggering content.

Follow accounts that promote recovery. Actively curating a more positive feed by following mental health recovery accounts, organisations, and individuals who share supportive content changes what the algorithm shows you over time.

Talk to someone. If you have encountered content that has affected you, talking to a trusted adult or a crisis line is the right response. You do not have to manage the emotional impact of difficult content alone.

If a Friend Is Sharing Concerning Content

If someone you know online is posting content that suggests they may be struggling with thoughts of self-harm or suicide:

  • Take it seriously. Research shows that people who express suicidal thoughts online are at genuine risk and should not be dismissed
  • Respond with care and without panic: "I saw what you posted and I'm worried about you. Are you okay?"
  • Encourage them to talk to someone who can help: a parent, a counsellor, or a crisis line
  • If you believe there is immediate risk, tell a trusted adult who can check on them
  • Report the content to the platform's crisis response team, which most major platforms have
  • Look after yourself too: exposure to a friend's crisis is distressing, and you also deserve support

Crisis Resources

If you or someone you know needs support right now, crisis lines are available 24 hours in most countries:

  • UK: Samaritans, 116 123 (free, any time)
  • US: 988 Suicide and Crisis Lifeline, call or text 988
  • Australia: Lifeline, 13 11 14
  • Canada: Crisis Services Canada, 1-833-456-4566
  • International: findahelpline.com lists crisis resources for most countries

If someone is in immediate danger, call emergency services immediately.

Conclusion

Online content about self-harm and suicide ranges from genuinely harmful to genuinely life-saving. Developing the ability to distinguish between these, to use platform tools to reduce exposure to harmful content, and to respond helpfully when encountering it is an important part of digital literacy for teenagers. Most importantly: if you or someone you know is struggling, professional help is available and effective. No one has to manage these experiences alone.
