Protecting vulnerable users against “suicide games”

Mental health is one of the most pressing issues facing young people today. It's important that these issues are addressed early, so that people are educated on the risks and dangers associated with technology's impact on mental health.

That's why, when Smoothwall hears about a new "game" that could prey on vulnerable users, we make sure it is categorised correctly so that the end user can get the help they require.

Last year there were various news reports warning parents and teachers about the dangers of a “game” called Blue Whale. At the time, we released updates to our ‘Self Harm’ category to include Dynamic Content Analysis phrases to make sure we were on top of it. This year is no different.

More recently, another game has reared its head in the virtual space, this time called "Momo". Below is everything you need to know.

What are they?
  • Suicide "games" (Blue Whale, Momo) tend to "work" in the same way
  • The user is contacted by the "game admin" over a chat room, WhatsApp or another messaging service, and the admin sets challenges for the vulnerable user
  • If the user refuses, the admin becomes threatening, posting vulgar or distressing images and threatening the user with any publicly available information they can find (usually from a public social media profile)
  • Challenges include self-harm and dares that could put the user's life in danger, leading up to the final challenge: suicide
Where do they come from?
  • These types of “games” have been seen to come from the Middle East/Asia/South America
  • They can manifest on different platforms – Blue Whale was more popular on Facebook and the Russian social media platform VK
  • Momo is being shared via WhatsApp and other messaging services
  • Momo has also been spotted being used within Minecraft, potentially to "promote the game" or raise awareness of its scary character. Microsoft has since removed any mods containing Momo from Minecraft multiplayer servers.
What Smoothwall does
  • Smoothwall's digital monitoring solution can flag concerning activity to your safeguarding team so action can be taken to help the vulnerable person.
  • With our dynamic content web filter technology, we can detect phrases associated with these "games", ensuring such sites are categorised correctly and users are protected.
  • We keep up with trends, releasing daily product updates to make sure we stay on top of threats like this.
What you can do
  • As a teacher, parent or carer, make sure your child knows the basics of WhatsApp and other messaging services, like how to block a number.
  • Watch for strange patterns of behaviour, like dares being issued by somebody outside their social circle or someone they have never met.
  • For more tips on keeping children safe online, check out the vast range of resources produced by National Online Safety.
Final Thoughts

It's also important to note that although suicide "games" like Blue Whale and Momo appear to have been linked to real-life deaths, many of these reports are unverified. However, as a company dedicated to safeguarding the public, especially younger audiences, we must take these "games" seriously, which is why we act quickly and constantly adapt our filtering technologies to keep up with these horrific trends as they manifest online.

https://www.smoothwall.com/education