Dealing with inappropriate content

Ensuring that the content children and vulnerable people access online is appropriate is a key concern for many. There have been reports of ‘challenges’, such as the Momo Challenge or Blue Whale Challenge, in which content targets children or young adults and persuades them to complete a series of tasks, often leading to self-harm. However, it is important to note that there is very little evidence behind these reports, or evidence that these challenges actually exist.

This guide covers both this type of challenge content and situations where you are concerned that adult or otherwise inappropriate content is being placed online to directly target young or vulnerable users.

Online Challenges

A recent phenomenon on social media, often fanned by online reporting, is the concept of online ‘challenges’ targeting children and vulnerable individuals. In these challenges - such as the Momo Challenge or Blue Whale Challenge - individuals are reportedly persuaded to complete a number of tasks, often leading to self-harm or suicide. It has been reported that the perpetrators either connect directly with individuals on messaging platforms or hide video content in popular online games and TV shows.

There is very little evidence that these challenges actually exist or that they have been linked to any actual incidents of harm. If anything, it is thought that the reporting and popularisation of these challenges has caused more harm than the challenges themselves. However, there is still debate, so it is best to remain vigilant.

If you are concerned about these challenges consider the following steps:

  1. Do your research - if you hear about an online challenge like this, make sure you understand the facts before taking action. Don’t trust social media posts and headlines alone; instead, do some fact checking to see if the concerns are legitimate. Good sites for fact-checking news stories include thatsnonsense.com and snopes.com.

  2. Talk to the individual - if the individual is currently unaware of the challenges, but you want to make sure they know what to do if they do come across that type of content, then it is best to talk more generally and not name the challenges. There is a good chance you will pique their interest and they will go searching. Instead talk about being safe online and when they should come and talk to you about something that makes them uncomfortable.

  3. Report any serious risk of harm - if you believe that the individual is receiving messages from someone, or has seen content online and acted on it, then you should report it to the police. If the individual is a child then you can report directly to the Child Exploitation and Online Protection command (CEOP).

  4. Safeguard the individual online - making sure you do what you can to protect young and vulnerable people online is critical. This is a mix of education (talking to them), technology (ensuring secure settings and parental controls) and monitoring (checking what content they are engaging with). You can see more about this topic from the NSPCC and Get Safe Online.

General Inappropriate Content

If you have seen inappropriate content online that you believe should be removed - for example, adult content easily reached by children, hate speech or offensive content - consider the following steps.

  1. Report it on the platform - all of the major platforms used for sharing content have the option to report a piece of content. For example on Facebook you click on the three horizontal dots in the top right hand corner of any post and select ‘Give feedback on this post’. It will then allow you to report the post as inappropriate and describe why. This will alert the platform to the content and allow them to review it against their own content standards.

  2. Report hate speech to the police - Hate crimes are any crimes that are targeted at a person because of hostility or prejudice towards that person’s disability, race or ethnicity, religion or belief, sexual orientation or transgender identity. This can be committed against a person or property. If you are targeted or see hate speech online then you can report it via the True Vision website.

  3. Support the individual - if an individual has been impacted by seeing offensive content then make sure that they have the support they need. This might range from listening to their concerns through to putting them in touch with a specialist charity who can help.

Need more help?

If you would like to speak to a volunteer cyber security expert then you can use the chatbot window on this page - or found here - to be put in touch with our team. The chatbot will ask for your first name, email address and phone number and then one of our volunteers will get in touch to set up a chat.

Donate

To help people like you, we rely 100% on donations.

Without donations we cannot keep our service free and provide help to the most vulnerable victims of cyber crime when they need it most. As a not-for-profit organisation, 100% of your donation goes towards keeping The Cyber Helpline up and running - so 100% goes towards helping people like you. Donate now and help us support victims of cyber crime. 