TikTok challenge shocks users with hidden pornographic and violent videos

TikTok said it is banning the ‘dontsearchup’ hashtag and monitoring users’ profile pictures

Adam Smith
Wednesday 21 April 2021 18:08 BST

TikTok is banning users who are posting pornographic and gory videos in their profile pictures as part of a viral trend.

The “Don’t search this up” trend, first reported by the BBC, shows users sharing innocuous videos in which they tell viewers not to look up a specific TikTok user.

The profile picture of that user is usually explicit content; videos seen by The Independent include masturbation videos and jump scares. Other content reportedly includes an Islamic State video.

Unlike other social media sites, TikTok allows short video clips, rather than only a still image, to be posted as profile pictures.

The videos had reportedly accumulated over 50 million views before the short-form video site took action.

"Protecting our community from potential harm is our most important work. Our Community Guidelines extend to all content on our platform, and we work vigilantly to detect and remove content that violates our policies, including making reports to the National Center for Missing & Exploited Children and other relevant authorities when appropriate”, TikTok said in a statement.

“We have permanently banned accounts that attempted to circumvent our rules via their profile photo and we have disabled hashtags including #dontsearchup. Our safety team is continuing their analysis and we will continue to take all necessary steps to keep our community safe."

The billion-dollar company is reviewing hashtags related to “dontsearchup”; once a hashtag is banned, it can no longer be created or searched for.

The Independent attempted to contact users participating in the “dontsearchup” challenge, but many accounts were protected by TikTok’s privacy restrictions, which only allow friends – users who follow each other – to communicate.

This is not the first instance where TikTok has failed to properly moderate its platform.

A 12-year-old boy died after taking part in the “blackout challenge” on TikTok, which calls for people to choke themselves until they become unconscious.

TikTok has also had difficulty moderating antisemitic content in the past: a report from The Independent showed that conspiracy theories about George Soros and the Rothschilds received millions of views despite the company’s restrictions.
