TikTok user Ronnie McNutt died by suicide; TikTok warns users about a suicide video that is going viral
The Ronnie McNutt suicide video that went viral
TikTok has been trying to remove a suicide video uploaded to the platform on Sunday. Management addressed the fact that a group of users has been re-uploading the video repeatedly or using clips of it in their own content, announcing that the video will continue to be taken down and that the offending accounts will be closed.
TikTok has been in the headlines for the past few days over a disturbing video. A suicide video posted to the platform on Sunday went viral within hours, and troublingly, a number of users have been actively trying to spread it. In response, TikTok management took action and issued strong warnings.
TikTok spokesperson Hilary McQuaide stated that the video was removed immediately after it was shared, but that some users have uploaded it over and over again. Noting that some users have incorporated clips of the suicide into their own content, McQuaide said such videos are being carefully reviewed and will also be removed. According to the spokesperson, users who share the video are also putting their accounts at risk.
The footage in question was not originally uploaded to TikTok. Last week, a man living in the US state of Mississippi ended his life while broadcasting on Facebook Live, and the footage subsequently made its way onto TikTok. Surprisingly, users paid far more attention to this objectionable video than expected and actively tried to make it go viral. It has to be said, frankly, that they succeeded.
Suicide footage is automatically detected and removed by TikTok's algorithms
In their statements, TikTok officials noted that footage of the suicide itself, statements praising it, and content encouraging suicide all violate the platform's rules. Stating that every piece of content uploaded to the platform is screened by algorithms, the officials say that even repeated re-uploads of this and similar videos will be removed instantly.
“Your accounts can be closed”
TikTok spokesperson Hilary McQuaide said that users who attempt to share the spreading suicide video will have their accounts closed immediately. Noting that many users have already been reporting accounts that shared the video, McQuaide thanked them and expressed gratitude for their efforts to protect the platform.
This is not the first video of its kind, and users have tended to turn to Facebook Live for such broadcasts. A 2017 BuzzFeed News investigation identified at least 45 sensitive incidents on Facebook Live in the roughly two years since its 2015 launch, including suicides, shootings, murders, torture, and child abuse. Although Facebook says its algorithms stop such content, some of it has evidently slipped through.
TikTok is struggling to stop people from re-uploading the video in which McNutt shot himself
Ronnie McNutt, 33, filmed his own death on Facebook Live while sitting at a desk at home. TikTok is struggling to remove all the clips of the incident, which took place in Mississippi on August 31 and was reposted to the video platform. McNutt struggled with his mental health and suffered from PTSD after serving in the Iraq War. He had reportedly lost his job and broken up with his girlfriend shortly before his death.