TikTok Rolls Out Software to Automatically Remove Porn Videos

TikTok announced a new update to its platform: software that will automatically detect indecent content, anything not safe for minors, and anything illegal. When the system identifies disallowed content, it automatically removes and flags the content in question. Creators can appeal a removal if they believe a mistake has been made, and the content will then be reviewed by human moderators. How is this change affecting the app, and why did TikTok do it? Read on to learn more.

What is the problem?

The app currently has more than 500 million monthly active users and roughly 400 million daily active users. Given its popularity, the platform has become popular for porn too: amateur performers and professional porn stars alike have found success uploading explicit videos to the app. This content often reaches minors, and before this update TikTok was not doing a good job of preventing that.

Once the software is in place, users should find it far more difficult to upload anything that could be considered indecent, pornographic, or illegal. How is the platform preventing such content? TikTok's platform is heavily automated: everything from the images and videos that appear on users' timelines to the content surfaced in comments is selected algorithmically to keep users engaged and creating new content.

How does the new system work?

It all begins with an image or video. TikTok uses image recognition technology to scan each uploaded picture or video, and if the system believes it contains nudity, the upload is removed and flagged for review. This is a huge change for the app.
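TikTok has not published how its detection pipeline actually works, so the following is only a minimal sketch of the kind of per-frame screening described above. The classifier, the `FLAG_THRESHOLD` value, and the function names are all hypothetical, not TikTok's real API.

```python
from dataclasses import dataclass
from typing import Any, Callable, Iterable

FLAG_THRESHOLD = 0.8  # assumed cutoff, not an official value


@dataclass
class ScreeningResult:
    flagged: bool       # should the upload be pulled and queued for review?
    worst_score: float  # highest nudity probability seen across frames


def screen_upload(
    frames: Iterable[Any],
    nudity_score: Callable[[Any], float],
) -> ScreeningResult:
    """Score sampled frames; flag the upload if any frame crosses the cutoff."""
    worst = 0.0
    for frame in frames:
        score = nudity_score(frame)  # probability that this frame shows nudity
        worst = max(worst, score)
        if score >= FLAG_THRESHOLD:
            # Remove from public view immediately and queue for human review.
            return ScreeningResult(flagged=True, worst_score=score)
    return ScreeningResult(flagged=False, worst_score=worst)
```

The key idea is simply that a single high-confidence frame is enough to pull the whole upload and send it to review.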

Many creators have voiced their opposition to this change, and because of it some have been deleting individual videos or pulling their content altogether and moving elsewhere. A new app, Tik Tok Porn, has emerged to fill that gap, replicating some of TikTok's most popular features such as vertical videos, filters, and effects.

Are there other downsides to the update?

Aside from the fact that the process is fully automated, the update is largely a good thing. It will help remove inappropriate content, and it could also lead to quicker responses to law enforcement requests that creators receive.

There are, of course, other downsides to this update. It is hard to imagine that the people receiving law enforcement requests are happy about it, and for content creators, it could limit the number of videos posted to the platform.

How much of an impact will this have? On the whole it is a fairly good change. It will probably result in a higher rate of content being held for human review, since the company will likely want to err on the side of caution and keep a video hidden until it has had the opportunity to review it.
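To make that "err on the side of caution" idea concrete, here is a hedged sketch of how a classifier's confidence score might be routed. The two thresholds and the decision labels are invented for illustration and are not official TikTok values.

```python
def route_upload(score: float) -> str:
    """Map a classifier confidence score to a moderation decision."""
    AUTO_REMOVE = 0.95       # assumed: near-certain violations are pulled outright
    HOLD_FOR_REVIEW = 0.60   # assumed: borderline cases wait for a human

    if score >= AUTO_REMOVE:
        return "remove_and_flag"
    if score >= HOLD_FOR_REVIEW:
        return "hide_pending_human_review"
    return "publish"


# Example: a borderline upload stays hidden until a moderator looks at it.
assert route_upload(0.7) == "hide_pending_human_review"
```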

The censorship problem

Since its launch, TikTok has become hugely popular. The app is still relatively new, yet it is growing at a remarkable rate: it is now the second most popular social app in the world, the most popular app in China, and it recently passed a billion users. TikTok is popular with both teenagers and adults.

People of all ages use the platform to share their ideas, experiences, and daily lives, and to engage with one another. But the app has its downsides. Its rules already prohibit obscene language and content that makes people uncomfortable, yet that kind of material keeps slipping through.

This has been a major issue for the app. Parents have been concerned about their children being exposed to inappropriate content; the categories most often mentioned include gun violence, suicide, sexual content, and crime.

How will this affect creators?

When content is flagged, it is removed from public view and added to a list the creator can see. This keeps flagged content organized and visible to creators. When a creator sees that one of their posts has been taken down, they will now see a "submit a report" button at the top of the video.

After clicking that button, the creator can begin a review process. If the appeal is accepted, human reviewers re-examine the content and decide whether the removal was a mistake; if they confirm the content should stay down, the appeal is rejected and the removal stands.

How will this affect TikTok users? When content gets flagged, users will either see a warning notice on it or will not be able to view it at all.
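The flag-and-appeal flow described above can be summarized as a small state machine. This is only an illustrative sketch under the assumptions in this article; the state names and functions are hypothetical, not TikTok's internal implementation.

```python
from enum import Enum, auto


class PostState(Enum):
    PUBLIC = auto()
    FLAGGED = auto()        # removed from public view, listed for the creator
    UNDER_APPEAL = auto()   # creator pressed the "submit a report" button
    RESTORED = auto()       # reviewers accepted the appeal
    REMOVED = auto()        # reviewers rejected the appeal


def submit_appeal(state: PostState) -> PostState:
    """Creators can only appeal content that has actually been flagged."""
    if state is not PostState.FLAGGED:
        raise ValueError("only flagged posts can be appealed")
    return PostState.UNDER_APPEAL


def resolve_appeal(state: PostState, reviewer_accepts: bool) -> PostState:
    """Human reviewers either restore the post or confirm its removal."""
    if state is not PostState.UNDER_APPEAL:
        raise ValueError("no appeal is in progress")
    return PostState.RESTORED if reviewer_accepts else PostState.REMOVED
```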

Our Final Thoughts On The Update

When the developers first began work on TikTok, they wanted the app to be used by high school and college students, the demographic that uses apps like Snapchat and Instagram to share content with friends. They did not, however, anticipate some people using the app for less wholesome purposes.

Teens would record videos of their friends dancing and singing and then share them to TikTok, but some of the content featured violence, porn, and other inappropriate material. After finding out what was going on, the company promised to clean up the app by adding more systems and moderators to monitor uploads. As part of this update, TikTok has added a new function that detects and removes all kinds of harmful content.