Recent articles have highlighted TikTok’s problem with child sexual abuse material and how its algorithm encourages young girls to post content that borders on child pornography.

TikTok and the National Center on Sexual Exploitation

Other articles have discussed TikTok’s Community Guidelines for minors and how the video-sharing site has taken down videos promoting or glorifying sexual solicitation.

Saudi public security authorities arrest TikToker over sexually suggestive video

Saudi public security authorities have arrested Tala Safwan, an Egyptian resident and popular TikToker. Safwan was arrested on July 25 after posting a live video of her conversation with a Saudi woman. The video went viral and drew heavy backlash.

The arrest came after the Saudi Public Security Authority (PSA) released a statement on Monday. The statement said that the resident had been detained for broadcasting sexually explicit videos. The statement included a blurred clip from Safwan’s TikTok live stream, which has over 1.6 million views.

The video prompted a backlash online, with some users accusing the woman of being a lesbian; others called it immoral and a sign of deviance. The Public Security Authority said on Twitter that the woman was an Egyptian citizen but did not disclose her name, and described the video as containing “sexual content” and “insinuations.” It is unclear whether the video was genuine.

TikTok’s algorithm encourages young girls to perform acts that appear to toe the line of child pornography

A bipartisan group of state attorneys general recently launched an investigation into TikTok and called for additional safeguards on the site. The National Center on Sexual Exploitation has also issued a statement saying the company has to do more to protect children on its platform. According to a TikTok spokesperson, the company has not heard about the Department of Homeland Security investigation but appreciates the state attorneys general’s focus on child safety.

TikTok has responded by removing reported videos and limiting their distribution to make the site a safer place for teens. It has also begun making it easier to block videos that are considered explicit.

Many parents and educators are puzzled by this growing problem. While it may be hard to imagine why a young girl would perform acts that stray into child pornography, the platform’s algorithm entices teenagers to engage in dangerous behavior. One example is a recent choking challenge that was viewed by thousands of people; the video was shared across social networks, and the teenager who uploaded it lost her life.

TikTok’s Community Guidelines for minors

TikTok has made several policy changes to ensure that its users don’t post content that is harmful to minors. In particular, the app has disabled direct messaging for minors and has enabled parents to lock their children’s accounts with a PIN code. It has also released extensive Community Guidelines defining its terms and laying out the content that is forbidden.

These guidelines prohibit pornography, explicit nudity, seminudity, and content that sexually harasses minors. Content that depicts non-consensual sexual acts or fetishes is also banned.

Another example of content that’s prohibited is content that’s “gratuitously shocking.” These types of videos depict extreme violence or suffering and are removed from the website. Violent content also has no place on the site, and violating these guidelines will lead to a permanent ban.

The platform also prohibits videos that promote illegal activities. These videos are filtered by a community moderation team, which also recognizes quality content.

TikTok’s removal of videos that promote or glorify sexual solicitation

TikTok has made it clear that videos that promote or glorify sexual solicitation will be taken down from the platform. The company hasn’t revealed how it vets videos, but within two months it had already banned content that promotes or glorifies sexual violence.

Some creators have been booted from the video-sharing platform, including creators on the subscription platform OnlyFans. The creators say their videos did not explicitly violate TikTok’s policy, but their accounts were deleted because they linked to their OnlyFans pages.

TikTok has pledged to continue making its content moderation process more effective. It has announced a new system that will automatically remove videos that promote or glorify sexual solicitation. Previously, content had to be flagged by a safety team member, who would decide whether to remove it; the new system makes removal more automated and less time-consuming.