TikTok, the hugely popular short-video app, is under investigation in the United States for failing to moderate child sexual abuse material (CSAM). According to the Financial Times, the Department of Homeland Security (DHS) is looking into how the company handles child abuse content, while the Department of Justice (DoJ) is investigating how predators may be abusing a privacy feature on the platform. The DHS claims TikTok is the social platform of choice for predators, likely because of its huge popularity and large userbase of younger people. It is worth noting the company employs around 100,000 human moderators worldwide, although this team searches not only for CSAM but also for other sensitive and/or illegal content. The report points to the following response from TikTok:

“TikTok has zero-tolerance for child sexual abuse material. When we find any attempt to post, obtain or distribute [child sexual abuse material], we remove content, ban accounts and devices, immediately report to NCMEC, and engage with law enforcement as necessary. We are deeply committed to the safety and wellbeing of minors, which is why we build youth safety into our policies, enable privacy and safety settings by default on teen accounts, and limit features by age.”

Moderation

The privacy feature the DoJ is concerned about is the ability to use private accounts, which could provide predators with a safe way to publish and share CSAM. There is also the “Only Me” feature, which shows content only to people who log into the account with its password.

This news comes the same week TikTok started testing a dislike button for comments. The company says dislike counts will not be public; instead, the feedback will be used to improve moderation.

Tip of the day: To prevent attackers from capturing your password, Secure Sign-in asks the user to perform a physical action that activates the sign-in screen. In some cases, this is a dedicated “Windows Security” button, but the most common case in Windows is the Ctrl+Alt+Del hotkey. In our tutorial, we show you how to activate this feature.
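For readers who want a shortcut, the Secure Sign-in requirement corresponds to the `DisableCAD` value under the Winlogon registry key. A minimal .reg sketch is below; this is an illustration rather than the full tutorial, it requires administrator rights, and you should back up the registry before importing it.

```reg
Windows Registry Editor Version 5.00

; Require pressing Ctrl+Alt+Del before the Windows sign-in screen appears.
; Setting "DisableCAD" to dword:00000001 turns the requirement off again.
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon]
"DisableCAD"=dword:00000000
```

The same toggle is exposed in the GUI: run `netplwiz`, open the Advanced tab, and check “Require users to press Ctrl+Alt+Delete”.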

Investigation Opens into TikTok over Failure to Moderate Child Abuse Content