To counter inauthentic, misleading, or false content on its platform, TikTok removes misinformation as it is identified. The company works with fact-checkers at Agence France-Presse (AFP) and Lead Stories to assess the accuracy of content. If fact checks confirm content to be false, the video is removed from the platform.
However, fact checks are sometimes inconclusive, or content cannot be confirmed. In these cases, a video may become ineligible for recommendation into anyone’s For You feed, limiting the spread of potentially misleading information.
To strengthen its fight against misinformation, the video-sharing app will flag unverified content to users before they share it. The feature was initially rolled out in the US and Canada, then globally over the past few months, and is now available to all users in the Philippines.
Here’s how it works: First, a viewer will see a banner on a video if the content has been reviewed but cannot be conclusively validated.
The video’s creator will also be notified that their video was flagged as unsubstantiated content.
If a viewer attempts to share a flagged video, they’ll see a prompt reminding them that the video contains unverified content. This extra step builds in a pause, giving people a moment to consider their next move before choosing to “cancel” or “share anyway.”