It doesn’t seem like TikTok is going to slow down anytime soon. In fact, I wouldn’t be surprised if it overtakes Instagram and even Facebook in the next few years.
The platform’s most recent positive boost came from the smashing success of Ratatouille: The TikTok Musical, a crowdsourced parody musical that went from a fun, silly video posted by an American elementary school teacher named Emily Jacobsen to a massive production involving several TikTok content creators, Broadway professionals and the all-female orchestra, Broadway Sinfonietta. It raised over USD 2 million for The Actors Fund, a charitable organization that supports performers and behind-the-scenes workers in performing arts and entertainment.
Now, there is a lot to praise about TikTok, but a lot of criticism has been leveled against it too, notably the ease with which underage users can be targeted by those who may wish to harm them, as well as how quickly information – and disinformation – spreads on the platform and beyond.
TikTok recently announced two new features in response: new privacy settings for users under 18 years old, and a new prompt banner warning users that they may potentially be sharing inaccurate information.
Let’s explore these a bit more.
New Privacy Settings
TikTok said that it will switch all accounts belonging to underage users from Public to Private, and videos from users aged 13 to 15 can no longer be downloaded or used in the Duet and Stitch features. Users who are 16 and 17 years old will still be able to duet and stitch videos, but the default audience for these features will change to Friends; this can be reverted through the Settings menu.
It’s not a permanent solution but it’s a good start for protecting vulnerable users.
New Warning Prompts
As for the new prompt banner, TikTok’s official blog details how the process will unfold, starting with a flagged video: “First, a viewer will see a banner on a video if the content has been reviewed but cannot be conclusively validated. The video’s creator will also be notified that their video was flagged as unsubstantiated content. If a viewer attempts to share the flagged video, they’ll see a prompt reminding them that the video has been flagged as unverified content. This additional step requires a pause for people to consider their next move before they choose to ‘cancel’ or ‘share anyway.’”
The feature is being rolled out in phases across the entire platform. Initial results seem promising; TikTok noted that when it tested the feature: “we saw viewers decrease the rate at which they shared videos by 24%, while likes on such unsubstantiated content also decreased by 7%.”
Misinformation has become so rife that it is increasingly hard to tell genuine facts from what’s been made up or manipulated. Hopefully, social media platforms will find solutions before history repeats itself with even more disastrous consequences.