Social Media Platforms are Stepping up Their Community Protection

Yes, social media is fun.

Communicating with friends and families, connecting with like-minded individuals, and keeping up with crazy trends. What’s not to like?

Behind what seems to be all fun and games, serious issues such as abuse, misinformation, and bullying happen on a daily basis, especially in the comments section!

Of course, this is no secret; most social media users are well aware of the precautions that come with being on public platforms.

That being said, it seems that social media platforms are just as aware of the issues their users face every day.

Let’s have a look at the actions platforms like Instagram and TikTok are taking to address these matters, and how they’re doing it.

Instagram’s New Safety Tools

Instagram just announced new tools and features to help protect its community from abuse, including:

  • Limits

Want only your closest followers to publicly comment on your posts? Need only a select few to slide into your DMs?

This feature is now available globally, allowing users to automatically hide comments and DM requests from people who don’t follow them or who only followed recently.

It’s intended to prevent abuse, especially towards creators and public figures, during spikes of unwanted interactions.

  • Warnings

Instagram is also pushing out stronger warnings to discourage harassment. The platform already warns a user who tries to post a potentially hurtful comment. If the user tries to post the comment again, a stronger warning pops up to remind them of the Community Guidelines, and Instagram may remove the comment if they proceed.

  • Hidden Words

By the end of the month, Instagram will be rolling out globally the option to filter DM requests and comments containing offensive words, phrases, and even emojis into a Hidden Folder, even if they don’t technically break the Community Guidelines, so you never have to see them.

TikTok Adds New Bulk Comment Deletion and Reporting Tools

Just like Instagram and other social platforms, TikTok is working to give its creators more control over their experience through new safety tools:

  • Bulk Deleting Comments

TikTok has added a new option that enables users to bulk delete comments on their clips, or report them for potentially violating the Community Guidelines, providing another way to control the experience in the app and avoid unwanted attention.

Accounts that post bullying or other negative comments can now be blocked in bulk too.

  • Comment Filters

When enabled, this feature lets creators decide which comments appear on their videos: comments aren’t displayed unless the video’s creator approves them using the new comment management tool.

The feature also lets users filter by specific keywords to automatically mark comments as spam or offensive, giving creators even more control over what appears on their video clips.

  • Review Tools

TikTok is also adding new comment prompts that automatically detect potentially offensive remarks and ask users to reconsider posting them, similar to the community warnings on Twitter, Instagram, and YouTube.

Overall, it’s great to see social media platforms’ ongoing efforts to implement protective measures that ensure a safer, better experience for creators and viewers alike.

Interacting through comments and conversations is one of the greatest parts of being on social media. By giving users more ways to control what they see and empowering them to prevent harassment, social media becomes a safer place (no matter how much of a stretch that may sound), especially for young creators who are vulnerable in these spaces.