TikTok upgrades privacy options within Family Pairing feature

TikTok is aiming to make it easier for parents to keep their teens safe on its platform by enhancing its Family Pairing features with new privacy options.

Building on its existing Family Pairing options, TikTok now lets parents who have linked their account to their teen's manage discoverability and safety controls directly.

Family Pairing allows a parent to link their TikTok account to their teen's, and already offers features such as Screen Time Management, Restricted Mode, and Direct Message controls.

Now TikTok will offer a swathe of extra features including:

  • Search: directly control whether your teen can search for content, users, hashtags, or sounds.
  • Comments: decide who can comment on your teen’s videos (everyone, friends, or no one).
  • Discoverability: decide whether your teen’s account is private (your teen decides who can see their content) or public (anyone can search for and view their content).
  • Liked videos: decide who can view the videos your teen has liked.

The announcement is the latest step TikTok has taken over the past year to strengthen its teams, policies, controls and educational resources.

Australia and New Zealand general manager, global business solutions, Brett Armstrong, said TikTok is committed to helping families be informed about internet safety.

“Parenting a teen’s digital life can be daunting and many parents feel as though they’re playing catchup when it comes to the latest technology and apps their teens use,” he said.

“Working with our community and industry partners, we are committed to helping facilitate important conversations within families about internet safety.”

TikTok also offers a number of resources for parents, including its youth portal, parents’ page and educational safety videos.

The platform’s transparency report recently revealed it took down over 100 million clips in the first half of 2020, and that it received 1,800 legal requests and 10,600 copyright takedown notices in that period.

At the time, TikTok said the deleted videos made up less than 1% of all content uploaded to the platform.

A third of the deleted videos contained nudity or sexual activity, while 22.3% fell into the category of ‘minor safety’, 19.6% showed ‘illegal activities’, and 13.4% involved ‘suicide, self-harm and other dangerous acts’.
