TikTok introduces new safety and wellbeing features
TikTok is launching a suite of safety and wellbeing features, including content filters, a topic management system, and guided meditation exercises.
These measures come as the company faces a potential ban for users under 16 in Australia.
According to a media announcement issued by the company, safety and wellbeing is “the highest priority for TikTok”, and the new features let users tailor their feeds so that unsuitable material is less likely to be served to younger users.
The new ‘Manage Topics’ feature will let users manage the topics they are shown on the app, giving them control to increase or decrease how much content they see on a particular subject. TikTok warns the tool “won’t eliminate topics entirely, but can influence how often they’re recommended” to users.
Smarter keyword filters have also been introduced, letting users add up to 200 search terms for topics they don’t wish to see in their feeds.
In addition to these keyword triggers, artificial intelligence will be used to “capture additional videos featuring similar words, synonyms, and slang variations to prevent them from being shown to users”.
After 10pm, users will have their scrolling interrupted by prompts to engage in a new in-app guided meditation exercise.
Adults can choose to opt into this feature, but it is switched on by default for users under 18. If teenage users ignore the first reminder, a second “harder to dismiss, full-screen prompt” appears.
TikTok’s own testing found that 98% of teenagers kept the meditation feature switched on.
As the company explained: “These features are designed to reflect best practices in behavioural change theory, by providing positive nudges that can help people develop balanced long-term habits.”
TikTok recently added a feature that allows parents to see who their teens are following, who follows them, and which accounts their children have chosen to block. In addition, when an underage user reports a video for a content violation, they can choose to alert an adult at the same time.
Parents can also set ‘time away’ periods that block their children’s access to the platform during certain times of the day (e.g. school hours), set daily screen time limits, and switch a teen’s account back to the default private setting if their child has made it public without parental consent.
TikTok is also providing six Australian and New Zealand mental health organisations with AU$115,000 in free advertising on its platform, after research by youth mental health service ReachOut Australia found 73% of young Australians turn to social media for mental health support.