Millions of accounts on the video-sharing app will allow only approved followers to comment on videos.

TikTok accounts of users aged under 16 will now be set to private in a bid to improve child safety on the app.

This means videos posted to these accounts will not be available to download and all followers will need to be approved first.

It comes a month after federal regulators ordered the social media giant to reveal how the app protects children and teenagers and how it processes their data.

TikTok’s head of privacy, Elaine Fox, said: “We want to encourage our younger users to actively engage in their online privacy journey, and by doing so early we hope to inspire them to take an active role and make informed decisions about their online privacy.”

Further updates mean that collaborative creative tools on the app, such as “Duet” and “Stitch”, can no longer be used by under-16s.

The NSPCC’s head of child safety online policy, Andy Burrows, welcomed the “bold package of measures” by TikTok.

He added: “Abusers are taking advantage of the pandemic to target children spending more time online and we urge other platforms to be similarly proactive rather than wait for regulation to come into effect.”

The social media giant has already restricted some features for young users, preventing them from sending direct messages or hosting live streams.

Parents and guardians can also set restrictions for their children through the Family Pairing feature, while TikTok for Younger Users offers pre-selected, age-appropriate videos.