RadioandMusic | 04 Nov 2024
TikTok: Under-16s will no longer have access to 'private messaging'

MUMBAI: New rules on the hugely popular TikTok app mean under-16s will no longer be allowed to send or receive direct messages.

It is the first time a major social-media platform has blocked private messaging by teenagers on a global scale. A survey by UK regulator Ofcom suggested TikTok was used by 13% of 12- to 15-year-olds last year. Critics say the new rules will not stop children lying about their age online.

Until now, all users have been able to send direct messages to one another, provided both accounts follow each other. The change means those under the age of 16 will no longer be able to communicate privately on the platform under any circumstances. They will still be able to post publicly in the comments sections of videos. TikTok says those affected will receive an in-app notification soon and will lose access to direct messages on 30 April.

In 2018, Facebook introduced rules to make WhatsApp available to over-16s only across the EU, to comply with the bloc's General Data Protection Regulation. "The interesting thing here is that TikTok's biggest group of users are teenagers. This restriction will impact a large number of their core demographic. Also, blocking use of a core feature such as messaging between its biggest sub-set of users is a bold move," said social-media consultant Matt Navarra.


But Mr Navarra added: "Depending on how cynical you are, you could view this as TikTok following the same strategy as Facebook and others, whereby they launch new 'digital wellbeing' or safety features in advance of any potential regulatory hearings or investigations. It gives these platforms something to fight back with. It's possible TikTok has observed some concerning incidents or activity on the platform and is now trying to get ahead of the issue with this new restriction."

NSPCC child safety online policy head Andy Burrows said: "This is a bold move by TikTok as we know that groomers use direct messaging to cast the net widely and contact large numbers of children. Offenders are taking advantage of the current climate to target children spending more time online. But this shows proactive steps can be taken to make sites safer and frustrate groomers from being able to exploit unsafe design choices."

"It's time tech firms did more to identify which of their users are children and make sure they are given the safest accounts by default."

"It's good that TikTok are showing an awareness of these issues but without having any meaningful way of checking children's ages it's a lot less than it appears to be” said British Children's Charities' Coalition on Internet Safety secretary John Carr.

He said research "when Facebook was the dominant app amongst children" had suggested that in some countries about 80% of children above the age of eight had a Facebook account, with the proportion at about two-thirds in the UK.

"No-one's done it specifically for TikTok but all the evidence that we have shown there are gigantic numbers of under-age children on the site," he said.

"We all know children tell fibs. If all the older cool kids are on, that's where you want to be. It's potentially dangerous because parents might allow children to go on an app believing that age means something, and it doesn't, because they never check" He adds.