TikTok is a magnet for paedophiles and has been for years – it’s a scandal our vulnerable children are STILL allowed on – The Sun

I ONCE told The Sun that TikTok was a “magnet for paedophiles” and I have no doubt that thousands still lurk in its shadows, looking for a chance to snare innocent, unsuspecting victims.

It’s the nature of the beast that predators go where children go. And there are millions using what has rapidly become one of the world's most popular networks.

Not that long ago, it felt like Facebook was the must-have app for kids. In some countries, around 80% of all under 13s with an internet account also had an account on Facebook – in the UK it was around two-thirds.

Now TikTok is the latest craze, with more than 1.5billion downloads worldwide. Despite its greater power, however, it is failing to take greater responsibility.

Just this week, I heard from a distressed mum. Her primary school age child had been badly bullied in the school playground and a video of it was on TikTok.

The mum was having a hard time finding out how to get the video removed and got in touch. But why was it allowed up in the first place?

The problem is quite simple. TikTok’s minimum age is 13, but it does nothing during the sign-up process to check – meaning huge numbers of children join below the required age.

Even truthful 13-year-olds are then given the option to make their accounts public.

In their excitement to connect with their pals, do they really understand the implications of this?


TikTok has spread like digital wildfire, snapping up over 1.5 billion users since its global launch three years ago — including millions in the UK. 

On the surface, the world's fastest growing social media platform hosts short clips of users lip-syncing to songs or showing off dance moves – but there’s a far more sinister side.

It’s become a magnet for paedophiles as well as a hotbed for violent and extremist content, with TikTok predators exploiting the platform's young user base and lax security to prey on the vulnerable.

We've seen kids as young as eight being groomed on TikTok, while other creeps take advantage of young girls posting sexualised content of themselves on the platform.

And that's especially worrying on a site which is attracting millions more children every year, with 53 per cent of kids now owning a smartphone by the age of seven.

That's why we launched our TikTok Time Bomb series — to make sure parents are aware of the risks their kids are being exposed to, and what they can do to better protect them.

Everyone agrees social media can be a force for good, but it has to be used the right way and with proper controls in place.

We want TikTok to better moderate its content so that it’s not being left to kids to protect themselves online.

Invitation to paedos

If a child decides to go public they are inviting people to follow them and message them.

TikTok’s settings even allow your videos to be promoted “to users interested in accounts like yours”. That’s where the danger begins.

Paedophiles look for videos of children. They will likely be pretending to be children themselves.

They will target someone, follow them and set about trying to lure and trap the innocent. The results can be devastating.

A Sun investigation recently found that kids as young as EIGHT are being groomed – and bombarded with sickening messages.

Horrified parents told how their children had been subjected to unwanted aggressive direct messages from older men.

One dad explained how his ten-year-old son was "terrified" by a deluge of texts from a predator telling him to "stop f***ing ignoring me".

Since early last year TikTok has undoubtedly made huge changes to their systems in order to make kids aware of some of the dangers that could confront them while using their app.

The company is also using many more technical tools to spot dodgy messages – for example, messages which suggest someone is interested in self-harm or in hooking up with a stranger.

But unless it can prove otherwise, and so far it hasn’t, TikTok and the rest of us have to assume a substantial number of under-13s are present and using the app.

TikTok must know this has been true for years, but that knowledge does not seem to have prompted any action.

Take control of TikTok – change these settings now

Parents should do the following immediately…

Go private:

  • Head into Settings > Privacy and Safety and look for the Discoverability heading at the top.
  • Under that you'll see a setting called Private Account. Toggle this on.
  • TikTok also recommends your page to lots of other users to improve video circulation.
  • Switch this suggestion setting off and the account will no longer be recommended to other users.

Shut out weirdos:

  • In Privacy and Safety > Safety, you can prevent other users from interacting with you.
  • Most of the settings are on Everyone by default, but can be changed to Friends or Off.
  • You can prevent interactions on comments, Duets, Reacts, users seeing which videos you've liked, and also messages.

Restricted Mode ON:

  • Restricted Mode tries to limit age-inappropriate content from appearing for children.
  • It's not perfect, and works through using computer-scanning systems – so some dodgy content will inevitably be missed.
  • It's also possible to set a passcode to prevent your child from changing this setting later on.
  • You'll find this in Settings > Digital Wellbeing > Screen Time Management.

Culture of cruelty

Even if every single user was 13, I would still have grave reservations about the app's suitability for them.

Ok, I'll admit I'm not exactly the demographic TikTok is aiming for. And I get that the culture has moved on.

But young, impressionable minds model themselves on older kids’ behaviour.

Do we really want 13-year-old girls regularly immersing themselves in an environment where being sexy is valued above everything else?

Where boys are encouraged to think being macho around girls is the best way to capture their attention?

Where being cruel to animals or other people is just treated as hilarious, everyday behaviour? I don’t think so.

There is a debate to be had about whether TikTok is simply reflecting contemporary culture or actively helping to shape it.

I think it is undoubtedly a little of both, but it’s the latter that bothers me most.

The internet has a unique ability to swiftly establish new norms and crazes. It magnifies and encourages extreme behaviour.

It helps make it a norm – then people say “whatever, it’s what the kids are doing, get over it”. Well, I won’t.


Time to clean up

TikTok is not a place for children. A great deal of what I saw on TikTok last time I looked should only be visible to adults, or at the very least youngsters who are over 16 and can prove they are.

Either way, until it introduces a truly effective way of keeping youngsters off its service – particularly sub-13 year olds – it’s going to be in constant trouble. And not just from the likes of me.

The Government recently announced its intention to create a new legally enforceable “Duty of Care” backed by stiff penalties, maybe even including criminal penalties.

Not before time. Internet businesses such as TikTok were given an opportunity to clean up their act and make themselves acceptable to UK norms.

They did too little – and they did it way too late.

John Carr is a former Government adviser on online child safety.

