Parental controls for digital platforms can protect kids from social media’s dark side

Parents worry about toxic social media and about their children accessing inappropriate content, but they are not powerless to limit access

Teenagers with smartphones: there are new EU online safety and content rules that social media companies must follow. Photograph: iStock

Social media has an image problem. What was once a fun way to share a snippet of your life has turned darker as misinformation spreads, abusive content is shared and study after study highlights the impact of social media on our mental health.

As if that were not bad enough, along came AI chatbot Grok with its ability to digitally “undress” photos of real women and put them in bikinis without their consent. The influx of images shared on the X platform caused an outcry and led to the social media company disabling the feature – eventually.

It hasn’t done social media’s cause any favours. Already, late last year, Australia banned access to social media for under-16s; others may soon follow. The UK is currently running a consultation on children’s social ⁠media use, which could include a ban in the future.

In the meantime, there are new online safety and content rules that companies must follow, with EU regulations requiring platforms to implement age checks to prevent children from accessing harmful content and have certain privacy features enabled by default for younger users.

For parents, that means there are tools at their fingertips to help keep their children safe. The key thing is knowing what they are, where to find them and how to implement them.

A good starting point, regardless of platform, is to use an accurate date of birth when setting up the account – an incorrect year of birth could mean giving your teenager access to content that is inappropriate for their age group.

So what do the various platforms offer?

YouTube

YouTube has had some sort of parental controls in place for at least a decade, and already offers parents the ability to supervise their child’s accounts. But recently, it implemented new measures aimed at teen users to give parents more control over their child’s viewing habits.

That includes limiting the time they spend scrolling content on YouTube Shorts, particularly at certain times of the day – homework, studying, bedtime – or allowing additional time for long car journeys and other occasions when it suits.

On the way is an option to set the time to zero.

“This is an industry-first feature that puts parents firmly in control of the amount of short-form content their kids watch,” YouTube said.

Parents can also set bedtime and break reminders for teen users so they don’t fall into a digital rabbit hole for hours on end.

James Tomlinson, 9, plays a game on a phone in Melbourne, Australia. Photograph: Matthew Abbott /The New York Times

The platform has also simplified account set-up, which will be welcome news to parents who are already struggling with the various options on each platform.

And it has introduced a new blueprint for content recommended to teens, emphasising five pillars – joy, fun, entertainment, curiosity and inspiration – and making that content more accessible.

The idea is to steer children to more age-appropriate content, protecting them in the digital world rather than from the digital world.

“Across Europe there is a really important debate that is happening about the best way to do that,” said Dr Garth Graham, global head of YouTube Health. “We’re going to be able to raise the content that meets the standard in search results and across YouTube, making it easier for the teens to find it.”

That joins existing controls such as supervision, which gives parents the ability to see what channels their child is subscribed to, what videos they upload and what comments they make online, as well as to manage screen time.

And of course, there is the YouTube Kids app, which is aimed at providing a more suitable experience for younger users.

Instagram/Facebook

Last October, Instagram announced new protections for young users that will be automatically applied to users under the age of 18. These are expected to roll out to Irish users early this year.

With the new restrictions in place, underage users will only see content on the platform equivalent to a PG-13 film, and they won't be able to opt out of the controls without parental permission.

Instagram has previously limited contact with teen users, putting the settings under the control of parents. For 13- to 17-year-olds, the settings were automatically applied to protect them from unsuitable content and unwanted contact.

Those aged 16 and 17 have more limited protective measures in place and can change their own settings unless the account is supervised by a parent.

The key thing is supervision of the account by a parent, which is optional and automatically removed when a user turns 18. Supervision allows parents to adjust settings for content and time spent on the platform, while also viewing which accounts their teen is chatting with.

As Meta companies, Instagram and Facebook have largely similar protections for teen users, with more stringent protections for younger users and optional limitations for older teens. Under-16s have accounts set to private by default, can only message friends or accounts they have previously interacted with, and have potentially offensive comments and message requests hidden.

For both platforms, sleep mode kicks in at 10pm, muting notifications and sending an auto-reply to messages until 7am. Teen accounts are also nudged after 60 minutes to close the app; the time limit is cumulative throughout the day.

TikTok

ByteDance-owned TikTok hit the headlines last year when it said it would bring in meditation breaks for younger users to remind them to take a break from scrolling.

But it has put other measures in place to help protect teens from inappropriate content and contact. For example, users under 16 years of age cannot send direct messages on TikTok, and comments are automatically restricted to followers who the teen follows back.

There are similar controls on who can reuse a teen’s content to create videos through the duet and stitch features on TikTok.

Older teenagers will find more leeway on TikTok, with the ability to choose between public and private accounts, a choice that opens up some of the settings such as content reuse and direct messaging.

A group of teenage friends watch a TikTok video in Melbourne, Australia, where children under 16 are being weaned off the likes of TikTok, Snapchat, YouTube and Instagram with a new law.  Photograph: Matthew Abbott /The New York Times

The Family Pairing feature gives parents and guardians the ability to customise safety settings for teenagers' accounts, block accounts, and control whether the teen's account is discoverable and can be recommended to others. While teens can increase the security settings on their accounts, they cannot reduce any security measures that have been imposed by parents.

And there are plenty of settings to choose from. Not only can you filter topics from the feed, but you can also limit time spent on the app across all their devices, with a randomised passcode needed to override it.

TikTok’s default setting is an hour for those aged 13 to 17 years old, but parents can adjust that. Times away can also be scheduled – study breaks, exam time and so on – although this can be overridden with a code supplied by a supervising account. Parents can also see how much time their teen is spending on the app, and how many times it is accessed each day.

Notifications, the biggest pull for apps, are also included in these teen safety measures. By default, TikTok mutes push notifications between 9pm and 8am for 13- to 15-year-olds, while 16- and 17-year-olds get an additional hour in the evening. If that isn't enough, you can mute the notifications at other times too.

Snapchat

Snapchat’s teen accounts apply automatic safety protections to users aged 13 to 17.

For example, teen accounts are set to private by default, keeping friends’ lists under wraps and limiting contact to friends they have accepted or people already in their contacts.

Tagging is similarly limited, with only friends able to tag each other in snaps, stories and spotlight videos. The idea is to steer younger users to have contact with real-world friends, and limit unwanted contact from strangers.

There have been some features that have caused concern over the years, including its Snap Map location sharing feature. That is automatically disabled for teen accounts, although they can change it themselves and share their location with friends. Teenagers can also hide live location data from public view.

Snap also has a Family Centre that gives parents visibility over their teens’ communication on the platform, showing their friends list, who they are communicating with and, if it is shared, their location. It also enables parents to manage privacy settings and restrict certain content from their teen’s feed.

The catch is that the teens must accept an invitation from parents to supervise the account.

On some things, the company differentiates between under-16s and older teenagers. For example, there is Public Profiles, a feature that allows older teens to share content more widely on Snapchat. They can post a public story or submit a video to Snapchat’s Spotlight section to highlight it for other viewers, and then save those updates to their Public Profile to showcase the posts. That feature is not available to under-16s.

Snap has also implemented extra tools to moderate and filter inappropriate content posted publicly on the platform.