Even young children now have a level of access to the online world that was unthinkable 20 years ago.
With this increased access comes a growing level of concern among parents about how they can protect their children from online harms that include cyberbullying, online grooming and (s)extortion. The popular apps and games children are using are designed to be addictive, and even with the aid of parental controls, it can be a lot for any parent to grapple with.
As we mark the 20th anniversary of Safer Internet Day this week, we also note – with more mixed emotions – the 20th anniversary of the social media platform Facebook. No one could have predicted the immense changes to the online landscape during this time. Clearly there was recognition from an early stage that children and young people would need to be protected in these online spaces, but progress has been painfully slow. The reality is that the safety of younger internet users remains low on the priority list for all of the most popular online services, especially when they are generating revenue from those users.
A recent Harvard study estimated that Meta, YouTube, TikTok, Snapchat and X made about $11 billion (€10.2 billion) in 2022 in the US alone on the back of targeted advertising to their younger users – including children under the age of 12. That perhaps sheds some light on why they do not generally put the best interests of children first. Recent whistleblower revelations from a former Instagram employee only reinforced this notion. Arturo Béjar argued that Instagram did not act to protect its child users, even when it knew that its service was sending harmful content of a sexual nature to them.
In 2023, there were strong statements from the US surgeon general Dr Vivek Murthy about the “profound risk of harm” to the mental health of social media’s younger users. Also in 2023, we saw the filing of a master complaint combining 400 lawsuits from families across the US with children impacted by online harm, taken against social media companies for their “role in creating a youth mental health crisis through their addictive services”.
While it has subsequently taken some steps to offer additional safeguards, Meta arguably took a step back from its responsibilities in 2023 by encrypting its Messenger service – a move that will make it much harder for the company to detect child sexual abuse material. Chris Philp, the UK’s policing minister, described it as “grossly irresponsible”, saying: “Meta is putting profit before the safety of protecting children from predatory sexual abusers.”
The internet offers children unparalleled opportunities to learn, create and socialise, but there are also inherent risks. We have seen some positive changes in terms of legislation and policy initiatives, but it all falls far short of where it needs to be.
Online grooming and (s)extortion have increased alarmingly both during and since the pandemic, showing that children are at ever-greater risk of being contacted online with harmful intent. Hotline.ie recorded its highest number of reports in 2022, 94 per cent of which contained suspected child sexual abuse material. The proliferation of “self-generated” child sexual imagery online is also a significant concern, with the UK’s Internet Watch Foundation recording a 66 per cent increase in such imagery featuring children under the age of 10 in its latest findings.
How can we reform the borderless online world for our children and future generations? Ultimately, we need a societal shift. We must approach children’s online lives with the same care, attention and supervision that we apply to their offline lives – this is the fundamental message of our Same Rules Apply campaign launching this Safer Internet Day. Parental engagement in children’s online lives is vital to protecting them and mitigating online risks: children with less parental supervision and unrestricted online access are more vulnerable.
Parenting is one part of the solution, but the burden of responsibility to support children online cannot fall on parents alone. Some communities have taken matters into their own hands by trying to ban or limit the use of smartphones in schools on a voluntary basis, but we need to go further by putting in place a mandatory online safety education programme in schools, so that the conversation around online risks and opportunities starts at home and is supported in school.
A rock-solid regulatory framework is also an essential part of the solution. We need robust regulation of Big Tech. The onus has to be on tech companies to introduce age-assurance measures that keep underage users off their services and better protect younger ones, to address the addictive nature of those services for children and adults alike, and to fix the reality that their algorithm-driven content recommendations can cause real harm to children and other vulnerable users.
It is important to acknowledge that Ireland is making some progress in this regard with the establishment of our first Online Safety Commissioner. We will soon see the first legally binding Online Safety Code established, and it is the fervent hope of CyberSafeKids, and of many other organisations working to protect children online, that it will show real teeth in holding these companies to account in relation to their child users. We also have the Digital Services Act in place, obliging all platforms to ensure minors’ privacy, safety and security.
Will these changes to the regulatory landscape make a true difference in the lives of our children online over the next 20 years? They have to.
Alex Cooney is co-founder and CEO of CyberSafeKids. More about the Same Rules Apply campaign can be found at cybersafekids.ie/samerulesapply