The Spectator

Why won’t phone companies stop kids using social media?

When it comes to social media, parents find it difficult enough to keep up with their offspring’s online world. What hope, then, do governments and regulators have of keeping up with digital technology?

This week, Ofcom has announced a new code of practice which aims to use powers granted under the Online Safety Act in order to ‘tame aggressive algorithms’. Such a move seems well-meaning, but official bodies will always be several steps behind the latest online trends. The onerous-sounding new rules will probably end up restricting the online freedom of less savvy adults more than of children. 

That’s not to say the phone-based childhood is not a significant social concern. WhatsApp, Snapchat and Instagram are new worlds which parents are ill-equipped to help their children navigate. Recent research from Ofcom showed that a quarter of British children have a smartphone at the age of seven. By the age of 12, 90 per cent of them do.

Adults fear that social media sucks children into a digital vortex, encouraging them to scroll instead of read and making them ever more anxious, depressed and isolated in the real world. Yet parents who seek to restrict their children’s phone usage often find that they push back with the fair argument that friendship is digital now. Not being on social media means being a pariah. Who would want to do that to their child?

Jonathan Haidt, a psychologist at New York University, argues that smartphones fuel misery and paranoia and that image-based social media teaches children to judge themselves and their peers by appearance. He also links smartphones to a surge in mental health problems in young people in America, especially in girls who use Instagram.

Britain is not short of evidence on this score. The number of children seen by NHS mental health services has risen by a third in just the past four years. The NHS estimates that a fifth of eight- to 16-year-olds now have a probable mental health disorder, up from 12 per cent six years ago. Just 0.7 per cent of British university students reported mental health issues in 2010; now 5 per cent do.

The Children’s Society compiles annual research into how happy young people are about their friends, appearance and school: all the scores have been falling for years. The Prince’s Trust, which also tracks happiness and confidence among the young, reports similar declines.

For Professor Haidt, the advent of social media (rather than smartphones per se) has led to what he calls ‘the great rewiring of childhood’ and the results are calamitous. Others argue that, thanks in part to social media, talking about mental illness carries less stigma now, so everyone – young and old – is more likely to report issues; what we’re seeing is an explosion of candour. That, however, does not explain the rise in hospital admissions for intentional self-harm in the under-35s, up by a third over ten years.

We have no clear answers. Mental health issues among adults, which are driving the current surge in worklessness, are a problem for public health officials. Still, there is a strong case for restricting children’s use of social media until we have a better understanding of its implications. Such decisions should not be taken lightly in a free society, but growing parental concern, along with substantial evidence, does make the case for further regulation.

The tech firms are never short of ploys to fob off campaigning parents or to keep regulators at bay. Instagram and TikTok claim to have brought in age-verification tools to analyse ‘selfies’ and estimate how old the users are. But the firms have no incentive to enforce age limits, lest they lose their share of the pre-teen market to rivals. TikTok, Instagram, Snapchat, Facebook and YouTube are all limited to the over-13s – but in reality just over half of British children are using social media by the age of nine.

There is an easier solution than the one Ofcom proposes. Smartphone providers should be obliged to demand age verification, and underage users should simply be unable to download the relevant apps. No such law is watertight, of course. Children may figure out how to fake an online ID, but responsible parents should be able to check such details, so verification at the smartphone level would be a useful tool.

Several US states are moving forward with heavier-handed legislation. Florida has just outlawed all social media for under-14s. In Texas, Louisiana and Utah, the law has been changed to say that under-18s need parental consent before creating social-media accounts. It’s easy to foresee calls for such measures over here, but it would be better if smartphone companies adopted technology that enforced the age limits that already exist, without the need for state intervention.

A decade or so ago, parents had no idea – no one did – about the potential negative effects of these addictive apps on young minds. But as the picture becomes clearer, so does the responsibility of tech companies – and the threat of regulation becomes more necessary. No social media company should be trying to harness the pre-teen market. Asking phone companies to verify the age of users is not a radical step: the British government should ask them to take it.