Can governments regulate how children access social media?

The US state of Utah has brought in legislation to restrict children's screen time on social media. But can it solve the growing epidemic of mental health issues among minors linked to social media use?


For years now, activists and authorities have grappled with the risks associated with children being granted unfettered access to social media.

Minors as young as 13 have shown tendencies toward self-harm or violence and have faced mental health issues after encountering cyberbullies and hackers.

Last week, Utah became the first US state to require social media sites to get parental consent for accounts used by under-18s, placing the burden on platforms like Instagram and TikTok to verify the age of their users.

The law, which takes effect in March 2024, was brought in response to fears over growing youth addiction to social media and to security risks such as exploitation and collection of children's personal data.

Anxiety, depression, and low self-esteem are the main issues among teenagers addicted to social media.

Experts say that addiction to social media also leads to reduced productivity and social isolation.

Tougher restrictions

According to an empirical report by the Society for Research in Child Development, kids get their first mobile phone at an average age of eleven and a half.

Social networking companies such as Facebook, Snapchat and TikTok allow children aged 13 and above to download their apps.

But doctors and experts say the current minimum age is too low and should be raised.

“I, personally, based on the data I’ve seen, believe that 13 is too early. It’s a time where it’s really important for us to be thoughtful about what’s going into how they think about their own self-worth and their relationships and the skewed and often distorted environment of social media often does a disservice to many of those children,” the US Surgeon General Vivek Murthy said in an interview with CNN. 

Children’s advocacy groups have welcomed Utah’s move, which may help curb some addictive habits among teenagers.

"It adds momentum for other states to hold social media companies accountable to ensure kids across the country are protected online," said Jim Steyer, Common Sense Media's founder and CEO.

Recently, the popular video-sharing app TikTok decided to limit daily screen time to 60 minutes for users under 18.

Utah sets an example for other US states, and for other countries weighing whether they can impose similar rules on tech giants.

Is it possible?

The main question remains what regulators should do to curb the harmful effects of social media applications.

A common understanding is that joint efforts by parents and state regulators can help protect children’s safety and health.

On the one hand, government bodies can work closely with tech companies; on the other, parents can keep watch over their own children.

Regulators can set age restrictions, but these remain open to cheating: children can easily fake their age or other information requested by social media apps.

According to research conducted by Ofcom, the UK communications watchdog, a third of social media users aged between 8 and 17 have signed up with a false adult age.

Parental intervention can also help: parents can monitor their child’s social media use by checking their accounts regularly or installing monitoring software. But this may create a privacy problem between children and parents.

The Utah law also imposes a social media curfew between 22:30 and 06:30 unless adjusted by parents, and bars companies from collecting children’s data for ads and related businesses.

It also remains unclear how governments will enforce these measures, including how effective age verification will be, given that social media apps can be accessed globally. There are thousands of active social media applications, and more emerge every day.

Route 6