Who should pay for the harm caused by children’s social media use?

Across countries from Türkiye to Australia, a new wave of regulation reflects a shared view that parental oversight alone cannot prevent the harms social media causes to children.

By Emre Yilancik
Across the world, governments take steps to regulate children’s social media use / AP

The global push to regulate children’s social media use is shifting responsibility away from parental control and towards platforms engineered to profit from children’s attention.

Across the world, as governments take steps to regulate children’s social media use, the debate around digital harm is undergoing a fundamental transformation. 

From Australia to the United Kingdom and from Norway to Türkiye, regulations being developed in many countries converge on a common point: the harms caused by children’s social media use cannot be prevented solely by parental supervision. 

Responsibility must be shared with the digital platforms that design, manage, and profit from online environments.

This shift does not absolve parents or public authorities of responsibility. Families and states remain central actors in child protection. 

However, governments are now adopting a clearer and more insistent stance that responsibility should be extended to the platforms that design, optimise, and commercialise digital environments.

Like other users, children do not simply “use” social media; they are continuously steered, incentivised, and kept on platforms by infrastructures optimised for profit maximisation. 

Infinite scrolling mechanisms, algorithmic recommendation systems, and reward loops are deliberately designed to sustain the attention economy.

Under these circumstances, expecting parents to act as the sole counterforce against trillion-dollar attention-economy machines is neither realistic nor fair.

A substantial body of multidisciplinary academic research on children and adolescents shows that excessive social media use is strongly linked to anxiety disorders, depression, sleep problems, attention deficits, body dysmorphia related to filtered visual content, and eating disorders.

Surveillance capitalism

Shoshana Zuboff’s concept of surveillance capitalism provides a powerful framework for understanding why responsibility is shifting upward. 

Social media platforms systematically gather and analyse user behaviour, forecast future actions, and monetise these insights through targeted advertising and behavioural guidance.

Within this system, children are not just vulnerable users but also valuable data subjects. 

Every swipe, pause, like, and emotional reaction produces behavioural surplus.

For children whose cognitive and emotional development is still under way, this data extraction carries far deeper and longer-lasting consequences than it does for adults.

Algorithms do not only learn what children like; they also shape what they will come to desire. 

For this reason, current regulations increasingly aim to impose limits on data collection from children, ban targeted advertising, require transparency in recommendation systems, and sanction architectural designs that maximise engagement.

Technofeudalism

Yanis Varoufakis’s idea of technofeudalism offers a complementary perspective for understanding this power asymmetry. 

Digital platforms increasingly resemble feudal estates rather than traditional capitalist firms. 

Users do not own these digital spaces; instead, they are granted conditional access under rules unilaterally determined by platform owners.

Children grow up within these privately governed ecosystems. 

Their socialisation, leisure time, and increasingly their educational experiences are shaped within largely opaque systems governed by algorithms rather than democratic oversight. 

In this context, parental authority is structurally disadvantaged in competing with continuous, invisible, and scalable algorithmic authority.

From this perspective, blaming parents is akin to holding land-bound serfs responsible for the ownership conditions of a feudal estate. 

States that recognise this power asymmetry are increasingly shifting their focus from the “subjects” to the “lords”.

Australia’s ban on social media use for those under 16, which came into force on 10 December 2025, stands as one of the most visible examples of this transformation. 

With limited exceptions, such as YouTube Kids, platforms are required to implement strict age-verification systems; failure to do so exposes them to fines of up to $32 million. 

Former Meta executive Stephen Scheeler’s remark that the company could earn this amount in less than two hours has raised serious questions about the deterrent effect of such penalties.

In the United Kingdom, the Online Safety Act grants the media regulator Ofcom the authority to fine companies up to 10 percent of their global turnover. 

Prime Minister Keir Starmer, among many other political actors, has openly stated that excessive screen time threatens children’s well-being.

Decisions by the European Parliament setting a minimum age of 16 for social media use and 13 for artificial intelligence tools and video platforms also reflect this trend. 

While France debates a complete ban for under-15s and a form of “digital curfew,” Spain is preparing regulations requiring parental consent for users under 16.

Norway, meanwhile, has acknowledged that its existing age restrictions are routinely circumvented and is developing stricter oversight mechanisms.

In the United States, the data privacy–centred age threshold of 13 remains in place, but stricter state-level restrictions face legal challenges on free speech grounds.

China applies some of the strictest digital controls on children. A daily screen-time limit of 40 minutes is imposed on children under 14, and digital access is completely restricted between 10:00 p.m. and 6:00 a.m. local time. 

TikTok, which has more than half a billion users globally, operates in China under a separate version called Douyin, which uses different algorithms and places greater emphasis on educational content.

French President Emmanuel Macron has openly stated that while China uses TikTok to weaken children's attention spans globally, it guides its own children through Douyin with more disciplined and educational content. 

This distinction illustrates that social media platforms function not only as commercial entities but also as instruments of cultural soft power.

The fact that Western social media platforms are banned in China further demonstrates that states increasingly view digital platforms not as free-market actors, but as strategic infrastructures.

Big Tech and ethics washing

Although the major technology companies commonly referred to as Big Tech have moved to comply with these measures, they remain broadly opposed to such restrictions. 

Following the ban's introduction, Australian Prime Minister Anthony Albanese stated that more than 4.7 million social media accounts belonging to users under 16 had been deactivated, deleted, or restricted. 

Big Tech representatives argue that age-verification technologies threaten privacy, violate children’s rights, and may even reduce online safety. 

However, these objections often conceal a deeper concern: the risk of losing an extremely profitable user base that generates free data and content.

Meta CEO Mark Zuckerberg, Instagram head Adam Mosseri, and Snapchat CEO Evan Spiegel are set to face trial in lawsuits alleging that they designed addictive products despite evidence of harm to young users. 

Meanwhile, voluntary commitments on ethics and safety increasingly resemble what can be described as “ethics washing”: the diffusion of responsibility without altering underlying business models.

Child protection policies in Türkiye

Debates surrounding children’s presence in digital environments in Türkiye reflect local sensitivities but largely align with global regulatory trends. 

A draft report titled “Threats and Risks Awaiting Our Children in Digital Environments,” prepared by the Child Rights Subcommittee of the Turkish Grand National Assembly’s Human Rights Inquiry Commission, outlines the framework of this approach.

The report includes proposals such as night-time access restrictions for those under 18, a social media ban for under-15s, limits on digital devices in school settings, strengthening counselling services, and special SIM card applications for children. 

Minister of Family and Social Services Mahinur Ozdemir Goktas has announced that legislative preparations for a social media regulation covering children under 15 will soon be brought before Parliament. 

The rationale behind the regulation includes rising levels of depression, anxiety, behavioural disorders, and the risk of contact with criminal networks through digital platforms. 

Minister Goktas has stated that children will not be treated as commercial resources or data pools by social media platforms. Rather than constituting an intervention against freedom of expression, these measures are presented as a strategic public policy aimed at protecting children from the structural risks of the digital ecosystem.

Within this framework, the “Cocuklar Guvende” (Children are Safe) digital platform provides guidance and notification mechanisms for children and parents. 

Authorities particularly emphasise that the fight against harmful content must be proactive and carried out by platforms themselves, rather than relying on reactive interventions.

Children are not the only group exposed to digital harm. Adults with low digital literacy, and especially older individuals, are also becoming increasingly vulnerable to disinformation, emotional polarisation, and behaviourally manipulative practices shaped by algorithmic steering.

Accordingly, platform regulation is not merely a pedagogical child-protection measure. 

It is also a struggle to fortify the digitalised public sphere against disinformation, to free democratic processes from algorithmic capture, and to preserve social cohesion.

If platforms design digital environments, train algorithms, and systematically profit from the attention economy, responsibility cannot be placed solely on individuals, families, or user preferences. 

Responsibility must be shared, and power must be brought under public oversight.