Snapchat, TikTok and YouTube under fire over child safety

Policy representatives from each company appeared before the US Congress to discuss children's online safety, marking the first time Snapchat and TikTok have appeared at a major tech hearing.

AP

Executives from Snapchat, TikTok and YouTube faced a barrage of questions and accusations from US lawmakers on Tuesday over whether they are doing enough to protect children on their platforms, as the fallout from revelations about teen mental health on Instagram continues.

“Everything you do is to add more eyeballs, especially kids’, and keep them on your platforms for longer,” said Democrat Senator Richard Blumenthal, who heads the Senate Commerce subcommittee on consumer protection.

“This is for Big Tech a Big Tobacco moment… It is a moment of reckoning. There will be accountability. This time is different.”

The hearing featured testimony from Snapchat’s VP of Global Public Policy Jennifer Stout, TikTok’s VP and head of Public Policy Michael Beckerman and Leslie Miller, who leads government affairs and public policy at YouTube.

The senate panel wants to learn how algorithms and product designs can magnify harm to children, including fostering addiction and intrusions of privacy, in order to develop appropriate regulatory safeguards for child protection.

Both YouTube and TikTok called for the creation of comprehensive laws around online privacy, with Beckerman deeming a legal framework for national privacy laws “overdue.”

Democrat Senator Ed Markey pushed for what he called a “privacy bill of rights for the 21st century” during the hearing, pointing to his proposed amendments to the Children and Teens’ Online Privacy Protection Act (COPPA) that would bolster protections for young users on social media.

The amended COPPA would ban tech companies from collecting data from users between the ages of 13 and 15 without explicit consent, create an “eraser button” that would make it easy to delete minors’ personal data, and more broadly restrict information gathering from the start.

Markey pushed each of the company reps on whether they would support the COPPA changes and hammered their refusal to take a firm stance on the measure.

Markey and Blumenthal also highlighted last month’s reintroduction of the KIDS (Kids Internet Design and Safety) Act. The bill would protect online users under 16 from engagement-juicing features like auto play, push alerts and like buttons, as well as ban influencer marketing and create a reporting system for tackling harmful content.

All three companies committed to sharing internal research on how their products impact kids – an issue that has recently come to the forefront after a trove of leaked Facebook documents revealed how the social media giant negatively affected teenagers’ mental health. Some documents showed that teen girls reported Instagram made their body image issues worse.

Last month, Facebook executive Antigone Davis testified in Congress, facing accusations from senators that the company buried internal research about how its products could harm children.

Thus far, Snapchat and TikTok – which are popular with millions of teens – have faced less scrutiny from authorities, with Tuesday marking the first time the two companies have appeared at a major tech hearing.

“Facebook is just not the only game in town,” said Evelyn Douek, a Harvard Law School lecturer who studies the regulation of online speech. “If we’re going to talk about teen users, we should talk about the platforms that teens actually use, which is TikTok, Snapchat and YouTube.”

Snapchat says 90 percent of 13 to 24-year-olds in the US use its service, and it reported 306 million daily users in the July–September quarter.

TikTok reports that it has over 1 billion monthly users, though it does not break them down by age.