Protect Kids, Not Big Tech’s Market Position

April 17, 2023


Lawmakers across the political spectrum are increasingly focused on reining in the market power and abuses of Big Tech, particularly when it comes to social media. If the history of tech policy is one of absent regulation and firms dodging responsibility for “moving fast and breaking things,” then there’s perhaps no more illustrative example than children’s online safety.

For years, teenagers and even younger kids have run rampant on platforms such as Instagram, YouTube, and TikTok, with few meaningful barriers or restrictions. As the harms continue to multiply, parents and lawmakers increasingly view this sociological experiment as a failure.

With that backdrop, few causes could be as noble and important as protecting children from online dangers—and Congress is rightly focused on the issue. The Senate Judiciary Committee recently held a hearing on the subject, and multiple bills seek to address and prevent harms, including the Kids Online Safety Act (KOSA), led by Senators Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN).

The Senators should be commended for their leadership, and parents everywhere can hopefully look forward to real legislative action on this issue. With KOSA likely to be reintroduced in the 118th Congress, it’s paramount that the bill keep its focus on the problem: algorithmic amplification of user-generated content (UGC) that harms kids. Straying from that mission would run the risk of cementing Big Tech platforms’ dominance in the attention economy and weakening would-be competitors who don’t rely on UGC to entertain and inform children.

The environment is toxic. Late last year, the New York Times reported that TikTok starts recommending videos about “eating disorders and self-harm to 13-year-olds within 30 minutes of their joining the platform.” Viral trends have even led to deaths, including the “blackout challenge” that encouraged two girls, eight and nine years old, to choke themselves.

These are not just isolated incidents. A growing body of research shows that social media use is associated with sleep deprivation, anxiety, depression, low self-esteem, and suicidality. Facebook’s own internal research, as reported by the Wall Street Journal, found that Instagram worsens body image issues among a significant number of teenage girls using the platform. And U.S. Surgeon General Vivek Murthy said, based on the data he’s seen, that 13 is too young for social media use.

Social media poses unique problems for kids. These services are largely ad-supported and rely on UGC that is generally unvetted by humans before it’s posted. The platforms rely on artificial intelligence for moderation, and humans only intervene after the fact, often after the damage is already done. The companies have a financial incentive to addict their users: the more time teens spend scrolling, the more ads they see. And to keep kids staring at their screens, the platforms have an incentive to feed users content that hooks them, sending them down rabbit holes that serve up ever more extreme versions of that content.

Another key factor is Section 230, which immunizes the platforms from liability for harm caused by user-generated content, as long as the content is legal. So kids can watch videos that encourage them to hurt themselves, the social media companies’ algorithms can amplify and deliver similar videos, and the firms bear no responsibility if tragedy strikes.

KOSA seeks to close the rabbit hole by requiring platforms to provide protective tools to parents, and, given the harms caused to children, regulation is warranted. But policymakers should ensure that they don’t impose needless regulations on services that don’t rely on UGC and are fundamentally different from social media.

Consider streaming services. The libraries of services such as HBO Kids or Kidoodle are human-vetted and human-curated. Unlike in the Wild West of UGC, that review happens long before content is posted. Moreover, HBO and Kidoodle are the “speakers” of those videos, so they are legally responsible for the harms they might cause—they’re not shielded from liability by Section 230.

Beyond being human-curated and vetted, streaming libraries—from movies and television shows to music and even video games—are subject to longstanding rating systems that communicate to parents whether content may be age-appropriate. They also offer parental controls that can’t be easily replicated on social media platforms.

Kids spend an average of 107 minutes per day on TikTok and 67 minutes on YouTube. These numbers dwarf the time spent on even the most popular video streaming services, such as Netflix (48 minutes) and Disney+ (40 minutes). And while parents probably wish their kids spent less time online in general, not all screen time is created equal, and there is enough evidence to argue that shifting eyeballs from social media to human-curated streaming would be a societal benefit.

Not all algorithms are equal, either. TikTok’s may recommend videos posted by anyone in the world. A streaming service’s algorithm recommends only content that has been rated and vetted, not UGC. Put simply, a TikTok rabbit hole bears no resemblance to “you watched this cartoon, you may like this other cartoon.” And while KOSA, like any regulation of algorithms, might trigger First Amendment challenges, policymakers should still consider the universe of available content on a platform when evaluating the potential harms from algorithmic amplification.

At a time when the White House, regulatory agencies, Congress, and state governments are seeking to curb the market power of tech giants, policymakers should take care not to weaken the market position of streaming services and other platforms that have not facilitated the harms caused by the amplification of unvetted UGC.

If UGC is causing harm to kids, then that’s where legislation should be focused. Overbroad regulations that drag down Big Tech’s competitors only strengthen the social media giants. We should avoid setting a precedent that when certain firms abuse power, their competitors must pay a price.
