
Every technology has its tradeoffs. Over the past two decades, we’ve seen the light and dark sides of major tech platforms, from social media to app stores to AI chatbots. The economic and societal benefits of these advancements shouldn’t be discounted, but neither should the harms that certain technologies and use cases inflict on children.
Over 90 percent of U.S. parents are concerned about children’s online safety—more than any other child safety issue. The majority of parents think tech companies and politicians aren’t doing enough and support new laws to keep kids safe, and they have good reason to feel that way.
Mounting empirical evidence is confirming what is intuitive to most. A recent study by the American Academy of Pediatrics found that smartphone use among preteens is associated with higher risk of depression, obesity, and sleep deprivation. It’s just the latest in a series of alarm bells for children’s online safety.
Forty percent of 8- to 12-year-olds, and up to 95 percent of 13- to 17-year-olds, use social media, which is likely contributing to rising rates of depression and anxiety among young people. Two-thirds of teenagers say they’re on social media every day, and one-third say they use it “almost constantly.” Teens average over four-and-a-half hours on social media per day, and children who spend more than three hours daily on social media face double the risk of depression and anxiety. Youth suicide rates rose 62 percent from 2007 to 2021.
The percentage of children exposed to pornography has climbed steadily over the past two decades despite the widespread availability of content filtering and parental control tools. Pornography use among children is linked to negative developmental outcomes, including increased sexual aggression, relationship problems, and social isolation. Seventy-three percent of teens ages 13–17 have watched online pornography, and the average age of first exposure is 12. Some 15 percent were first exposed at age 10 or younger, and 52 percent have encountered violent content, including depictions of rape.
This is what market failure looks like, and one must be divorced from reality to believe that the status quo does not warrant significant government action.
Thankfully, the House Energy and Commerce Committee is not in that camp. The Subcommittee on Commerce, Manufacturing, and Trade recently held a hearing on—and will soon mark up—a range of bills to address the challenges facing parents and children today. No single bill can or should be the proverbial silver bullet. A complex problem requires layered solutions, and the following will focus on a few key bills that the committee and the full House should approve.
Shielding Children’s Retinas from Egregious Exposure on the Net (SCREEN) Act
Constitutional challenges have long plagued efforts to shield children from indecent and obscene content online, but the Supreme Court’s ruling in Paxton made clear that requiring adult websites to verify the ages of their users can be consistent with the First Amendment. Half of U.S. states now require age verification to access online pornography; the SCREEN Act would establish a national standard while giving providers flexibility in how they comply with the mandate.
While critics warn that the legislation creates privacy and data security risks, the bill includes specific measures to protect user data. Minimization requirements limit data collection to only what is necessary to verify age, and those data must be deleted once that process is complete. The bill also allows site operators to contract with third parties that specialize in privacy-protecting age verification, which would further separate sensitive information from pornography providers, including those that lack the expertise to conduct age verification themselves.
Some critics cannot be swayed, arguing that adults’ free speech rights supersede the compelling government interests in this matter.
Policymakers should reject that view just as the Supreme Court did in Paxton. Eighty-three percent of Americans support a federal age verification requirement for adult websites, and 86 percent of parents believe it is “too easy” for children to access online pornography. The vast majority of sensible Americans would agree that the need to address widespread harms to children from exposure to obscenity must outweigh the sensitivities of some adult users to age verification—especially when those sensitivities are addressed thoughtfully with privacy and security protections in the legislation.
Kids Online Safety Act (KOSA)
Exposure to obscenity is not the only problem facing children and parents today. Social media and other apps have promoted self-harm, illegal drugs, eating disorders, and other problematic content to children based on design features that aim to maximize engagement. Existing parental controls are simply not equipped to combat platforms that have a financial incentive to deprioritize the well-being of child users.
Unsealed court documents reveal that in 2020, a Meta employee said, “Child safety is an explicit non-goal this half.” Internal Meta research found that Instagram’s recommendation feature suggested 1.4 million potentially inappropriate adults to teenage users in a single day. A TikTok internal report acknowledged that “minors do not have executive mental function to control their screen time,” yet executives rejected screen time limits because they would mean “fewer ads” and hurt revenue.
KOSA helps address this problem head on. It requires platforms likely to be used by minors to, by default, implement the most stringent safeguards for users under 18 years of age. Covered entities would also have to provide enhanced controls to parents that enable them to disable addictive design features, such as autoplay, infinite scroll, and certain algorithmic recommendation systems. Platforms would also be barred from advertising illegal products to minors.
This bill helps address a longstanding problem: platforms were not designed with children’s safety in mind, and despite mounting harms, that safety has never become a priority for them. Companies have failed to demonstrate that, absent government action, they will adequately address these problems. KOSA would make them do the right thing.
As with the SCREEN Act, no amount of careful tailoring will ever be enough for some detractors. Several organizations continue to warn that any legislation to address design flaws in online platforms will inevitably lead to censorship of lawful content. Against the backdrop of rising rates of teen suicide, eating disorders, depression, and more, the answer cannot be that Congress should do nothing because some platforms may implement a law’s requirements poorly. That said, Congress is capable of walking and chewing gum at the same time. Policymakers should take good-faith critiques to heart and consider amendments in marking up the bill to further address free speech, privacy, and security concerns.
Ultimately, as with age verification requirements, the potential for some platforms—whose products are already harming children—to sloppily comply with legislation is not a good argument against legislating. No one would suggest repealing traffic laws because some drivers are wrongly ticketed.
App Store Accountability Act (ASAA)
Google and Apple’s operating systems power 99 percent of American smartphones. Through their app stores, the two companies are gatekeepers to the mobile internet, and that position should come with a responsibility to safeguard the welfare of young smartphone users. Given their track record of failing to protect children from harms in their app stores, legislation is needed.
ASAA would require Google and Apple to verify a user’s age when the user creates an account. If the user is a minor, the account must be affiliated with a parent’s account, and the app stores would have to obtain verifiable consent from that parental account before the minor can download apps or make in-app purchases. The bill would also require Google and Apple to protect user data and prominently display age-rating information.
By putting the onus on well-resourced firms that already have age data on users, ASAA minimizes the burden on small businesses while helping them comply with current and future laws to protect children at the platform layer. Consider the Children’s Online Privacy Protection Act of 1998 (COPPA), which has been essentially toothless for decades. The law applies only when platforms have “actual knowledge” that they’re collecting data from children under 13. Platforms exploit this by deliberately avoiding age verification to maintain plausible deniability.
ASAA would create actual knowledge that cannot be denied. The bill also complements and enhances KOSA, as apps that know a user is a minor, thanks to ASAA, would have to enable the strictest settings to protect kids. Under ASAA, COPPA compliance in the mobile ecosystem becomes automatic and enforceable.
Critics continue to claim that age verification at the app store layer is impossible, or that it cannot be done in a way that preserves privacy. That claim is belied by Apple and Google’s own compliance with ASAA in the states where it has passed. (For more on the technical feasibility of privacy-preserving age verification in app stores, see FAI’s paper from last year, “On the Internet, No One Knows You’re a Dog: Examining the Feasibility of Privacy-Preserving Age Verification Online.”)
Conclusion
These three bills work as a comprehensive system. ASAA establishes the foundational infrastructure—parents verify their children's ages once at the app store level, and that protection follows them everywhere on smartphones. KOSA uses that age information to require platforms to enable safe defaults for known minors. The SCREEN Act addresses pornography accessed through web browsers that bypass app store controls entirely. Together, they create layered, redundant protections covering different access points. They also work well with other bills under consideration by the committee, including ones that address targeted advertising, illegal drug sales, data brokerage, and more.
The committee’s effort to protect children online has elicited some familiar pushback. Parents should simply use existing tools, critics say. Industry self-regulation is working. Age verification cannot be done without violating privacy. And these bills may all be unconstitutional anyway.
Children’s online safety is not a problem that can be solved by telling parents to simply parent harder. Even wealthy, two-parent households are struggling in this environment; imagine what it’s like for a single parent working two jobs. The evidence is clear that self-regulation has failed, that today’s parental controls are inadequate, that age verification can protect privacy, and that children’s safety can be addressed in a constitutional manner.
After years of federal inaction, the House Energy and Commerce Committee is poised to meet the moment. The bills discussed above should be among those that pass, and let’s hope they make it into law as quickly as possible.