
Will There Be Major AI Legislation Before 2025?

August 11, 2023


This piece originally appeared at Second Best.

What is the probability of major AI legislation passing in the next 12-18 months? It may not be high in absolute terms, but I bet it’s higher than most people think.

We’re conditioned to be cynical about the speed of Congress, and for good reason. From immigration reform to national privacy legislation, there are wonks in Washington who’ve pushed the same issues for decades, writing and re-writing suspiciously similar reports as they wait for the political stars to align. It’s basically the Myth of Sisyphus but with drunken Sunday brunches.

Occasionally, however, Congress moves shockingly fast. Consider that the $2.2 trillion CARES Act was drafted and passed just two weeks after the WHO formally declared a pandemic, and only a few months after Covid first appeared as a blip on public health officials’ radars.

I think we’re still in the January 2020 stage of AI awareness. Most people have heard something about AI on the evening news but are going about their business as usual. This isn’t helped by NGO and media types who talk about short AGI timelines in subtly mocking tones. Much as they mocked Andreessen Horowitz’s February 2020 policy against handshaking, they construe AGI worries as just the latest hype from SF tech bros: Web3 all over again. Meanwhile, a handful of Balaji-esque Twitter prophets have already begun doomsday prepping.

Others are aware of the trend line but brush off AI as Ebola-scale at best. After all, exponentials are always secretly sigmoidal, right? Right?!? Yes, technically. However, there’s no sign that AI progress will asymptote anytime soon, so be careful that you’re not making the same mistake with AI that the International Energy Agency routinely makes with its solar projections:

AI, like solar, is nowhere near its plateau. Credit: Ramez Naam.
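To see why the sigmoid rejoinder carries so little information, here’s a minimal sketch (my own illustration, with hypothetical parameters): far from its plateau, a logistic curve is nearly indistinguishable from pure exponential growth, so exponential-looking progress tells you almost nothing about where, or whether, it flattens.

```python
import numpy as np

# A minimal sketch: far below its (hypothetical) plateau K, logistic
# growth is nearly indistinguishable from pure exponential growth.
t = np.linspace(0, 10, 11)   # time steps
r, K = 1.0, 1e6              # growth rate and assumed carrying capacity

exponential = np.exp(r * t)                    # x(0) = 1, no plateau
logistic = K / (1 + (K - 1) * np.exp(-r * t))  # x(0) = 1, plateau at K

# The relative gap stays small while x << K:
gap = np.abs(logistic - exponential) / exponential
print(f"max relative gap: {gap.max():.1%}")    # ~2% over this horizon
```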

Others still can see the wave coming but suspect, due to false historical analogies, that we’ll adapt to AGI as easily as the seasonal flu. They believe in a Great Barrington Declaration for AI: instead of trying to “bend the curve,” let AI rip and hope for herd immunity. This is a group that includes many of my techno-libertarian friends, and I sympathize with their Promethean spirit. I just wish they’d read the waiver more carefully before rushing human civilization into a world-historical Titan submersible.

By now, there have been umpteen Congressional hearings on AI, so we’ve arguably transitioned from January 2020 to late February / early March. Nonetheless, the scale of what’s happening still clearly hasn’t registered, or else Marsha Blackburn would be less worried about AI’s impact on Nashville’s music industry. The nearness of AGI simply isn’t tangible enough. Thus, when Sam Altman tells Congress that superintelligence is just around the corner, they may take him seriously, but they still aren’t taking him literally.

As Emad Mostaque put it, the “Tom Hanks got Covid” moment for AI has yet to arrive. But it’s coming, and when it does, it will be the Bayesian update heard around the world. I think this update could happen within the next 12-18 months, and will serve as a starting pistol for Capitol Hill.


It is hard to know in advance what the trigger will be, but it’s over-determined. EpochAI’s baseline model suggests there is a 50% chance of transformative AI arriving before 2036, with a modal forecast of 2029. Yet as those dates approach, it’s not as if nothing else will be happening. Congress woke up to the pandemic in March 2020, well before deaths peaked, and I think they’ll likewise wake up to AGI sometime before it’s literally at our doorstep.

By this time next year, we’ll have multimodal GPT-5-level models, LLMs hitting every enterprise, more reliable AutoGPTs, an explosion in deepfakes, scams, and cyberattacks, and many other known unknowns. These will generate ad hoc debates, but they may also finally aggregate into the existential realization — the bitter lesson — that many of us had years ago, but which has been regrettably slow to diffuse.

Bet on it

To put my money where my mouth is, I’m making a public bet with Ezra Brodey that Congress will pass “major AI regulation” before EOY 2024 (with some wiggle room for a bill passed in the December rush to get to the President’s desk):

Should the U.S. Congress pass, and the President subsequently sign, a bill by January 10, 2025 11:59 PM EST, that meets one or more of the following criteria for "Major AI Regulation", irrespective of the bill's specified effective date, Ezra Brodey agrees to pay Samuel Hammond $400. Should the event not occur, Samuel Hammond agrees to pay Ezra Brodey $800.

Note, I’m not betting on the “major AI regulation” being good regulation, nor does it have to be comprehensive. If anything, I suspect Congress will mess it up by default, just like they did with Covid testing. Instead, as the contract indicates, it will be sufficient for Congress to pass a law affecting any one of a handful of non-trivial AI policy domains, from “Privacy and Data Rights” to “Training Runs and Compute.”

The full terms of the bet are outlined here.
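As a quick sanity check on the implied odds (simple arithmetic on the stakes, not part of the bet’s terms): risking $800 to win $400 only has positive expected value if you put the probability of the event above two-thirds, so the bet marks roughly where our credences diverge.

```python
# Break-even probability implied by the bet's asymmetric stakes.
win, lose = 400, 800  # win $400 if it resolves yes, pay $800 if not

# Expected value is p * win - (1 - p) * lose, which is zero when:
p_breakeven = lose / (win + lose)
print(f"break-even probability: {p_breakeven:.1%}")  # 66.7%
```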

The case against the case against Congress acting

The case against Congress taking up major AI legislation before 2025 is simple:

  • The base-rate expectation for Congress is always that they do nothing, especially if they haven’t had time to build a consensus.
  • Congress still hasn't finished the NDAA, FAA reauthorization, and other must-pass items, and thus won't have the bandwidth for something totally new.
  • 2024 is an election year, which historically hinders legislative productivity because of constraints on bipartisanship and time lost to campaigning.

Here’s my counter case:

  • There is a strong autoregressive bias in how people think about major policy change.

Most people assume the future will look roughly like the recent past. That’s a reasonable heuristic, but on Capitol Hill the reality is closer to the old adage about war: “long periods of boredom punctuated by moments of sheer terror.” This makes base-rate forecasts of Congressional productivity unreliable for Talebian / Black Swan reasons. That is, even if “no change” is the correct point estimate, the underlying probability distribution has multiple modes, fat tails, etc. From the Great Financial Crisis to the Arab Spring, change happens slowly, then all at once.
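To make the fat-tail point concrete, here’s a toy simulation (illustrative parameters of my own choosing, not an empirical model): if most years are quiet but a minority are crisis years, the modal outcome is “nothing happens” even while a sizable share of years produce major legislation.

```python
import numpy as np

# Toy model of Congressional productivity (hypothetical parameters):
# 90% "quiet" years averaging 0.2 major bills, 10% "crisis" years averaging 5.
rng = np.random.default_rng(0)
n = 100_000

crisis = rng.random(n) < 0.10
bills = np.where(crisis, rng.poisson(5.0, n), rng.poisson(0.2, n))

print("modal outcome:", np.bincount(bills).argmax())  # 0 -- "nothing happens"
print("mean bills/year:", bills.mean())               # ~0.68, driven by the tail
print("P(at least one):", (bills >= 1).mean())        # ~26%, far above the mode
```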

Change is also endogenous to people's expectations of change. If it is widely believed that nothing is likely to happen, that belief can become a self-fulfilling prophecy, and vice versa.

  • Congress can turn on a dime if leadership wants it to.

Remember, Congress was also in the middle of debating the NDAA circa April 2020. Covid derailed it, along with much else scheduled at the time. Relief legislation not only took precedence, but also affected the NDAA itself, which became a vehicle for pandemic supply-chain policies and the like.

The NDAA was ultimately signed into law on January 1, 2021, and incidentally contained the first law concerning AI: the “National Artificial Intelligence Initiative Act.” The 2024 NDAA will likewise be a vehicle for other AI policy reforms this Congress, including reports that could lead to policy actions in the 2025 NDAA. Indeed, the version of the NDAA that the Senate passed in July contains 131 mentions of “artificial intelligence,” covering everything from a DoD deepfake detection competition to the establishment of a Chief Artificial Intelligence Officer at the Department of State.

The broader point, though, is that circa January 2020, no one would have predicted that Congress was about to do an about-face on its scheduled priorities and pass several multi-trillion dollar pandemic bills. Those who said "I doubt it; they still have to vote on reauthorizing the Small Business Administration" were failing to account for endogeneity. After all, why would Congress prioritize reauthorizing something that transformative AI is about to render obsolete?

  • 2020 was also the start of election season, but that didn’t matter.

While fears that another stimmy check would hand Trump reelection did eventually affect Covid relief debates, 2020 as a whole was one of the most legislatively productive sessions of Congress in modern history. Indeed, being an election year arguably cuts both ways. With Democrats poised to lose their Senate majority, they’ll feel pressure to accelerate their vision for AI regulation before Republicans take over, especially if Trump gets the nomination and thus a coin-flip chance at the Presidency. This may be the subtext to Chuck Schumer’s “SAFE AI Innovation” framework, which he pitched as an alternative to the too-slow committee process. Too slow because AI is moving too quickly, or too slow because he wants to get something done before losing his leadership?

Being an election year also argues for passing something well ahead of November 5, 2024, lest deepfakes, advanced robocallers, and AI-enabled voter manipulation run wild. If lawmakers have by then updated on the nearness of AGI, we could see a scramble to pass a bundle of AI-related reforms that ride along with election-integrity measures.

Some other points to consider:

  • Metaculus currently gives the question "Before 2025, will laws be in place requiring that AI systems that emulate humans must reveal to people that they are AI?" a 75% chance of resolving "yes." The probability is so high because the EU AI Act counts, and that's expected to pass around April 2024. If there's a Brussels effect or other extraterritorial compliance fallout, Congress may be pushed to act on a similar timeline.
  • Chip production is ramping up in a big way, and by 2024 the bottlenecks in memory and advanced packaging will have eased. With NVIDIA selling billions of dollars’ worth of A800s to China, there may be an urgency to react before the orders are fulfilled (there’s typically a 6+ month turnaround from when an order is received to when it’s shipped). The White House and BIS could act unilaterally (and I hope they do), but Congress will feel growing pressure to weigh in as well.
  • While it’s always hard to pass laws that affect a rich and powerful industry, the AI companies are asking for regulation, and the political power of the semiconductor industry is probably overstated. Billions of dollars in chip subsidies don’t come for free, and have implicitly subordinated the sector to natsec concerns. Indeed, the most powerful lobbies in Congress aren't always what you'd expect: car dealerships and restaurant franchises arguably have more clout than NVIDIA simply because they're bigger employers and have a footprint in every Congressional district.
  • More fundamentally, part of "updating to AGI" will be realizing that the existing political economy equilibrium is about to flip. Some asset prices will collapse and others will go to the moon; some sectors will start to see automation that shifts their employment intensity, and so forth. This all implies the relative power of incumbent interest groups will be radically reshuffled. For example, once we have self-driving trucks, the power of the Teamsters trucking division goes to zero. I don't think this happens overnight, but seeing these effects on the horizon could significantly alter the relevant public choice constraints.

Of course, I could be totally wrong. I just think next year is when shit starts to get real, so get your policy ideas shovel-ready as quickly as possible.
