Politics

Ted Cruz Attempts to Exempt Big Tech From the Law

By Dylan Gyauch-Lewis

The Revolving Door Project, a Prospect partner, scrutinizes the executive branch and presidential power. Follow them at therevolvingdoorproject.org.

Until the assassination of archconservative pundit and activist Charlie Kirk that afternoon, September 10 was shaping up to be all about artificial intelligence and the federal government’s increasingly maximalist approach to its development and deployment.

That morning, Sen. Ted Cruz (R-TX), chair of the Senate Committee on Commerce, Science, and Transportation, introduced legislation that could allow the White House to unilaterally exempt AI companies from nearly any federal regulation.

The bill, called the SANDBOX Act—standing for “Strengthening Artificial Intelligence Normalization and Diffusion by Oversight and eXperimentation Act,” a clunker of an acronym even by congressional standards—would require the director of the White House Office of Science and Technology Policy, or OSTP, to create a “regulatory sandbox program” within one year of enactment. That program would allow companies working on artificial-intelligence products to request a “waiver or modification of one or more covered provisions” for an initial period of two years, renewable up to four times for a total of a decade of exemption from federal regulations.

But it wasn’t just the SANDBOX Act. Cruz also introduced his five-pillar “AI Framework,” of which the bill makes up only part of the first pillar. The Senate Subcommittee on Science, Manufacturing, and Competitiveness—part of the Commerce Committee that Cruz chairs—also held a hearing on what Chair Ted Budd (R-NC) termed “the need for accelerated development and deployment of American AI products.” Rounding out the legislative branch’s unofficial AI Day was a House Committee on Natural Resources hearing where three permitting reform measures were discussed, with substantial testimony devoted to building a case against the National Environmental Policy Act (NEPA) because of how it impedes rapid data center construction.

Cruz’s legislation stands out from the pack, though, in its sweeping potential to completely rewrite federal regulation under the guise of a technocratic “sandbox” program. J.B. Branch, Public Citizen’s Big Tech accountability advocate, told me that the way “Cruz is trying to pitch this as sandboxes have been used before, this is normal” ignores that usually “when sandboxes are allowed, they’re for very specific niche regulations.”

Or as Anna Aurilio, campaigns director at Economic Security Project and ESP Action, put it, “This isn’t a sandbox, it’s a litter box.”

Talking Silicon Tacks

While the language used by proponents of AI in the hearings and legislative materials frames the debate around the lodestar of “innovation,” critics are already denouncing SANDBOX as the latest in a string of attempts to insulate Big Tech from public oversight and democratic accountability.

“Big Tech has been pushing, basically, to be exempt from the law,” Aurilio said. She added that Cruz’s legislation is about “letting Big Tech do whatever the heck it wants and getting rid of any regulations that stifle their ability to make a profit.”

In a press release, Public Citizen’s Branch said, “Public safety should never be made optional, but that’s exactly what the SANDBOX Act does.”

Throughout the text, myriad details tilt the scales in favor of Big Tech and deregulation. From the definitions on, the bill is packed with requirements that favor applicants seeking waivers.

Following the creation of the sandbox program, applications may be submitted either by the director of OSTP directly or by other “persons” for a waiver of “one or more covered provisions of an applicable agency in order to test, experiment, or temporarily provide to consumers artificial intelligence products or services or artificial intelligence development methods.”

Of note, there does not appear to be any limit on how many “covered provisions” can be included in an application, nor on the number of applications that a party can file. The term “covered provision” is also defined identically to “rule” in Section 804 of Title 5 of the U.S. Code, which covers nearly all agency regulatory actions. There are just three narrow exceptions: (1) regulations that govern accounting standards, compensation, and firm organization; (2) rules about the “management or personnel” of an agency; and (3) rules that do not “substantially affect the rights or obligations” of organizations outside of the federal government.

Under that definition, everything from consumer safety rules to water quality standards to workplace safety requirements could be waived at the discretion of the executive branch.

Equally broad are the definitions of “artificial intelligence products or services” and “artificial intelligence development methods.” The former encompasses any product or service that “in whole or in part, uses one or more artificial intelligence systems.” The latter covers any “business model or production method that, in whole or in part, uses one or more artificial intelligence systems.”

Branch told me that these definitions are so expansive that the bill could open up attacks on virtually any regulation, saying that, given how thoroughly AI is being integrated into all kinds of products, those definitions could “incorporate almost anything that has a computer component to it.” Everything from cars to refrigerators to toasters is being sold with supposed AI features these days.

As Aurilio noted, the bill puts the onus on opponents of deregulation to affirmatively block deregulation, where normally the status quo enjoys default status.

Following the submission of an application for a waiver or modification, the director of OSTP must send a copy of the application to the head of each agency that oversees one or more of the covered provisions included in the application. From the time of receipt, agency heads have just 90 days, with the possibility of one 30-day extension (although the bill does not require the director of OSTP to grant such a request), to review the application and make a decision. If an agency head does not respond within 90 days (120 if granted an extension), the application is presumed approved.

Even in the best of times, a three-month timeline is daunting for agencies with sweeping mandates. Now, after decades of declining capacity across the administrative state capped off by the reckless cuts of DOGE’s chain saw–wielding boss Elon Musk, such a limit could foreclose serious scrutiny of applications. This is only compounded by how susceptible the permitting process would be to “flooding the zone”; companies could have their legions of lawyers file thousands upon thousands of applications. Anything at the bottom of the stack that can’t be reviewed gets approved by default.

In order to deny the application, or any part of it, an agency head must detail the harms justifying the denial, including an assessment of the probability of those harms occurring, a justification for why a partial approval would not sufficiently mitigate the risks, and recommendations for how those harms could be mitigated. After a denial, the director of OSTP “shall” give the applicant 60 days to amend the application, after which it is sent back to agency heads who now get 60 days to review the updated application.

If the agency head remains unconvinced and would still deny the application, the director of OSTP would be able to unilaterally overrule their determination upon appeal from the applicant (which could be the director themselves) and grant the waiver or modification anyway.

Additionally, once a waiver or modification is granted, it would be posted in the Federal Register along with a streamlined application for any other person to apply for the same regulatory exemption.

Perpetual Motion-to-Proceed Machine

Just as the SANDBOX Act would create a process to grant waivers where the momentum favors those seeking exemption from regulation, it also proposes a streamlined process for permanently repealing nearly any federal rule that is getting in the way of the AI industry.

OSTP’s director is instructed to give an annual report to Congress that details the covered provisions targeted for waivers in applications and a breakdown of acceptances and denials. The report must include summaries of reasons for all denials, and lists of covered provisions the director recommends repealing or amending, with the director’s reasoning for those repeals and amendments.

Following the submission of that report, any member of either chamber could introduce a joint resolution to enact the director’s recommendations. After being sent to the relevant committee in whichever chamber originates the resolution, the committee has only ten legislative days to consider it. If the committee does not “report” (a fancy way of saying make a decision) within that time frame, the resolution is automatically discharged from committee. Starting on the third legislative day following either the committee reporting a recommendation to advance the resolution or a discharge, any member of the chamber can move to bring the resolution to consideration in front of the full body. That motion to proceed would not be subject to debate, could not be blocked by a point of order, could not be postponed, and the vote on it could not be revisited by a motion to reconsider.

Such a joint resolution would be subject to two hours of debate in the House of Representatives. There is no debate time included in the procedure for Senate consideration. Between the total lack of debate time and the motion to proceed not being up for debate, a senator likely could not filibuster the resolution.

Following passage in the chamber where the joint resolution was introduced, it would be sent to the other chamber. Once referred to committee, the committee has just two legislative days to consider the measure before it is automatically discharged. From that point, the process is the same as described above.

Friends in High-Dimensional Places

Even beyond the process’s built-in advantages for technology companies, AI firms have another ace up their sleeve: the man behind the curtain, OSTP director Michael Kratsios.

Kratsios, according to his LinkedIn profile, formerly worked at Thiel Capital, the venture capital firm of Peter Thiel, for the seven years immediately preceding joining the first Trump administration as chief technology officer (later moving to a post as undersecretary of defense). Perhaps best known as co-founder of the defense contractor Palantir, itself a major player in intelligence and defense applications of AI technology, Thiel is an omnipresent figure in the American tech world. He’s also quite a character; he’s currently doing a four-part lecture series on the Antichrist, who he thinks might be the climate activist Greta Thunberg.

Thiel is also a proponent of the “network state,” which champions abolishing government and replacing it with rule by blockchain and tech magnates. His biographer described his politics as “authoritarian.” Thiel has also lamented that libertarianism was set back by women being granted the vote. And on one of his less subtle days, he explicitly said that he wanted to use technology to “unilaterally change the world” by overriding democracy.

To be clear, Kratsios was not just a run-of-the-mill employee at Thiel Capital. At one point, he was Thiel’s chief of staff. And while Kratsios has voiced support for safety regulations to protect children, the OSTP director’s overall sentiment is strongly deregulatory. In an address to an AI summit held by Asia-Pacific Economic Cooperation, Kratsios celebrated that “the Trump Administration is getting the Federal government out of the way of America’s AI innovators.” Kratsios went on to criticize European regulation of AI as a “model of fear and overregulation” that would result in inevitable stagnation and falling behind technologically.

That sentiment is hardly unique to Kratsios. Environmental Protection Agency head Lee Zeldin has said that one of his top goals is the advancement of the AI industry (needless to say, that is not the EPA’s statutory mission). President Trump and Vice President Vance have both touted AI as integral to the future of the American economy. First lady Melania Trump announced she would be heading a new initiative to push teachers and students to participate in an “AI challenge,” saying “[AI] is poised to deliver great value to our careers, families, and communities.”

Billionaires with a stake in the AI industry’s future were also heavily involved in shaping the Trump administration’s staffing and policy. Marc Andreessen famously said in December 2024 that he was spending half of his time at Mar-a-Lago in order to advise the president on economic and tech policy. Blackstone CEO Stephen Schwarzman, whose firm is pouring tens of billions into data centers, including spending $189 million just on outdoor storage space required to service data centers, was being consulted by Commerce Secretary Howard Lutnick on potential administration appointees. xAI owner Elon Musk was famously granted near-carte blanche to bulldoze the executive branch as the head of DOGE and, while he has now departed government, remains highly influential in DOGE’s continuing operations.

In sum, applications for the sandbox program would be sent through a warp-speed process, overseen by an OSTP director who just came from a stint in the industry, sent to AI-booster agency heads, while the highest-profile figures in the administration all extol the technology.

Ghoul in the Machine

On June 17, after months of Musk’s xAI operating methane-burning turbines without a permit to power its data center in South Memphis, the Southern Environmental Law Center and the NAACP announced their intention to sue the AI firm for violations of the Clean Air Act. That is precisely the type of accountability that would be foreclosed by the SANDBOX Act.

If Cruz’s legislation becomes law, xAI could simply request a waiver for the Clean Air Act. If granted, the company’s data centers could then be allowed to spew toxic and carcinogenic emissions without limit or threat of legal action. The administration has already signaled that it would embrace such an approach; the EPA is working to overturn the “endangerment finding” that provides the legal basis for regulating greenhouse gas emissions by establishing that they are a dangerous pollutant.

Similarly, fintech companies that incorporate AI into their apps could use the sandbox program to get near-blanket immunity from security regulations, consumer protection measures, anti–money laundering and “know your customer” requirements, fair lending practices that prohibit racial discrimination, or even capitalization or stress-testing measures. The administration has already put in work to give fintech a free hand by neutering the Consumer Financial Protection Bureau.

Using AI to develop a tool to screen job applicants? Simply get a waiver for the equal opportunity provisions in Title VII of the Civil Rights Act and let the algorithm discriminate against minorities. Joining the rapidly expanding market of mental health chatbots? You can get a waiver for regulations under the Health Insurance Portability and Accountability Act (HIPAA), which could open the door to selling legally protected patient medical data. Making a smart fridge? Get exempted from any of those pesky Consumer Product Safety Commission rules about not hurting people!

According to Public Citizen’s Branch, the SANDBOX Act could also potentially block states from enforcing their regulations on AI companies that have been granted federal waivers. Federal law preempts state law, so Branch said, “I guarantee you these corporations are going to say that if they have a waiver from federal law, they will argue that the state law is not applicable.”

So if this legislation is passed, the words “artificial intelligence” might well become a magic incantation for basically any company to get around nearly any law or regulation.