Trump Stacks His AI Policy Council With Zuckerberg, Jensen Huang, and Ellison — Regulation Just Got Captured
Trump's 24-member AI policy council co-chaired by David Sacks features Mark Zuckerberg, Jensen Huang, and Larry Ellison. The foxes are officially guarding the henhouse. What this means for AI regulation, export controls, and safety.
On March 25, 2026, the Trump White House announced the reconstituted President's Council of Advisors on Science and Technology (PCAST) — and the roster reads less like an independent advisory body than a who's-who of companies that stand to gain the most from favorable AI policy. Mark Zuckerberg. Jensen Huang. Larry Ellison. Marc Andreessen. The people who will be regulated are now the people writing the framework for regulation.
This is not a conspiracy theory. It is the explicit composition of a formal White House advisory council. And understanding what it means — for AI safety standards, semiconductor export controls, compute access, and the future of American AI governance — requires looking squarely at who is sitting in the room.
What You Will Learn
- The foxes are guarding the henhouse — what the council actually is
- Full member breakdown — who made the list and why
- David Sacks as co-chair — the crypto angle and what it signals
- The Big Three: Zuckerberg, Huang, and Ellison's specific interests
- What they will actually influence — export controls, safety, compute
- Regulatory capture, defined and demonstrated
- The EU contrast — adversarial vs. captured regulation
- Historical precedent — when tech advisory councils go wrong
- What's missing — who is NOT on the council
- Conclusion — what comes next
The Foxes Guard the Henhouse
The President's Council of Advisors on Science and Technology — PCAST — has existed in various forms since the Eisenhower administration. It has historically featured a mix of academic scientists, independent technologists, and occasionally industry representatives. The key word is mix. The idea was always to triangulate interests: industry knows what is buildable, academia knows what is dangerous, and government synthesizes.
Trump's reconstituted PCAST does away with that triangulation. The White House announcement describes the council's mandate as addressing "the opportunities and challenges that emerging technologies present to the American workforce, and ensuring all Americans thrive in the Golden Age of Innovation." That is a remarkably commercial framing for what is supposed to be a science and technology advisory body.
The council will focus on AI, advanced semiconductors, quantum computing, and nuclear power. These are not abstract research domains. These are industries worth trillions of dollars — industries where the companies represented on this council operate and compete. Nvidia dominates AI semiconductor supply. Meta is among the world's largest deployers of AI models. Oracle has staked its cloud future on AI infrastructure. Andreessen Horowitz has invested hundreds of millions into AI startups.
Near-term, the council is expected to shape Trump's national AI framework — the successor to Biden's Executive Order on AI that Trump revoked on day one of his second term. The framework will set the tone for how the U.S. approaches AI safety, model evaluation requirements, export controls on chips, and how much latitude companies have to self-regulate. The people advising on that framework will, in every material case, benefit from weaker requirements and broader self-regulatory latitude.
Who Is On the Council
The White House named two co-chairs and 13 initial members, and indicated the council may expand to 24 seats in total. Here is the current roster and the most relevant lens through which to read each appointment:
Co-Chairs:
- David Sacks — White House AI and crypto czar, venture capitalist, former PayPal
- Michael Kratsios — White House Office of Science and Technology Policy director
Members:
- Marc Andreessen — Co-founder, Andreessen Horowitz (a16z), one of the most aggressive pro-deregulation voices in Silicon Valley
- Sergey Brin — Google co-founder; Google DeepMind is a direct competitor in the foundation model race
- Safra Catz — Oracle executive vice chair; Oracle is building AI cloud infrastructure and has a $500B Stargate commitment
- Michael Dell — Dell Technologies CEO; major AI server hardware player
- Jacob DeWitte — CEO of Oklo, a nuclear startup backed in part by OpenAI's Sam Altman
- Fred Ehrsam — Co-founder of Coinbase; crypto-adjacent, not core AI
- Larry Ellison — Oracle executive chairman; personally committed to massive AI infrastructure investment
- David Friedberg — Entrepreneur and investor, All-In podcast regular
- Jensen Huang — Nvidia CEO; controls the dominant AI chip supply chain
- John Martinis — Quantum computing physicist, former Google; one of the few pure scientists on the list
- Bob Mumgaard — CEO, Commonwealth Fusion Systems; nuclear energy
- Lisa Su — AMD CEO; Nvidia's primary semiconductor competitor
- Mark Zuckerberg — Meta CEO; the world's largest open-source AI model deployer
Thirteen members, twelve of them active technology executives or investors with direct financial stakes in AI policy outcomes. One physicist. That ratio matters.
David Sacks as Co-Chair
David Sacks stepping into the PCAST co-chair role represents both an elevation and, arguably, a demotion. As Trump's AI and crypto czar, Sacks wielded informal but significant influence over executive branch technology posture. Moving into a formal advisory council role puts him further from the operational levers — he is no longer in the room where decisions get made, but he remains in the room where frameworks get drafted.
Sacks is a PayPal mafia veteran, a prominent venture capitalist through Craft Ventures, and a co-host of the All-In podcast — one of the most listened-to shows in Silicon Valley. His ideological commitments are well documented: he is vocally pro-crypto, skeptical of regulatory overreach, and aligned with the view that American AI dominance requires moving fast and constraining government interference.
That worldview will now formally co-chair the body advising the President on AI policy.
The crypto angle deserves more scrutiny than it typically receives in AI coverage. Several PCAST members — Fred Ehrsam (Coinbase co-founder), David Friedberg (All-In regular), and Sacks himself — have deep crypto roots. The convergence of crypto and AI policy is not accidental. Both communities share a philosophical opposition to centralized oversight. Both communities benefit from regulatory ambiguity. And the Trump administration has made the alignment of crypto-friendly and AI-friendly policy a deliberate political project.
What Sacks brings to PCAST is not primarily technical expertise. It is a coherent and well-resourced ideological perspective — one that will shape how the council frames its recommendations before a single expert report is written.
The Big Three Interests
Three names on the PCAST roster command particular attention because their companies' interests are most directly implicated in AI policy outcomes.
Mark Zuckerberg — Meta has made open-source AI its central competitive differentiator. The Llama model family is Meta's argument that AI should be free, open, and unencumbered by the kind of safety guardrails that closed-model competitors like OpenAI and Anthropic apply. Zuckerberg has publicly and repeatedly argued against AI safety regulation, framed open-source AI as a national security asset, and lobbied against EU AI Act compliance requirements. His presence on PCAST creates a direct line between Meta's regulatory interests and White House AI policy.
Jensen Huang — Nvidia does not build AI models. It builds the hardware those models run on. But semiconductor export controls are existential for Nvidia's business. Biden-era chip export restrictions — limiting H100 and A100 sales to China — cut into Nvidia's addressable market by hundreds of billions of dollars. Huang has been vocal about the economic cost of export controls. His presence on PCAST means the world's dominant AI chip supplier has a formal advisory role in shaping the export control regime that governs its own products.
Larry Ellison — Oracle has committed $500 billion to U.S. AI infrastructure through the Stargate initiative alongside SoftBank and OpenAI. Ellison's personal vision — he described it publicly as building AI systems capable of "continuously monitoring everybody" — aligns with the surveillance and defense applications of AI that require government contracts and favorable procurement policy. Oracle's AI business depends on federal cloud contracts. Ellison's advisory role is, in effect, a seat at the table where those contracts are shaped.
Each of these three executives has a multi-billion-dollar financial interest in PCAST's policy outputs. None is required to recuse himself from discussions that directly affect his company's interests.
What They Will Influence
The concrete policy domains where PCAST will operate are not abstract. They include:
Export controls on AI chips. The Biden administration's tiered export control framework restricted Nvidia H100 and equivalent chips from reaching China and several other countries. Trump has already signaled interest in loosening these restrictions. With Jensen Huang on the advisory council, the policy review process for export controls will include the direct testimony of the CEO whose quarterly earnings are most affected by those controls.
AI safety evaluation requirements. Biden's AI Executive Order required frontier model developers to share safety evaluations with the government before deployment. Trump revoked that order. PCAST will advise on what, if any, mandatory safety evaluation framework replaces it. The companies most burdened by such requirements — Meta, Google, and their peers — are represented on the council. Independent AI safety researchers are not.
Compute access and antitrust. The concentration of AI compute in Nvidia's supply chain is among the most significant structural risks in the AI ecosystem. PCAST will advise on semiconductor policy, which includes whether and how to address that concentration. Nvidia's CEO is on the council. AMD's CEO — Nvidia's primary competitor — is also on the council, which creates a different but equally complex conflict.
Federal AI procurement. The U.S. government is one of the largest potential customers for AI systems. PCAST's recommendations on federal AI adoption will shape which vendors win contracts worth billions. Oracle's Ellison and Dell's Michael Dell both run companies with active federal AI sales operations.
For a fuller picture of the legislative context this council is operating in, see our coverage of the US Federal AI Act and Senate preemption of state laws and Trump's federal preemption of state AI laws including Colorado's.
Regulatory Capture Concern
Regulatory capture — the process by which regulated industries come to dominate the regulatory bodies meant to oversee them — is a well-documented phenomenon in political economy. The FCC was captured by telecoms. The FDA has revolving door problems with pharma. The financial industry shaped Dodd-Frank from the inside.
AI policy is being captured before the regulatory infrastructure even exists.
The mechanism here is not subtle. PCAST does not have direct regulatory authority — it advises the President and OSTP, which in turn shapes executive orders, agency guidance, and legislative priorities. But advisory bodies set the conceptual terms of policy debates. They determine which questions get asked, which trade-offs get surfaced, and which frameworks become the default assumptions that agencies and Congress later work within.
A PCAST composed primarily of AI industry executives will not produce recommendations that prioritize independent safety evaluation over deployment speed. It will not recommend export controls that constrain its members' revenue. It will not produce analysis that challenges the concentration of AI infrastructure in a handful of companies — because those companies are in the room.
The absence of independent AI safety researchers, academic ethicists, labor economists, civil society representatives, or consumer advocates from the council's initial composition is not an oversight. It is a design choice.
The EU Contrast
The contrast with Europe's approach is stark and instructive — not because the EU AI Act is without problems, but because it represents a structurally different model for technology governance.
The EU AI Act was drafted with input from a multi-stakeholder process that included civil society organizations, academic institutions, and public consultations. Industry representatives participated — but they did not chair the process. The resulting legislation is imperfect and compliance-heavy, but it emerged from a process that was not structurally dominated by the parties with the most to gain from weak rules.
The EU's preparatory obligations under the AI Act are already imposing compliance costs that U.S. companies operating in Europe must absorb. Trump's PCAST will advise on whether the U.S. builds a competing framework of its own or simply declines to match the EU's model.
With the current council composition, the answer is predictable. The argument will be framed as competitiveness — that mandatory safety requirements would hamper American AI development and cede ground to China. It is an argument that conveniently aligns with the financial interests of every major company represented on the council.
Europe is not getting AI governance right in every dimension. But it is at least attempting to do so from a structural position that does not begin by handing the pen to the industry being governed.
Historical Precedent
This is not the first time a presidential advisory council has been dominated by the industries it is meant to inform. The pattern is familiar enough that it has a name — the revolving door — and a long history.
The Obama-era President's Council on Jobs and Competitiveness, chaired by GE's Jeff Immelt, drew criticism when GE's own tax avoidance and offshoring practices contrasted with the council's stated focus on American job creation. The council was eventually disbanded without significant policy outcomes.
Bush-era energy advisory panels were stocked with oil executives during the drafting of the 2005 Energy Policy Act — a bill that included billions in subsidies for the oil and gas industry.
The Clinton administration's technology advisory council was more academically balanced but was still criticized for insufficient independence from industry interests.
What distinguishes Trump's PCAST is the speed and completeness of the capture. Previous administrations at least maintained the appearance of balanced advisory composition. The 2026 PCAST doesn't bother. The White House announcement frames the appointments as a feature — a sign of how seriously the administration takes American tech competitiveness. The foxes are not just in the henhouse. They are being celebrated for showing up.
What's Missing
Who is not on the council matters as much as who is.
Sam Altman — OpenAI's CEO is conspicuously absent. Altman has been perhaps the highest-profile advocate for AI safety regulation in Silicon Valley, calling for licensing regimes and regulatory oversight. His absence may reflect the fraying of his relationship with the administration following Musk-aligned criticism of OpenAI's structure, or simply that his regulatory positions are inconvenient for a council oriented toward deregulation.
Elon Musk — The omission of the administration's most prominent tech ally is notable. Fortune's reporting suggests Musk was excluded, possibly due to his ongoing conflicts with other tech leaders, including Zuckerberg, or his formal DOGE role creating a structural conflict.
Independent AI safety researchers — Geoffrey Hinton, Yoshua Bengio, Stuart Russell, or any of the prominent researchers who have published substantively on AI risk are absent. The AI safety research community — which spans academia, nonprofit research labs, and independent institutes — has zero representation.
Labor economists — AI's most immediate near-term impact may be on employment. No economist focused on labor market impacts sits on the council.
Civil liberties and privacy advocates — AI surveillance capabilities are expanding rapidly. No civil society voice on data rights, facial recognition, or government use of AI appears on the current list.
Antitrust experts — The concentration of AI infrastructure raises legitimate antitrust concerns. No competition economist or antitrust scholar is included.
The council's 24-member cap means additional appointments are coming. It would be a significant signal if any of these voices made it into the remaining slots. Based on the current direction, that signal seems unlikely to arrive.
Conclusion
The Trump administration's reconstituted PCAST is, on its face, an impressive gathering of technology industry power. It is also a case study in how advisory capture works in practice — quietly, formally, with press releases and official appointments, without a single nefarious actor, simply by stacking the room with people whose financial interests align with the policy outcomes the administration already prefers.
The council will produce recommendations. Those recommendations will carry the weight of names like Zuckerberg, Huang, and Ellison. They will be framed as expert consensus. And they will almost certainly push in the direction of lighter safety requirements, loosened export controls, and an AI governance framework built around industry self-regulation.
Whether that is the right approach to governing one of the most consequential technologies in human history is a legitimate debate. The problem is that this council is not structured to have that debate. It is structured to produce one side of it.
The national AI framework that emerges from PCAST's work will shape what U.S. AI governance looks like for years — possibly decades. It will influence the international norms that other countries reference when building their own frameworks. It will determine whether the U.S. has any meaningful safety floor on frontier AI development or whether that question gets deferred indefinitely in the name of competitiveness.
Those are not questions best answered by the people who profit most from the answers.
Sources: White House PCAST Announcement | Fortune: Trump appoints Zuckerberg, Huang, Ellison | TechCrunch: David Sacks is done as AI czar | Bloomberg: Trump Taps Zuckerberg, Andreessen, Huang | Deadline: Trump Names Ellison, Zuckerberg to Tech Advisory Council