ABSTRACT. This essay examines the emergence of what a growing body of political-economy scholarship designates the “Proprietary Deep State”—a hybrid governance apparatus in which critical operational capacities of sovereign governments have been substantially delegated to, or supplanted by, a small number of technology principals. Drawing on publicly verifiable data through February 21, 2026, the essay analyses three convergent cases: Palantir Technologies’ deepening integration into US intelligence and enforcement infrastructure, culminating in a five-year, $1 billion blanket purchase agreement with the Department of Homeland Security disclosed on February 20, 2026; the structural consequences of the Department of Government Efficiency (DOGE)—which Elon Musk led from January to May 30, 2025 before departing, leaving behind an institutionalised successor apparatus under OMB Director Russell Vought, while Musk himself faces court-ordered deposition (February 5, 2026) over DOGE’s dismantling of USAID; and Meta Platforms’ acceleration toward ambient AI-embedded wearable computing, with smart-glasses sales having tripled in 2025 and a neural electromyography wristband already commercialised.
Bayesian game-theoretic modelling demonstrates that rational state actors will systematically subsidise private infrastructure monopolies, thereby entrenching rather than correcting the accountability deficit. Oxfam’s January 2026 report documents that global billionaire wealth surged 16 percent in 2025 to a record $18.3 trillion, and that billionaires are now 4,000 times more likely to hold or directly influence political office than ordinary citizens.
I. Introduction: From Shadow Bureaucracy to Proprietary Sovereignty
The concept of the “Deep State” has long denoted a permanent network of career officials, intelligence professionals, and military planners whose influence on policy outlasts any electoral cycle. In its classical formulation the Deep State operated through institutional continuity, classified information, and bureaucratic inertia—always accountable, at least in principle, to democratic processes even when insulated from them. What the 2025–2026 period reveals is a qualitative transformation: the functional capacities of the state—surveillance, infrastructure, fiscal administration, information management—have migrated to private corporate hands on a scale and at a velocity that existing regulatory frameworks were not designed to address.
This essay advances three related claims. First, that the principal locus of state power in the United States has undergone a partial but structurally significant privatisation, with Palantir Technologies, Elon Musk’s constellation of enterprises, and Meta Platforms as its primary loci. Second, that this privatisation has occurred under conditions of acute information asymmetry, producing the strategic dynamics modelled in Section IV. Third, that the legal framework governing relations between sovereign states and these private actors is constitutively inadequate to the present situation. Throughout this essay, the more analytically precise designation “Proprietary Deep State” is employed: a configuration in which the informational, communicative, and enforcement infrastructure of governance is privately owned, algorithmically mediated, and opaque to democratic oversight.
II. The Proprietary Deep State: Three Case Studies in Algorithmic Power
II.i. Palantir Technologies and the Proprietisation of Intelligence Infrastructure
No single development better illustrates the emergence of the Proprietary Deep State than the trajectory of Palantir Technologies. Co-founded in 2003 by Peter Thiel with early seed investment from the CIA’s venture arm, In-Q-Tel, the company has transformed from a niche data-analytics firm into what amounts to an operating system for the United States intelligence and enforcement community. Its expansion accelerated dramatically after January 2025.
Palantir’s Q4 2025 earnings, reported February 2, 2026, revealed that US government revenue grew 66 percent year-on-year to $570 million in a single quarter, with total quarterly revenues of $1.41 billion exceeding all analyst estimates. The company projects full-year 2026 revenues of $7.18–$7.20 billion, representing more than 60 percent annual growth. This expansion is anchored in a series of landmark contracts. In April 2025, ICE awarded Palantir a $30 million agreement to develop ImmigrationOS, a system designed to provide “near real-time visibility” on deportation targets. Palantir subsequently received DISA Impact Level 6 authorisation—the highest DoD certification covering Top Secret information—enabling deployment across military environments without case-by-case approval.
Two disclosures in the days immediately before this date crystallise the governance stakes most acutely. On February 18–19, 2026, Tax Notes—drawing on USASpending.gov data and FOIA-obtained contracts—reported that the IRS has paid Palantir more than $180 million across 26 contracts since 2018, and that Palantir is now building an internal IRS tool to make highly sensitive taxpayer data more accessible across agency personnel. Former Treasury Associate Chief Information Officer Matthew Burton stated: “The greatest risk is capture: when the vendor knows it is indispensable, it exploits its position for gain at the expense of the public.” On February 20, 2026, Palantir secured a five-year, $1 billion blanket purchase agreement with the Department of Homeland Security, covering software licences, maintenance, and implementation services department-wide. Palantir’s remaining performance obligation soared from $2.6 billion in Q3 to $4.2 billion in Q4 2025, suggesting this DHS agreement may already be embedded in financial statements.
The accountability problem is structural, not incidental. When the algorithms used to classify threats, prioritise enforcement targets, manage battlefield AI, and aggregate taxpayer data are proprietary, the classical mechanisms of democratic oversight—legislative hearings, judicial review, inspector-general audits—encounter a constitutive epistemological limit. As Burton stated, government reliance on proprietary data-management systems creates “problems of financial dependency, technical lock-in, and real risks to democratic accountability.” The Electronic Frontier Foundation’s Victoria Noble warned that Palantir’s IRS contract “would create serious privacy risks for taxpayers, who entrust some of their most sensitive personal data to the IRS.”
II.ii. Elon Musk, DOGE, and the Weaponisation of Structural Dependency
The case of Elon Musk and DOGE requires careful periodisation. Musk formally departed his DOGE role on May 30, 2025, after approximately 130 days as a Special Government Employee—the legal maximum for that classification. His departure followed legal setbacks in multiple federal courts, clashes with Treasury Secretary Scott Bessent and OMB Director Russell Vought over the scope and targeting of cuts, a 71 percent collapse in Tesla’s first-quarter profit (revenue down 9 percent), and a severe deterioration in his public approval rating. In a June 2025 interview, Musk characterised his DOGE tenure as “somewhat successful” but said he would not repeat the experience.
Musk’s formal departure did not, however, terminate the structural consequences of DOGE—and it is the post-Musk phase that is analytically most important. The Revolving Door Project’s comprehensive February 2026 report, “DOGE: From Meme to Government Erosion Machine,” documents that DOGE operatives—predominantly young personnel recruited from Musk’s and Peter Thiel’s networks without prior government experience—were converted from special government employees to permanent agency staff, burrowing into federal agencies well after the initiative’s headline phase. OMB Director Russell Vought—identified by the report as the de facto post-Musk DOGE leader—institutionalised the core agenda through the full formal authority of his office: 317,000 federal employees were separated by year-end 2025; USAID, the Corporation for Public Broadcasting, and much of the Education Department were effectively eliminated; and the fiscal year 2026 White House budget includes a $45 million request for a residual DOGE structure of 30 employees, with salaries for a further 120 embedded agency staffers reimbursable through ITOR appropriations. Congressional Republicans passed only a single bill enacting $9 billion in DOGE-related cuts—far below Musk’s original $2 trillion ambition—and Trump officials signalled in early 2026 that no further clawback legislation was likely given the narrow House majority.
As of this date, Musk faces a consequential accountability moment. On February 5, 2026, US District Judge Theodore Chuang ruled that “extraordinary circumstances justify” compelling Musk to be deposed in a lawsuit brought by former USAID employees accusing him of unlawfully directing that agency’s dissolution. The judge found that Musk “likely has personal, first-hand knowledge” of the relevant decisions—even as White House lawyers simultaneously argued Musk had no legal authority. This paradox—maximum operational influence combined with claimed non-accountability—is the defining legal signature of the Proprietary Deep State. Separately, at least 23 of more than 100 traceable DOGE operatives made cuts at agencies regulating sectors in which they had prior financial interests (ProPublica, July 2025), exemplifying the structural conflict-of-interest that characterises the whole enterprise.
Musk’s strategic leverage over national security infrastructure operates along a parallel and currently unconstrained axis. As the owner of Starlink (dominant satellite internet in active conflict zones), X (the primary global real-time public communications platform), and SpaceX (critical launch vehicle for US military and intelligence satellites), he controls communication and logistics nodes that no state can readily replace. The Bayesian model in Section IV operationalises the structural dependency this creates.
II.iii. Meta Platforms and the Frontier of Neurocognitive Governance
The governance challenge posed by Meta Platforms operates at a longer temporal horizon but carries the most consequential long-run implications for democratic autonomy. Its mechanism is the progressive enclosure of the information environment within which political deliberation occurs and, increasingly, the hardware layer that mediates perception itself.
The current state of play is established with precision by Meta’s January 28, 2026 investor call and CES 2026 disclosures (January 7, 2026). Meta’s apps reached 3.58 billion daily active users in December 2025, including more than 2 billion daily actives each on Facebook and WhatsApp. Full-year 2025 revenue was $201 billion. Zuckerberg described 2026 as “a year where this wave accelerates even further,” naming “personal superintelligence” as Meta’s central product objective and specifying that the company’s competitive advantage resides in “unique context”: access to users’ “history, interests, content and relationships”—the most comprehensive personal data ecosystem in history. Reality Labs, which oversees hardware, posted a $19.2 billion loss in 2025, which Zuckerberg characterised as likely the “peak” before gradual improvement. Meta has invested more than $50 billion in Reality Labs since 2020 and plans $100 billion in capital expenditure in 2026.
The hardware layer is where governance concerns become most acute. At Meta Connect 2025 (September 2025), Meta unveiled the Ray-Ban Meta Display Glasses ($799, shipping September 30), the Oakley Meta Vanguard sport glasses ($499), and—most consequentially—the Neural Band, an electromyography wristband that reads peripheral nervous system signals to interpret finger and hand gestures as device inputs, enabling navigation and text input without touching any screen. Smart-glasses sales tripled in 2025. At CES 2026, Neural Band technology was demonstrated in a Garmin partnership for automotive integration. A next-generation smartwatch codenamed “Malibu 2,” scheduled for 2026, is expected to absorb the neural wristband’s function. Future versions of the glasses are described by Zuckerberg as devices that will “see what you see, hear what you hear, talk to you and help you as you go about your day.”
The privacy and governance implications of ambient neural-interface consumer devices are not hypothetical. Researcher Nita Farahany has documented that existing electromyography devices have been used in laboratory conditions to extract financially sensitive information—including PIN codes—without users’ conscious awareness. Stanford Law School’s Law and Biosciences Blog characterised neural data as raising “particularly pressing privacy concerns given their ability to monitor, decode, and manipulate brain activity” (2024). Colorado and California have enacted neural-data protection statutes; no federal framework exists and none is before Congress. A January 2026 court filing in federal proceedings against Meta revealed that internal AI chatbot safety deliberations involved direct executive-level input from Zuckerberg, with child safety advocates citing the case as evidence that safeguards must be “proactive and embedded into system design.” Oxfam’s January 2026 report notes that billionaires now own more than half of the world’s largest media companies and all major social media platforms—a concentration whose implications for political deliberation are compounded, not ameliorated, by the move into wearable computing.
III. Socioeconomic Divergence and the Rent-Extraction Model of Power
The economic data for 2025–2026 provide empirical grounding for the structural claims in Section II. Oxfam’s January 2026 report documents that global billionaire wealth surged by more than 16 percent in 2025—three times the annual average of the preceding five years—to a record $18.3 trillion, an 81 percent increase since 2020. The number of billionaires exceeded 3,000 for the first time. In October 2025, Musk became the first person in recorded history to accumulate personal wealth exceeding half a trillion dollars; by year-end his fortune stood at approximately $700 billion. The fifteen wealthiest Americans collectively gained $747 billion (31 percent) in 2025—in the same calendar year that the administration in which Musk held executive authority dismantled the regulatory frameworks nominally responsible for constraining such concentration.
The inverse correlation between elite wealth accumulation and human welfare outcomes is stark. One in four people globally faced food insecurity as of early 2026, a severe deterioration from the one-in-nine ratio documented in the preceding period. The $2.5 trillion in billionaire wealth gains recorded in 2025 would, by Oxfam’s calculations, have been sufficient to eradicate extreme global poverty 26 times over. Boston University Professor Brooke Nichols estimated that DOGE-driven USAID cuts had resulted in approximately 793,900 deaths by February 5, 2026—predominantly children—with projections of 14 million additional deaths by 2030 absent a reversal. Tesla shareholders approved a $1 trillion pay package for Musk the same day this mortality figure was publicly reported.
The translation of economic concentration into political power is quantified with unusual precision by Oxfam’s 2026 methodology. Billionaires are now 4,000 times more likely to hold or directly influence political office than ordinary citizens—a quadrupling of the 1,000:1 ratio documented in 2024. In the 2024 US electoral cycle, one in every six dollars contributed came from just 100 billionaire families, with Musk alone spending more than $290 million in campaign contributions. The structural dynamic at work is rent-extraction: control over digital infrastructure, data platforms, and government contract monopolies enables the continuous appropriation of value from public commons without commensurate productive contribution. The “Deep State” has not been dismantled; it has been privatised and made overt.
IV. Bayesian Game Theory: Strategic Behaviour under Information Asymmetry
The Bayesian game-theoretic framework is particularly apt for analysing the Proprietary Deep State because its defining characteristic is information asymmetry: the state does not know the true motivations and capabilities of its private partners, and private actors exploit this uncertainty strategically. Two scenarios merit formal treatment.
IV.i. The Blackout Gambit: Infrastructure Dependency and Systematic Over-Subsidy
Consider a two-player game between a Sovereign State (Player 1) and a Technology Principal (Player 2). The State’s uncertainty concerns the Principal’s type: with probability P = 0.6, the Principal is a State-Aligned Actor who will maintain infrastructure access in exchange for adequate compensation; with probability 1−P = 0.4, the Principal is an Independent Sovereign who will selectively restrict access to maximise private geopolitical or commercial gain.
In any active conflict zone where Starlink provides primary communications—as in Ukraine and several other active theatres as of February 2026—the Principal decides whether to maintain or restrict satellite access. The Independent Sovereign type can restrict access to extract a bilateral deal with a third party, additional public subsidy, or ideological compliance. The State cannot observe the Principal’s type and must act under uncertainty with existentially high stakes.
The Bayesian Nash Equilibrium is systematic over-subsidy. Because the expected cost of a communications blackout in a conflict zone is existentially high, the State’s dominant strategy is to offer compensation sufficient to retain alignment even from an Independent Sovereign type. Since this exceeds what a State-Aligned Actor requires, the equilibrium involves persistent transfer of public resources to private hands irrespective of actual intentions. The DOGE episode represents an acute variant: Musk moved inside the State’s payoff function entirely, exercising administrative authority over agencies responsible for his companies’ contracts while those contracts were awarded or renewed, converting dependency into direct governance leverage.
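The over-subsidy logic can be made concrete with a minimal numerical sketch. The prior P = 0.6 is taken from the setup above; the cost and blackout magnitudes are hypothetical illustrative values, not figures from any source. The sketch shows that whenever the blackout cost is large relative to the subsidy gap, paying the Independent Sovereign’s price dominates, exactly as the equilibrium argument claims.

```python
# Illustrative model of the Blackout Gambit. Prior P = 0.6 follows the
# essay's setup; all cost magnitudes below are hypothetical (normalised units).

P_ALIGNED = 0.6          # prior that the Principal is State-Aligned
COST_ALIGNED = 1.0       # subsidy a State-Aligned Actor requires
COST_INDEPENDENT = 5.0   # subsidy needed to retain an Independent Sovereign
BLACKOUT_COST = 100.0    # expected cost of a conflict-zone blackout

def expected_loss(subsidy: float) -> float:
    """State's expected loss from offering a given subsidy level."""
    if subsidy >= COST_INDEPENDENT:
        return subsidy                      # both types maintain access
    if subsidy >= COST_ALIGNED:
        # only the State-Aligned type maintains access
        return subsidy + (1 - P_ALIGNED) * BLACKOUT_COST
    return BLACKOUT_COST                    # neither type maintains access

low_offer = expected_loss(COST_ALIGNED)       # 1.0 + 0.4 * 100 = 41.0
high_offer = expected_loss(COST_INDEPENDENT)  # 5.0

# Over-subsidy dominates whenever BLACKOUT_COST exceeds
# (COST_INDEPENDENT - COST_ALIGNED) / (1 - P_ALIGNED).
assert high_offer < low_offer
```

Under these assumed parameters the State rationally pays five times what a State-Aligned Actor would require; the qualitative result is insensitive to the specific numbers so long as the blackout cost is existentially large.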
IV.ii. The Algorithmic Coup: Platform Bias and the Fragmentation of Democratic Epistemology
A second game involves the Electorate (Player 1) and a Platform Owner (Player 2) during an election cycle. The Electorate’s uncertainty concerns the algorithmic information environment: with probability P = 0.3, the platform is informationally neutral; with probability 1−P = 0.7, it systematically advantages certain electoral outcomes. The high prior for bias reflects documented political interventions by X under Musk’s ownership in the 2024 cycle and the broader pattern of algorithmic opacity.
Because the prior probability of bias is 0.7, the rational Bayesian voter observing any anomalous information distribution updates toward the bias hypothesis. The equilibrium result is generalised epistemic distrust: voters systematically discount platform-mediated information but, lacking a credible alternative at comparable scale, cannot replace it. This produces social fragmentation favourable to the Platform Owner, because a fragmented electorate is less capable of generating the collective mandate for regulatory action that would constrain platform power. The Platform Owner benefits from distrust of the platform so long as it remains the primary information intermediary—a condition that Meta’s 3.58 billion daily active users ensure will persist. The structural analogy to Palantir’s position in intelligence is exact: in both cases the private actor controls indispensable infrastructure, its proprietary nature prevents verification of behaviour, and equilibrium involves systematic subsidy or tolerance in exchange for access.
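The updating dynamic can be sketched with a short Bayes’ rule calculation. The 0.3/0.7 prior comes from the setup above; the likelihoods of observing an anomalous information distribution under each hypothesis are assumed illustrative values. The point is that under a 0.7 prior, even weakly diagnostic anomalies push the voter’s belief in bias toward near-certainty after a handful of observations.

```python
# Illustrative Bayesian update for the Algorithmic Coup game. The prior
# of 0.7 follows the essay's setup; the two likelihoods are assumptions.

PRIOR_BIAS = 0.7         # prior probability the platform is biased
P_ANOM_IF_BIASED = 0.8   # assumed P(anomaly | biased)
P_ANOM_IF_NEUTRAL = 0.2  # assumed P(anomaly | neutral)

def posterior_bias(prior: float) -> float:
    """Posterior P(biased) after observing one anomaly, via Bayes' rule."""
    numerator = prior * P_ANOM_IF_BIASED
    denominator = numerator + (1 - prior) * P_ANOM_IF_NEUTRAL
    return numerator / denominator

belief = PRIOR_BIAS
for _ in range(3):  # each successive anomaly hardens the bias hypothesis
    belief = posterior_bias(belief)

# One anomaly already lifts belief from 0.70 to roughly 0.90;
# three anomalies push it above 0.99.
assert belief > 0.99
```

The asymmetry is the analytically important feature: because the prior already favours bias, observations can only slowly rehabilitate trust but rapidly destroy it, which is precisely the generalised epistemic distrust the equilibrium predicts.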
V. The Crisis of Legal Extraterritoriality and the Limits of the Westphalian Framework
The emergence of Sovereign Billionaires—private actors exercising foreign-policy-equivalent power across state boundaries—creates a systemic crisis for the Westphalian framework. The 1648 Peace of Westphalia established that states, as the primary legal persons of international relations, are exclusively entitled to exercise coercive and regulatory power within their territorial jurisdiction. This framework has never fully accommodated powerful private actors, but the scale of the current phenomenon is historically unprecedented.
When Musk restricted Starlink access to Ukrainian forces during active combat operations in the Crimea corridor in 2023—acknowledged in Walter Isaacson’s biography—he exercised a power that international law reserves exclusively to state agents. When Palantir’s proprietary algorithms govern US border enforcement with classified outputs no external auditor can review, sovereign authority is not violated but circumvented. When DOGE operatives seized access to Treasury payment systems managing trillions of dollars without congressional authorisation, constitutional norms were dissolved from within rather than overridden from without.
The court-ordered deposition of Musk over DOGE’s USAID actions (February 5, 2026) illustrates both that private actors exercising state functions can in principle be held accountable through domestic law, and that the mechanism is slow, contested, and deeply uncertain in outcome. The G7 must confront the possibility that existing international law’s exclusive focus on state actors renders it constitutively inadequate to the present situation. Priority doctrinal developments include: extending international human rights obligations to private actors exercising state-equivalent functions (building on the UN Guiding Principles on Business and Human Rights); establishing mandatory interoperability and algorithmic auditability standards as preconditions for market access in democratic jurisdictions; and elaborating an “infrastructure sovereignty” doctrine—the principle that critical communications, data, AI, and enforcement infrastructure must be democratically accountable regardless of nominal ownership.
VI. Structural Governance Recommendations for G7 Members
Conventional regulatory approaches—sector-specific oversight, antitrust enforcement, or voluntary codes of conduct—are structurally insufficient to the present challenge. The Bayesian Nash Equilibria in Section IV generate systematic incentives for states to subsidise rather than constrain private infrastructure monopolies. Correcting these equilibria requires structural interventions that alter underlying payoff functions.
Four priority areas warrant immediate collective action. First, mandatory algorithmic auditability: as a condition of government contracting and domestic platform operation, AI systems deployed in enforcement, electoral, or critical infrastructure contexts should be auditable by independent technical bodies with appropriate security clearance. The current situation—in which the classification of threats and prioritisation of enforcement are determined by Palantir’s proprietary algorithms while its co-founder maintains close executive branch relationships—is incompatible with the rule of law.
Second, infrastructure sovereignty legislation: a common G7 framework should establish that critical communications infrastructure—including satellite internet systems, AI backbone platforms, and social media platforms above defined reach thresholds—cannot be controlled by a single private actor without robust public accountability mechanisms. The framework should include mandatory redundancy requirements, interoperability obligations, and emergency access provisions exercisable by democratic governments. Starlink’s operational role in active conflict zones makes this a collective security matter.
Third, neural and cognitive data governance: the commercialisation of electromyography wristbands, AI-embedded smart glasses, and next-generation neural interfaces in the absence of any federal framework represents an acute governance failure. G7 members should classify neural and cognitive biometric data alongside the most sensitive health data, prohibit commercial monetisation without affirmative ongoing consent, require pre-market regulatory clearance for consumer neurotechnology products, and implement the UNESCO Recommendation on the Ethics of Neurotechnology through binding domestic legislation.
Fourth, structural conflict-of-interest prohibitions: the DOGE episode demonstrates that existing ethics rules for special government employees are constitutively inadequate when private actors hold tens of billions in government contracts and simultaneously administer the agencies managing those contracts. G7 members should consider mandatory divestiture or blind trust requirements for any private actor exercising de facto governmental authority, analogous to requirements applied to elected officials, with enforcement mechanisms independent of executive branch cooperation.
VII. Conclusion
The Proprietary Deep State is not a conspiracy theory but an institutional description of a structural condition. The convergence of verifiable empirical developments as of February 22, 2026—Palantir’s newly disclosed $1 billion DHS agreement, its IRS and ICE enforcement contracts, its Top Secret DISA authorisation; the DOGE apparatus institutionalised under Russell Vought long after Musk’s May 2025 departure, with Musk now facing court-ordered deposition; Meta’s $201 billion revenue base, tripling smart-glasses sales, and commercialised neural wristband; and Oxfam’s documentation of a 4,000:1 political influence asymmetry—reveals a structural transformation in the locus of state power. The Deep State has not been dismantled. It has been privatised, substantially enlarged, and made overt.
The Bayesian analysis demonstrates that this transformation is not merely a political choice reversible by electoral means. Structural dependency on private infrastructure monopolies generates equilibria that systematically favour continued privatisation, and the actors who would be most constrained by corrective regulation are best positioned to prevent it. What democratic states cannot afford is the assumption that the problem is self-correcting or that the existing framework is adequate to a situation it was not designed to govern.
References
Brookings Institution. (2025, February). The implications of a USAID shutdown. https://www.brookings.edu/articles/what-comes-after-a-usaid-shutdown/
Economic Policy Institute. (2025). Trump is enabling Musk and DOGE to flout conflicts of interest. https://www.epi.org/publication/trump-is-enabling-musk-and-doge-to-flout-conflicts-of-interest
Farahany, N. (2023). The battle for your brain: Defending the right to think freely in the age of neurotechnology. St. Martin’s Press.
FedSavvy Strategies. (2025, August). Palantir federal growth: AI expansion, Army Enterprise Agreement, and implications for Golden Dome for America. https://www.fedsavvystrategies.com/palantir-federal/
Gerhardt, M. (2025, February). Constitutional commentary on DOGE authority. As cited in Al Jazeera, “Do Elon Musk and DOGE have power to close US government agencies?”
Immigration Policy Tracking Project. (2025). ImmigrationOS: Palantir ICE surveillance platform. https://immpolicytracking.org/policies/reported-palantir-awarded-30-million-to-build-immigrationos-surveillance-platform-for-ice/
Malinowski, H. et al. (2025). DOGE and the future of US foreign aid. Survival: Global Politics and Strategy, 67(3).
NPR. (2025, June 25). Elon Musk may be gone but DOGE isn’t done remaking the federal government. https://www.npr.org/2025/06/16/nx-s1-5431926/doge-future-elon-musk
Oxfam International. (2026, January). Resisting the rule of the rich: Protecting freedom from billionaire power. Oxfam International. https://www.oxfam.org/en/press-releases/billionaire-wealth-jumps-three-times-faster-2025-highest-peak-ever-sparking
Oxfam International. (2026, January). Methodology note: Resisting the rule of the rich. https://www.oxfamfrance.org/app/uploads/2026/01/Oxfam_Methodology-Note.pdf
Palantir Technologies. (2025). Q3 2025 earnings release. InsiderFinance coverage, https://www.insiderfinance.io/news/palantir-stock-faces-government-contract-risk-amid-ai-growth
ProPublica. (2025, July). Tracking DOGE: More than 100 members identified. ProPublica.
Revolving Door Project. (2026, February). DOGE: From meme to government erosion machine. https://therevolvingdoorproject.org/doge-musk-vought-government-cuts-civil-service/
Robertson, A., & Corfield, G. (2025, September). Meta Connect 2025: Smart glasses, neural bands, and the future of personal AI. TechCrier. https://www.techcrier.com/2025/09/meta-connect-2025-smart-glasses-neural.html
ScienceDirect. (2024). Beyond neural data: Cognitive biometrics and mental privacy. Cell Press / ScienceDirect. https://www.sciencedirect.com/science/article/pii/S0896627324006524
Stanford Law School. (2024, December). What are neural data? An invitation to flexible regulatory implementation. Law and Biosciences Blog. https://law.stanford.edu/2024/12/02/what-are-neural-data-an-invitation-to-flexible-regulatory-implementation/
The Hill. (2026, January). Palantir courts major federal contracts—and controversy—in Trump era. https://thehill.com/policy/technology/5667232-palantir-trump-administration-surveillance/
The Record. (2025, October). What brain privacy will look like in the age of neurotech. Recorded Future News. https://therecord.media/what-brain-privacy-will-look-like
United Nations. (2011). Guiding principles on business and human rights. Office of the UN High Commissioner for Human Rights.
UNESCO. (2024). Draft recommendation on the ethics of neurotechnology. UNESCO.
USASpending.gov. (2025). Federal contract obligations to Palantir Technologies Inc. https://www.usaspending.gov
World Socialist Web Site. (2026, January). Oxfam report shows billionaire wealth surged in 2025. https://www.wsws.org/en/articles/2026/01/21/zpqh-j21.html
World Economic Forum / Oxfam. (2025, January). Takers not makers: The unjust poverty and unearned wealth of colonial inheritance. Davos Executive Summary. https://oi-files-d8-prod.s3.eu-west-2.amazonaws.com/s3fs-public/2025-01/English%20-%20Davos%20Executive%20Summary%202025.pdf