
In January 2025, OpenAI announced its Stargate initiative – a multibillion-dollar infrastructure investment, beginning with a supercomputing campus in Abilene, Texas. Shortly after, the company launched “OpenAI for Countries”, a parallel program aimed at helping nations build their own AI infrastructure in alignment with so-called democratic AI principles.
While framed as a supportive gesture, the OpenAI for Countries initiative raises critical questions about digital sovereignty, particularly for countries in the Global South. Will Stargate partnerships enable these nations to shape their AI futures, or will they deepen dependencies on U.S.-centered tech governance?
Let’s have a critical look. The video below explains what exactly Project Stargate is.
OpenAI for Countries and the Promise of Sovereignty
OpenAI pledges to help countries develop secure, localized data centers that support data sovereignty and foster national AI ecosystems. According to the plan, this infrastructure, delivered through the OpenAI for Countries program, would allow states to customize AI models to reflect cultural and linguistic nuances, ostensibly supporting SDG 9.1 and SDG 9.a on resilient infrastructure and international tech cooperation. Additionally, national startup funds co-financed by OpenAI are meant to seed local innovation.
However, the Stargate model is deeply linked with U.S. strategic interests. OpenAI states it will coordinate all such efforts with the U.S. government, raising concerns over whether the infrastructure truly supports autonomy or merely relocates control centers.
As of April 2025, pilot agreements have been signed with Nigeria, Chile, and Indonesia. In Nigeria, a $180 million Stargate site outside Lagos is expected to go live in early 2026. According to the Nigerian Ministry of Communications, the center will focus on AI tools for public health, including predictive disease surveillance. However, civil society groups like Paradigm Initiative have warned that data transparency provisions remain unclear. “If these models are trained on Nigerian data but governed from San Francisco, that’s not sovereignty,” said Gbenga Sesan, Executive Director of Paradigm Initiative.
Table 1: Summary of Stargate Partner Countries (as of April 2025)
| Country | Stargate Investment | Launch Timeline | Focus Area | Key Concerns Raised |
|---|---|---|---|---|
| Nigeria | $180 million | Early 2026 | Public health AI | Data governance, external oversight |
| Chile | $95 million | Mid-2026 | Climate and public service AI | Algorithmic transparency, control structures |
| Indonesia | $130 million | Late 2026 | Education tech and moderation AI | Cultural alignment, content governance |
OpenAI for Countries – Democratic AI or Exported Norms?
OpenAI defines democratic AI as a system that ensures individual freedoms, market competition, and safeguards against authoritarian misuse. But critics argue that this OpenAI for Countries model risks exporting a specific vision of digital governance rooted in U.S. liberal-capitalist norms, potentially clashing with local democratic traditions and sovereignty frameworks.
The OECD’s 2025 report on AI adoption in firms notes that many countries still lack basic digital infrastructure, regulatory clarity, and AI skills training – gaps that cannot be bridged through foreign partnerships alone. Without meaningful domestic input in model governance and training data choices, AI developed under the Stargate umbrella may reflect external priorities, not local needs.
In Indonesia, where OpenAI has partnered with the national AI Innovation Agency, questions have arisen over content filtering protocols. Local journalists have expressed concern that automated content moderation tools may suppress culturally sensitive speech. “There is a risk that what gets flagged as ‘inappropriate’ reflects Silicon Valley values, not Indonesian law or norms,” said Dewi Lestari, a media researcher at Universitas Gadjah Mada.
Data Control and Power Asymmetries
One of the selling points of the OpenAI for Countries initiative is the promise of “sovereign data.” Yet practical control over data processing, storage protocols, and model fine-tuning remains vague. Countries may own the data but not the systems that process it – a distinction that risks perpetuating extractive digital relationships.
This asymmetry echoes broader concerns raised in SDG 10.6, which calls for increased representation of developing countries in global governance. If Stargate sites function more like outposts of a U.S.-centered AI empire than genuine national assets, they may entrench rather than alleviate global inequities.
The case of Chile illustrates the dilemma. While the government hailed its Stargate partnership as a leap toward AI independence, the Santiago-based research group Datos Justos noted that key algorithmic oversight committees are chaired by OpenAI staff. “It’s a joint venture in name only,” said Dr. Camila Fernández, Director at Datos Justos.
Innovation or Dependency?
The offer to co-invest in national AI startup ecosystems has potential, especially if funds are structured to support SMEs and public-sector innovation aligned with SDG 8.3. But precedent suggests caution. Similar tech development funds have often favored elite institutions or foreign firms operating within national borders, sidelining grassroots innovators.
Moreover, as the OECD notes, AI diffusion tends to be uneven. Without robust local research ecosystems, training programs, and policy autonomy, countries risk becoming mere implementers of imported AI solutions.
In Chile, several tech incubators have reported difficulties accessing the OpenAI-backed national innovation fund. “It’s a challenge to get past the bureaucratic layers that favor government-aligned actors,” said Marcela Soto, co-founder of a Mapuche-led AI cooperative. “We want AI that reflects our worldview, not just another chatbot in Spanish.”
Policy Response Recommendations
Multilateral institutions such as the OECD, UNESCO, and UNDP have proposed frameworks to ensure equitable AI development:
- OECD recommends legally binding frameworks for data access and algorithmic accountability in cross-border AI infrastructure projects (OECD AI Principles).
- UNESCO’s Recommendation on the Ethics of AI (2021) urges states to mandate transparency, human oversight, and participatory design in all AI systems.
- UNDP calls for “AI Governance Compacts” co-signed by local communities, regulators, and technology providers to ensure contextual legitimacy and public benefit.
Policy analysts also suggest integrating Stargate sites into regional digital public infrastructure (DPI) strategies, anchored in community co-governance models.
Who Writes the Rules?
OpenAI’s Stargate initiative and its ‘OpenAI for Countries’ program mark a pivotal moment for AI geopolitics. Together, they offer material resources and a pathway to participation in global AI development. But for the Global South, the stakes are high. The difference between AI sovereignty and AI dependence will rest on whether these partnerships are truly equitable, transparent, and aligned with local priorities.
Without explicit safeguards for data governance, democratic pluralism, and public benefit, Stargate may replicate the very power asymmetries it claims to dismantle. What matters now is not just who builds the servers, but who writes the rules that govern them.