What would it mean to design artificial intelligence in service of climate justice, communal care, and local values?
The 2026 Solidarity AI conference, hosted at Chulalongkorn University in Bangkok, brings together movement builders, researchers, AI developers, policymakers, union organizers, and digital rights advocates to confront this question. Co-convened by PCC Global and PCC Thailand, in partnership with Chulalongkorn University, Friedrich-Ebert-Stiftung, and regional allies, the gathering extends last year’s conversation in Istanbul, which explored the relationship between cooperatives and AI. That event asked how shared ownership and democratic governance might reshape not just the use of AI, but its underlying infrastructures.
Now, the conversation turns to Asia. In regions long positioned on the receiving end of global tech flows, digital systems increasingly restructure labor, determining who is paid, tracked, replaced, and rendered invisible. Through plenaries, participatory workshops, sector-focused breakouts, and strategy sessions, the conference creates space for scholarship that emerges from and remains accountable to lived experience.
At this moment, Solidarity AI names a choice: whether AI will be shaped by distant systems of power or by the communities who live with its consequences. It is a commitment to imagining and building technologies with, by, and for the communities they affect. It insists that those who build, maintain, and are governed by AI systems must shape them, not merely through symbolic input, but through meaningful control, from the extraction of raw materials to the design of algorithms. At stake are the systems that decide whether a farmer gets a fair price, whether a care worker’s schedule makes sense, and whether a community’s language appears online at all. Either power remains rooted in neighborhoods and networks, or it is pulled upward into centralized and distant infrastructures of control.
These ideas take shape through “solidarity stacks”: interconnected layers of technology, governance, and labor systems that are collectively owned, locally governed, and designed to serve social rather than extractive ends. Built from the ground up by communities, cooperatives, and public institutions, they offer a way to meet real needs while preserving autonomy, equity, and care.
This vision is not merely aspirational. From farmer-led data cooperatives in India to Vietnam’s national AI stack, from multilingual systems in Indonesia and Malaysia to worker organizing in the Philippines and platform cooperatives in Spain, each example offers a concrete glimpse of what is possible.
This is not a vision flown in from elsewhere. It draws from intellectual and political traditions that have long resisted extractive and techno-solutionist models. Buddhist ethics center compassion and interdependence. Thailand’s sufficiency economy questions growth-at-all-costs. Gandhian thought champions decentralization and restraint, while Ambedkarite critique demands justice and names who is left out. These traditions ground the conference’s move beyond Western-centric debates, opening space for regional imagination and challenging assumptions about whose values shape technology.
Solidarity AI is not just a space for critique. It is where experiments connect, where strategies are shared, and where fragments of a different technological future begin to cohere. Now is the moment to commit: to build together, to imagine boldly, and to move beyond what we were told was inevitable.