
In a packed auditorium at Google I/O 2025, amid applause and product demos, a quiet realization echoed beneath the excitement: society is running out of time to catch up with the technologies it unleashes. Gemini Ultra, Google’s latest leap in AI, promises seamless integration into everything from education to productivity to entertainment. Its capabilities are dazzling. So is its concentration of power.
What we’re witnessing isn’t just innovation — it’s consolidation. With every update, the invisible infrastructure of our lives becomes more dependent on the decisions of fewer people in fewer rooms.
This is no longer just about tools. It’s about terms. Terms of labor, of access, of intelligence itself. The question now is stark: Who shapes the systems that shape us?
The Speed of Innovation vs. the Lag of Governance
AI evolves faster than institutions. Regulatory bodies, public discourse, and ethical frameworks all trail behind — often by years. When ChatGPT and Gemini shift public expectations overnight, democracies can’t adjust fast enough. There’s no public referendum on what creativity, knowledge, or work should mean in the age of synthetic intelligence.
And this isn't abstract. Education systems now weigh whether to integrate AI or ban it. Artists sue over training datasets. Workers wonder whether their roles will still exist in a year. Most people don't understand how these tools function, yet they increasingly depend on them to apply for jobs, write resumes, even navigate healthcare systems.
The algorithms are sophisticated. The social scaffolding around them? Not so much.
Invisible Dependencies
There's a strange paradox at play: as AI becomes more autonomous, humans become more dependent. The pitch is freedom. Automate your inbox, plan your meals, generate a business plan. But each new convenience masks a growing asymmetry of power.
These systems are not neutral. They reflect the priorities of their creators and the data they're trained on. The risk isn't just bias; it's dependence without understanding. We rely on outputs we can't audit, produced by code we didn't write, governed by companies we can't vote out.
Much like financial systems before the 2008 crash, there’s complexity without clarity, and scale without accountability.
Participation or Extraction?
There’s a difference between using a tool and being used by it. Right now, the AI economy runs largely on the unpaid labor of internet users — every click, search, and conversation becomes raw material for training models.
The resulting profits, however, flow upward.
While Gemini and its peers drive productivity, they also concentrate value in a handful of tech giants. The economic logic is extractive: harvest public data, commercialize the insights, and centralize control. It's an echo of earlier platform economies, but with higher stakes. Because AI doesn't just mediate information. It mediates thinking.
Digital Sovereignty in an Automated Age
What would it mean to reclaim agency?
In Europe, the push for digital sovereignty has gained ground, with open-source initiatives, data localization laws, and stricter AI regulation such as the EU AI Act. Elsewhere, worker-led movements are beginning to demand transparency, co-governance, and algorithmic audits.
There are precedents: community-owned internet networks, platform cooperatives, publicly funded AI labs. They’re small for now — but they represent a refusal to accept that the future must be privatized.
Because this isn't just about jobs or laws. It's about sovereignty. Who gets to participate in shaping systems that are becoming as fundamental as language?
Are We Building Tools That Entrap?
Every generation believes its technology is special. But this moment is different — not because AI is smarter, but because it redefines what it means to know, to decide, to act.
Are we building scaffolds for a more humane society, or just gilded cages of convenience?
The answer depends on what we do next. On whether we build public capacity alongside private capability. On whether we prioritize transparency over speed. On whether we see AI as a mirror — reflecting not just our intelligence, but our values.
Google’s stage may be sleek. But the real questions lie offstage — in classrooms, council meetings, and community centers. Places where people still believe they have the right to shape their future.
Let’s not outsource that belief to the algorithm.
