From groundbreaking innovations to bold visions, our 2025 Fellows share their predictions on where technology is headed—and the impact it could have on the world.
2025 will bring about several crucial shifts in the global surveillance technology market. The AI surveillance industry remains heavily driven by profit and power consolidation, and is projected to reach $234.72 billion by 2027. Monitoring continues to be embedded in our communication tools, often disguised as "features" or "improvements" but in fact functioning as sophisticated surveillance systems. Facial recognition has rapidly become one of the fastest-growing sectors within the surveillance industry: it is anticipated to reach $13.4 billion in market value by 2028, at an annual growth rate of 14.9%, with ML and AI recognized as the primary drivers.
The Global Majority faces the most severe consequences of surveillance overreach. The UAE, for instance, has developed pervasive "safe city" platforms and extensive facial recognition infrastructure, making it one of the most comprehensively monitored nations in West Asia, and this surveillance capability is likely to expand further as the country positions itself as a global AI hub. India's surveillance technology sector is also experiencing unprecedented growth, with widespread adoption of AI-powered surveillance systems. Despite documented human rights violations and a Supreme Court investigation into spyware abuses, India's government continues to advance its surveillance agenda. Mexico also exemplifies a dangerous trend: a rapidly growing AI-enabled surveillance state operating without specific legislative protections, leaving people particularly vulnerable to privacy violations from both state and private entities as AI adoption expands.
Sovereign wealth funds, including those belonging to repressive regimes, are pouring billions into venture capital (VC) firms, knowing full well that those VCs, drawn by the lucrative promise of defense contracts and monitoring platforms that can track and analyze private and behavioral data at scale, will funnel that capital into AI surveillance startups. Many of these investors wield unprecedented political influence today. Only through massive public pressure, strategic litigation, regulatory threats, and the rare alignment of market forces will VCs even consider prioritizing privacy rights and the human cost of their investments. But with surveillance tech's soaring profits and minimal oversight, there are few incentives for them to develop an ethical backbone.
In 2025, VCs will face intense scrutiny over their surveillance tech investments. Hoping to accelerate this shift, I founded Surveillance Watch, an interactive map and database revealing the hidden connections within the opaque surveillance industry. This work complements the investigations of Amnesty Tech, Privacy International, Access Now, Digital Rights Foundation, African Internet Rights Alliance, and others in global civil society.
By connecting the dots between financial actors, privacy-violating technologies, and human rights impacts, we're making the VC investments behind this abuse traceable and public, helping to set a new standard of privacy-conscious due diligence in tech investing. As privacy violations in the Global Majority continue to impact human rights, legal liability will become a real concern. VCs will increasingly realize that their portfolio companies' privacy practices could expose them to significant legal and reputational risks. Driven by fear of potential limited partner (LP) clawbacks due to underperformance rather than by a moral awakening, VCs will rush to enforce stricter due diligence and distance themselves from companies without clear privacy protections.
The question for VCs won't be whether to consider privacy implications, but how quickly they can adapt to a new reality in which funding surveillance tech carries real legal and financial consequences. Lastly, and most importantly, the myth that "privacy is dead" will finally crumble. Armed with comprehensive data about the surveillance industry's enablers, grassroots movements will surge worldwide. This is why building an actionable framework for venture capital accountability in AI development is so important. This framework must center the voices and lived experiences of the communities most affected by these technologies and arm our communities with the data and tools to resist harmful surveillance practices. That includes building transparent data collection systems that reflect community needs, conducting community-based research to document impacts, advocating for stronger accountability measures, and supporting and localizing alternative technology platforms that prioritize privacy and protection from invasive monitoring.

Esra'a Al Shafei is a 2025 Mozilla Fellow.