
Buyer beware: how AI is infiltrating humanitarian aid operations

Access Now · by Giulio Coppi · March 26, 2026

Access Now’s latest research unpacks how AI tools are dodging procurement vetting to infiltrate humanitarian organisations, creating new risks for them and the communities they serve.

When used in humanitarian aid operations, AI systems that carry prevalent biases, security risks, and consent issues can leave local communities even more vulnerable during a crisis. Yet even as debate rages over whether humanitarian organisations should be using AI at all, the sector’s use of this technology keeps growing. In our latest report, Reinventing humanitarian aid procurement for the age of AI, we examine the specific challenges that arise when humanitarian organisations adopt algorithmic tools in sometimes haphazard or unpremeditated ways, and suggest a roadmap for adjusting IT, cybersecurity, procurement, legal, and digital protection governance accordingly. To this end, we also provide, as an appendix to the report, a framework to help transform digital procurement from a transactional business function into a strategic one.

As we show in our research, much of the aid sector’s adoption of AI is being driven, on the one hand, by individual aid workers using large language models for their daily tasks or humanitarian NGOs turning to ‘smart’ chatbots to compensate for access restrictions and funding woes, and on the other, by tech companies pushing enhanced, AI-based features within existing products and services.

This is creating new, sometimes invisible, risks for which organisations are unprepared. When a U.S. nonprofit’s chatbot went off-script (following a product update that activated unexpected AI features), vulnerable users were suddenly barraged with misleading, even harmful responses. Now imagine the consequences of the same thing happening to a humanitarian chatbot used by people seeking lifesaving information in a crisis or conflict setting.

In the algorithmic era, the same digital tools that are supposed to improve people’s lives may instead enable a web of autocratic surveillance, contribute to the unlawful targeting of civilians in conflict, and automate life-or-death decision-making with neither meaningful human oversight nor remedy. As with any emerging technology, the irresponsible adoption of AI risks undermining humanitarian actors’ ability to honour the two non-negotiable cornerstones of humanitarian action: the humanitarian principles and the rule of “do no harm”.

Mapping AI’s uptake in aid

Building on our previous work mapping the often-opaque partnerships between humanitarian actors and private tech corporations, our latest report shows how algorithmic systems are entering the humanitarian space mostly through the backdoor, and often without safeguards. Extensive desk research and more than 70 interviews with actors from across the humanitarian, public, academic, private, and social impact sectors reveal a tendency towards haphazard integration, rather than deliberate institutional adoption, of cloud-based, algorithmically enhanced functionalities and processes, which are slowly but progressively seeping into most internal systems.

This form of corporate capture is not new. Following the “cloud-first” playbook, it traps even the largest and wealthiest international organisations into digital and financial dependency. And while Global Majority aid workers and grassroots organisations are at the forefront of experimenting with AI adoption, a lack of adequate legal frameworks and safeguards, alongside lagging investments in humanitarian funding for ICT infrastructure, are thwarting widely touted claims that AI will democratise humanitarian efforts by empowering local actors.

On the contrary, our research shows how the progressive cloudification of most common digital systems and the fast-paced, distributed nature of modern digital development are only deepening the existing digital divides between large organisations with privileged, on-call access to Big Tech, and smaller frontline organisations and communities struggling to protect their principled approach, and their own rights, in the digital age.

What to do about it

While organisation-led digital development based on open source systems and co-development with trusted external providers appears to be the best way to minimise the risks associated with algorithmic systems, we know this isn’t financially or technically feasible for most humanitarian actors. To that end, the appendix to our report lays the foundations for a digital service framework in the algorithmic age, offering a suggested governance model to help humanitarian organisations acquire digital products and services, including algorithmic capabilities, with human rights in mind. Key recommendations featured in the report include:

  • Asking donors to promote changes in tech governance processes and mechanisms aimed at reforming humanitarian tech procurement;

  • Calling on regulators to strengthen transparency and due diligence requirements across the whole supply chain for all algorithmic tools;

  • Inviting the humanitarian community to underpin their digital transformation efforts with a governance framework that bridges procurement, cybersecurity, protection, and human rights protections;

  • Requesting that tech companies re-invest in human rights and humanitarian internal expertise, to help re-establish trust-building mechanisms with humanitarians and local communities;

  • Inviting local actors and communities to champion local tech solutions and run algorithmic accountability audits on proposed systems; and

  • Asking research groups and cyber experts to analyse tech adoption specifically through the lens of NGOs operating within low-income countries and/or in conflict-affected areas.

Our research findings, related recommendations, and the accompanying appendix (provided in Google Sheet and PDF format) will hopefully form a basis for all actors in the humanitarian sector seeking to break down silos, tackle the digital divide, and set digital transformation on a better path, for the benefit of those they seek to serve. We extend our thanks to the German Federal Foreign Office, whose support made this research possible. Access Now remains committed to extending and defending the digital rights of at-risk people and communities, especially those living in dire conditions, through crises, or caught up in conflict, and we welcome feedback, questions, or comments on this latest report.
