Letter from the Founder

Why I Built Emergence Field Labs

What Afghanistan taught me about data, sovereignty, and how systems fail to learn

Scott Shadian

Founder & CEO

Key Takeaways

  • Afghanistan was one of the most researched places in human history, yet none of it prevented collapse
  • What a chaotic volunteer evacuation taught me about how knowledge actually moves
  • The same extractive logic that failed in Kabul is now being encoded into AI
  • Data ownership is only half the answer — the other half is what happens when communities choose to share it
  • The case for infrastructure that makes learning compound from the bottom up

When Kabul fell in August 2021, I expected a reckoning.

I had just spent a decade and a half inside what the United States called a "nation-building project" after it invaded and effectively took over the country in 2001. I first arrived in Afghanistan in 2005, and for the next sixteen years, whether I was actually in Afghanistan or not, my life and work were tied to the country.

I moved from diplomacy to aid work and eventually founded Sayara International, a measurement, evaluation, and learning firm. Sayara would eventually come to have more than 200 staff working across 30 countries, but Afghanistan was always our home base. We generated enormous amounts of data for nearly every major aid donor operating there — all in the name of "impact," "learning" and "localization."

For decades, I helped build systems where data flowed upward. Communities collected it. Contractors packaged it. And then institutions interpreted it to fit their interests, using that data to form their policies.

It usually comes as a shock when I tell people that Afghanistan was one of the most researched places on earth. Perhaps in all of human history. For twenty years there was non-stop polling and evaluations. Mountains of metrics. Reams of impact reporting. There was so much research, in fact, that "polling fatigue" among Afghans became its own data point we had to monitor.

Yet, with all this rich data on every dimension of nation-building, the system in command could not see or hear itself. As Yeats wrote, "the falcon cannot hear the falconer; / Things fall apart; the centre cannot hold."

And so it was.

Two trillion dollars. Twenty years. More than 240,000 Afghans killed. Nearly 3,500 American and coalition soldiers dead, as well as hundreds more diplomats, journalists and aid workers. And millions of Afghans displaced — mostly women and children — who were forced to rebuild their lives in foreign lands because of a war they did not ask for, but that demanded everything of them.

Then the same Taliban regime returned to power, stronger and more violent than before. The longest and costliest war in American history ended exactly where it began. But there was no genuine reckoning in the wake of the fall of Kabul. No shared institutional learning emerged here in America. No meaningful accountability came to light for how twenty years of decisions had led to that moment.

This could have been an opportunity for us to pause, breathe, and contemplate how we got into this mess. But the problem was that, while people had played a role in creating what happened, it was the System that ran it: a System that operated with relatively little human input or oversight. This System was devoid of consciousness; instead, it seemed to run automatically, driven by politics, ideology and capital flows.

The System did not stop to consider its own architecture because it could not consider its own architecture. The fall of Kabul provided a mirror that we could hold up to ourselves as a society, but the System can't look in a mirror. The System just keeps working the same way it's always worked. Six months later, Russia invaded Ukraine, and Afghanistan passed into faint memory, a notation on a timeline of continuous wars that have raged non-stop from 2001 until today.

"Every war already carries within it the war which will answer it. Every war is answered by a new war, until everything, everything is smashed."

Käthe Kollwitz

Käthe Kollwitz wrote those words a century ago. She understood a perennial truth: conflict or systemic failure of any kind that is not reckoned with, not genuinely learned from, will simply perpetuate itself.

After all, we didn't fail in Afghanistan because we didn't have enough information. We failed because we couldn't integrate it into something actionable. We had all the data we could have ever wanted, but it lived in silos, fragmented into incoherent, inaccessible parts. The data gathered in Afghanistan was mainly hoarded and used to compete for grants and contracts; it fed the System and the System used what parts it needed to tell itself the story it wanted to hear.

In the final years before the collapse, our district-level data showed government control slipping. Our research captured fear, defections, morale amongst Afghan forces unraveling. Our analysis suggested the government would fall within six months of withdrawal. The evidence was real. But there was no shared infrastructure that could collect it into a coherent picture upon which people could act. Instead, our clients were working towards a post-peace-deal power-sharing government — a reality that existed nowhere except in each organization's own data.

Each donor tracked proprietary indicators. Each organization measured what proved its own value. Knowledge and learning were rarely shared. Data instead became a competitive advantage to win a new grant. Everyone could see their part, tell their own story of "success," but no one could see the whole. And then the whole fell apart.

The neuroscientist and philosopher Iain McGilchrist would recognize this pattern immediately. In his model of hemispheric attention, the left hemisphere of the brain attends to the parts, favoring abstraction, categorization, metrics—the quantitative. It tends to reduce complexity into what can be grasped and controlled. The right hemisphere remains open to the whole and is better able to engage complexity, seen in context, relationship, living reality, meaning—the qualitative. Each hemisphere needs the other. But when the left becomes dominant and self-enclosed, it produces a view of reality that mistakes its maps for the territory. It biases what it already knows how to process, progressively reinforcing its own representations until the map begins to replace the territory itself.

This perceptual disorder was not only rampant in policymaking in Afghanistan, but is represented across the social sector writ large (and through all sectors, for that matter). These left-hemispheric systems (the falcon) at scale are precise, quantitatively data-rich, and yet completely unable to see the living whole that their parts are failing to hold together. The right hemisphere (the falconer) — the one that holds context, sees system-wide patterns, and senses what the numbers cannot say — has no seat at the table of how we build systems, measure impact, or make policy. Eventually, "things fall apart; the centre cannot hold."

Afghanistan evacuation efforts

Love in a time of chaos

In the days that followed the collapse, I knew I had to act. I had spent too much time and energy, too much emotion, in Afghanistan. I felt involved, in a way that a human must but a closed system never can.

Ten years before the fall, we hired a man whose brother had been killed by the Taliban, and who had himself nearly died in the same attack. He carried those scars into his work with us, and he remained one of the most devoted and extraordinary employees we ever had. But as the withdrawal date drew closer and every avenue to get him out collapsed into a dead end, my worry became something harder to name.

I had promised him that I would get him out of the country. I promised myself that I would do this as well. And I believe it was that promise — that specific, personal, unshakeable promise — that ignited what came next: the idea to charter a plane.

One plane became two. Two became three, then four. Sayara's humanitarian evacuation eventually rescued more than 1,200 vulnerable Afghans. By the end, we had evacuated our staff and their families, along with journalists, female judges, professors, artists, civil society members — Afghans whose lives, had they stayed, would have been measured in months.

But it wasn't just Sayara. Dozens of ad hoc evacuations — vets, aid workers, diaspora networks, journalists — converged on the same burning mission: to save people they loved. This wider mission became known as the "Digital Dunkirk": an entirely organic charge of independent operations, unified in purpose but sovereign in means, driven ultimately by love, proving that ordinary people, when the institution fails, become the institution.

We didn't have a functioning system coordinating the effort, or a command center, or even a playbook. At Sayara we were a bunch of research nerds, journalists and aid workers armed only with WhatsApp, Signal and Zoom. What the hell did we know about running an evacuation organization in a war zone?

Yet what emerged was something I will never forget.

We were a decentralized network of people — former colleagues, volunteers, family members, strangers — coordinating in real time. Information moved horizontally. Decisions were made by those closest to the ground. No one waited for permission from the System. We just acted. We introspected. We learned. We iterated — at lightning speed. And we did not stop until we got our people out. Overall, it is estimated this wider group of ragtag volunteer organizations evacuated 8,000 to 12,000 at-risk Afghans.

It was chaotic. But out of that chaos, a spontaneous, complex, adaptive organism emerged. Rather than the cold, dead, unrelated parts of the System, our center of gravity was built from love, intuition, and the human spirit of cooperation. It became what chaos theorists call a strange attractor.

The System, with twenty years and two trillion dollars behind it, could not integrate even the simplest and most obvious patterns in the data, patterns that everyone on the ground could see clearly. It could not respond to chaos. Its chariots were frozen.

But a decentralized network of individuals around the globe, bound by a shared purpose and love for the Afghans they worked so closely with, moved 1,200 people to safety over the following weeks and months.

What I witnessed in those weeks was not just the power of decentralization unified in purpose. It was a specific kind of intelligence: knowledge generated at the edges, shared horizontally, and continuously refined by the people closest to the ground. No one had a complete picture. But because information moved freely between everyone who had a partial one, the whole exceeded the sum of its parts. That is the model that would define what I would set out to build.

That experience, coming after the sixteen years I spent working to rebuild Afghanistan, changed how I came to understand data, power and human agency.

Two Paths

Today, we are living through a moment when the platforms and institutions that manage our information are consolidating at an unprecedented rate. Algorithms shape what we see. Data about our behavior is aggregated and used to make decisions about us — often without us. The pattern is familiar from my days working in international aid: centralize and compartmentalize information, concentrate authority, create policy misaligned with lived reality. It strips communities of their agency — because when communities do not own their data, they do not own their story. When their knowledge is absorbed into centralized infrastructures, they lose autonomy over what it means and how it is used. Someone else, higher up, decides their story for them.

But the emerging age of AI will take this to a new level. The centralized control of knowledge that AI facilitates is becoming one of the greatest threats to human sovereignty. AI infrastructure is already starting to shape narratives and constrain critical thought.

As a result, information flows are changing at a foundational level. The last era of search and social media surfaced and directed information algorithmically, but it was still up to you to decide what you did with that information. The new age of AI, however, will synthesize information into meaning before you even have the chance to make your own sense of it. What comes next is a contest over who holds the power to interpret our lived reality. Our stories. Our truth.

If data is aggregated without consent and interpreted without accountability, then it is no more than fuel for a surveillance society that will control how you think — regardless of whether it is wielded by a government tracking dissidents or a platform monetizing your attention. The logic that drove the search and social media age simply compounds in the age of AI: extract behavior, concentrate meaning-making, and use the resulting asymmetry of knowledge as power over the people who generated it.

"There is no greater agony than bearing an untold story inside you."

Maya Angelou

She meant it as a human truth. But it is also a systemic one. The communities we worked alongside in Afghanistan were not voiceless — they had stories, evidence, knowledge, warnings. But the infrastructure around them was designed to extract their data and then decide, without them, what it meant. Their stories entered the system and never came back out. What returned instead were policies built on someone else's interpretation of their lives.

Highly centralized systems like these repeatedly oppress and ultimately fail when knowledge cannot move freely across silos and when learning — that is, truth — is prohibited from challenging the truisms on which those systems were built in the first place.

Afghanistan is not unique. It is a microcosm of a pattern playing out in the macrocosm, and you don't have to look far to see it.

Across the United States, billions are spent each year on social programs — in education, public health, housing, criminal justice, workforce development — with remarkably little shared learning between them. A juvenile reentry program in Baltimore figures something out — about trust, about timing, about what actually moves a young person away from recidivism. That knowledge is real. It was earned. And then the grant cycle ends, the report is filed, and it disappears. The next program, in Fresno or Louisville or Detroit, starts from zero. The community health initiative that reduced maternal mortality in rural Appalachia never talks to the one in the Mississippi Delta. Not because they don't want to. Because there is no infrastructure that makes it possible. The evidence accumulates. The learning does not.

The people inside these organizations deeply care, but that doesn't erase the foundational problem of the architecture that houses them, which was designed to report vertically, satisfy funders, and move on to the next funding cycle.

∗ ∗ ∗

Emergence Field Labs is laying the foundation for a different path — one designed specifically in opposition to the surveillance logic described above: the extraction of community data, the concentration of its meaning, and the use of that asymmetry as power over the people who generated it.

We say this knowing full well what it invites: the obvious skepticism that a platform is still a platform, that infrastructure centralizes as easily as it liberates, and that good intentions have never been sufficient armor against bad architecture. We've seen that movie up close.

So the question we've held from the beginning is not simply can we help communities generate rigorous evidence — plenty of organizations have done that while quietly concentrating the meaning-making in their own hands. The question is: who owns the data? Who controls what the data means, how it travels, and whose assumptions get encoded into the system over time?

We are building decentralized tools that allow organizations and communities to generate rigorous evidence while retaining ownership of the data and of their story. The same data that extractive systems would aggregate without consent, interpret without accountability, and deploy against communities — we are returning to the people who generated it.

But data ownership is only half the vision.

The other half is what happens when sovereign actors choose to contribute to a commons of data and learning. When organizations become field labs and share knowledge horizontally, patterns emerge that no single organization could see alone. This allows knowledge to be integrated from the bottom up, grown from lived experience rather than imposed from above. What strengthens one part strengthens the whole, and the learning that results could not be produced by any part alone.

This is how collective, cooperative intelligence actually works: not by pooling data into a warehouse, but by creating conditions where what one organization learns becomes legible to another. Where a breakthrough in Baltimore can reach Fresno, and patterns that no single organization could see — because each could only see its own part — become visible to all of them together. The feedback loop this creates is the relational engine that true collective intelligence requires, where learning compounds and evolves, and the whole becomes greater than the sum of its parts.

In contrast to the data silos my colleagues and I built in Afghanistan, EFL is a living data ecosystem — sovereign, decentralized actors producing emergent data pathways, learning feedback loops, and interoperable insight without being absorbed, flattened, or overridden.

I have seen what happens when centralized systems cannot hear the edges. I have seen what happens when data is collected but never truly acted on — when institutions move forward without learning from their own evidence, without facing what their own data is trying to tell them.

It looks like total collapse. It looks like leaving things worse than when we started.

But the evacuation taught me something simple: when knowledge and experience flow freely between people who care, lives change, and new adaptive systems emerge.

We can't prevent failure, but we can learn from it. And we can build infrastructure that makes that learning sharable and accessible to communities on the frontlines.

We are determined to not contribute to a future shaped by institutions that hoard information or refuse to face their own failures. We want to support a future where communities generate knowledge, own it, share it with each other, and grow collectively smarter and wiser over time.

If you run an organization tired of watching your evidence disappear into a funder's filing cabinet, we built this for you. If you fund social change and want to stop financing the same lessons over and over, let's talk. If you build technology and believe infrastructure should serve communities rather than extract from them, come find us.