

26 Jan 2026

Anticipatory intelligence, long-term decision making and complex systems

Continuing our series of interviews with leaders in their respective fields of the climate and nature transition, this month we sat down with Aarathi Krishnan, who specializes in anticipatory governance for the humanitarian and development sectors.

FUTURE WORLD @ ArtScience Museum, Singapore, photo by Robynne O on Unsplash.

As part of our research cycle, we are interviewing a range of influential and inspirational leaders from across the climate and nature transition. These interviews are intended as a window into innovative and exciting ways of approaching the transition and to spotlight the people who are at the forefront of these changes. Here, we sit down with Aarathi Krishnan. Aarathi is Founder and CEO of RAKSHA Intelligence Futures, building next-generation intelligence systems to detect fractures in geopolitics, finance, and governance before they break. Her work fuses AI-native analysis with human judgement to map how power, resources, and narrative converge quietly, long before crisis arrives. She's helping shape a new field - anticipatory intelligence - that challenges institutions to see what they typically overlook and make braver decisions about the futures they're actually creating. Aarathi has worked with the UN, World Bank, and leading philanthropies on risk, ethics, and governance transformation.

Name one object that you currently have on your desk

A hand-hammered copper vessel from an antique store in Kerala, where my family's roots run deep. It's slightly uneven and beautifully imperfect. A reminder that I'm building futures while standing on something far older than ambition. It keeps me grounded.

How would you explain what you work on to a five-year-old?

I help people see what might break before it does. You know when you build a really tall Lego tower and you can tell it's getting wobbly? You want to fix it before it crashes, right? That's what I do, except with things like countries and the systems that affect how people live.

Sometimes big problems don't just suddenly appear - they start small, like tiny cracks that grow bigger. My job is to spot those cracks really early. We’ve helped multilateral organisations work out which places might have big problems coming, so they could help people prepare. We’ve also helped large investment funds see what might impact their portfolios before it hits. It's like being able to see around corners or notice when something's about to tip over before anyone else does. The earlier you spot it, the more you can actually do to help.

Can you describe a recent moment or experience on a project that has particularly stuck in your mind?

We were building a signal monitoring framework for a global client really wrestling with how to explain what "anticipatory intelligence" even is. Mid-conversation, I realized: we're not describing a field that exists. We're inventing it. That stopped me.

There's something electrifying about creating a new way of seeing the world while you're doing it, writing the foundational language as you go. You're essentially shaping the terms that future institutions will use to understand risk. What does it mean to detect a fracture before it's visible? How do you build systems that see what's being overlooked? These aren't just technical questions - they're almost philosophical ones.

It's thrilling and terrifying in equal measure. Thrilling because you're genuinely creating something new. Terrifying because there's no roadmap, no established best practices to fall back on. You're just... building the plane while flying it. But that's also what makes it worth doing.

What are some of the hardest challenges you're grappling with right now?

Building infrastructure for something that doesn't exist yet. Anticipatory intelligence sits at this intersection of geopolitics, data science, and ethics - it's genuinely uncharted. When we're mapping emerging risks for UN agencies, there's no playbook for detecting these "quiet fractures" that happen before visible crises.

At RAKSHA, we are teaching institutions to see patterns they've been trained to ignore. Institutions are designed for stability, for proven methods. So, we're asking them to invest in decision-making capability that, by definition, prevents crises they'll never know would have happened. That's a hard sell. How do you prove the value of something that worked precisely because nothing went wrong?

The hard part is staying clear while building something this novel. You need both rigor and imagination, and you can't let either one tip into performance. Rigor without imagination becomes just another consulting exercise. Imagination without rigor becomes futurism - interesting, but not actually useful. You're constantly navigating that tension.

It's demanding work, but honestly? I find it exhilarating. Though it requires a kind of internal steadiness most institutions don't really cultivate. You have to be comfortable with uncertainty, with building in public, with the possibility that you might be completely wrong about where this is going.

What are the most exciting developments you are seeing in your space?

That the space itself is still being shaped. We're moving beyond prediction into something different - systems that can spot early fractures before they cascade into full crises. Ethical fractures, institutional ones, geopolitical. It's happening where human judgment meets machine pattern recognition, and when it's done right, it actually amplifies human discernment rather than replacing it.

Traditional forecasting asks, "What will happen?" We're asking different questions: "What's quietly reconfiguring? Where are the fractures forming that nobody is watching? Whose power is shifting in ways that aren't market-visible yet?" These are fundamentally different modes of intelligence.

The landscape is interesting because we're not alone in sensing this shift, but we're coming at it differently. Traditional risk advisory firms are still operating in quarterly cycles. Foresight practices produce brilliant scenario planning but often lack the operational teeth to drive decisions. Intelligence agencies with incredible capabilities are constrained by classification and mandates. Think tanks are producing excellent research that sits in PDFs.

We're in the space between all of these - building systems that are rigorous enough for institutions to trust, fast enough to catch fractures as they form, and unclassified enough to actually inform decisions at scale. That's the gap RAKSHA occupies. We're not replacing any of these actors; we're doing something none of them can quite do.

We're seeing this with our Fracture Atlas work - detecting structural vulnerabilities in how power, resources, and narrative converge. The vulnerabilities and stresses we are observing range from digital infrastructure grid stress to the collapse of ESG and verification systems. These are the kinds of systemic stresses that traditional forecasting misses completely because it's looking for events, not the slow reconfiguration that happens before systems break. Like watching ice crack - by the time you see the surface break, the fracture has already been forming deep underneath for a while.

What's exciting is that this isn't just academic. When institutions can see six, twelve, eighteen months ahead - not predicting specific events but understanding structural trajectories - they can make fundamentally different decisions. Braver decisions. More humane ones.

Watching this new form of intelligence emerge feels like standing at a threshold. We're right at the beginning of something that could genuinely change how power operates, how resources flow, and how decisions get made. That's rare.

You're organising a gathering to discuss new approaches to the transition to a sustainable future. What does it look like, what would it focus on, and who would be around the table?

Definitely not a summit. An evening gathering. Off-the-record, candlelit, unbranded. Around the table: diplomats, activists, financiers, artists, and people who've actually survived collapse. The kind of room where a UN negotiator sits next to someone who lived through Zimbabwe's hyperinflation, where a climate finance expert is across from an artist who's been documenting displacement for twenty years.

We'd talk about what most conferences tactically avoid - who really owns the land beneath "green" projects, whose data is training the models, whose debt is funding the transition. What does "just" actually mean when the people making decisions about justice are rarely the ones who'll live with the consequences? What are we willing to let break in order to build something new?

The goal wouldn't be consensus or performative optimism. It would be courage. To name what we usually edit out. To sit with the discomfort of realizing how much we don't actually want to change, even as we advocate for transformation. To decide what we're genuinely willing to sacrifice - not in abstract terms, but specifically. Your convenience? Your position? Your certainty?

The kind of conversation that changes you, not just informs you. Where you leave different than when you arrived. Those are rare. And desperately needed.

Find out more about Aarathi's work below:

Website: www.rak-sha.com
Substack: https://substack.com/@rakshain...