
AI Will Show You the Map, But It Won’t Tell You Where to Go

  • Writer: George Schuler
  • Sep 10
  • 3 min read

(What AI Can — and Can’t — Do for System Insights)

Opening: The Mirage of the Perfect Map


We live in an age where AI can sketch out narratives, ideas, and concepts at startling speed. Feed it texts, tables, lists, or pictures, and it can spit out frameworks and insights across reams of information.


At first glance, this feels revolutionary. Whole fields of strategy, advocacy, and systems thinking seem suddenly within reach of anyone with an internet connection.


But there’s a catch.


Because the value of a map isn’t in the lines and dots. It’s in knowing how to read it and deciding what to do with it. Without that, even the most elegant output is just a mirage.


What AI Can Do (And Do Well)

Let’s give credit where it’s due: AI can already do things that used to take teams of analysts weeks.


  • Spot patterns. It can cluster ideas, highlight outliers, and trace connections the human eye might skim past.

  • Sketch networks. It can arrange relationships into diagrams that look conference-ready.

  • Surface mood signals. It can sift through thousands of posts and distill an overall tone.

  • Catch language shifts. It can notice when optimism slides into fatigue, or when enthusiasm hardens into resistance.

  • Translate complexity. It can reframe dense analysis into plain words almost anyone can follow.


These are useful accelerators. They make the invisible a little more visible and save time for people who need to act.


But sketches are not strategies. And systems are not static diagrams; they are living, shifting terrains.


What AI Can’t Do (Yet)

Where the real work begins, AI still falters.


  • Context. A name in the center of a diagram might look pivotal, but only human judgment reveals whether that person actually holds trust, authority, or resentment.

  • Ethics. Automated readings of tone often flatten nuance, or worse, misinterpret grief, anger, or trauma as “negative sentiment.”

  • Strategy. AI doesn’t know the politics of your boardroom, the histories of your coalition, or the lived tradeoffs communities have already made.

  • Power and trust. Influence doesn’t flow through clean lines; it moves through messy human relationships. Trust is fragile. Credibility is earned. Timing is everything.

  • Translation into movement. AI can illuminate who is connected. Only people can decide which alliances to form, which risks to take, and how to shift a story in a way others believe.


This is the gap between visibility and wisdom. AI can catch the signals, but it can’t feel the stakes.


Why This Matters

Most change efforts don’t stumble because of missing data. They stumble because leaders misread the system they’re in.


They have the sketch, but not the compass.


Think of a movement that stalled because leaders courted the wrong allies. Or a corporate initiative that fizzled because a small but powerful group resisted behind the scenes. In each case, the data was there. The navigation wasn’t.


AI might give you the outlines. But strategy is navigation, not cartography.


Knowing how to interpret a system is not the same as knowing how to move through one.


The Human Layer: Reading Between the Lines

This is where expertise, judgment, and experience come in. A skilled strategist doesn’t just ask what the sketch shows — they ask, “So what?”


In our work with WWF and GlobeScan, research revealed how stakeholders viewed water risk, trust, and collaboration. AI could easily have clustered the responses or generated tidy summaries. But the real value came from asking deeper questions: Why do certain groups perceive issues differently? Which voices are trusted, and which are sidelined? What narratives are missing that could unlock broader participation?


The diagrams showed relationships. The strategy came from interpreting context: understanding why certain stakeholders functioned as bridges, and how WWF could position itself to convene across divides.


The same pattern showed up in our work with Vitamin Angels. Mapping the maternal and child health ecosystem revealed dozens of potential allies. But the real question wasn’t “Who is most central?” It was “Who is trusted across boundaries?” AI could sketch the network, but only human judgment could identify which relationships would actually move resources, shift narratives, and build credibility.


This is what AI misses. The sketch was the starting point. The strategy emerged from human interpretation of context, trust, and timing.


Closing Reflection

The allure of AI is speed and scale. The danger of AI is mistaking speed for wisdom.


If you want to change a system, you need both: the efficiency of tools and the discernment of people. AI can help you see the terrain. But it won’t tell you where to go, or how to bring others with you.


That choice, and that responsibility, remains ours.

