Wangari Digest

Explaining Isn’t Predicting

Why neat explanations can mislead us in both business and life.

Every investor, manager, or policymaker faces the same puzzle: what really made the difference?

In finance, we lean on attribution analysis or its close cousin, driver-based analysis. In our private lives, we tell ourselves stories like: “I succeeded because I worked hard,” or “I failed because of bad luck.”

Both are attempts to explain the past. Both risk mistaking explanation for cause.

And in both business and life, that confusion matters.

Attribution as storytelling

Attribution analysis is a staple in finance.

  • In asset management, it splits excess return into allocation effect and selection effect.

  • In corporate dashboards, it decomposes revenue changes into price, volume, costs, and currency.

Driver-based analysis works the same way. It tells managers what “drove” results.
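
To make the mechanics concrete, here is a minimal sketch of a single-period, Brinson-style split into allocation and selection effects. The sector weights and returns are entirely hypothetical:

```python
# Hypothetical single-period Brinson-style attribution.
# allocation effect = (portfolio weight - benchmark weight) * benchmark sector return
# selection effect  = benchmark weight * (portfolio sector return - benchmark sector return)

sectors = {
    #             w_port, w_bench, r_port, r_bench
    "Energy":    (0.30,   0.20,    0.05,   0.04),
    "Tech":      (0.50,   0.60,    0.12,   0.10),
    "Utilities": (0.20,   0.20,    0.02,   0.03),
}

allocation = sum((wp - wb) * rb for wp, wb, rp, rb in sectors.values())
selection  = sum(wb * (rp - rb) for wp, wb, rp, rb in sectors.values())
excess     = (sum(wp * rp for wp, _, rp, _ in sectors.values())
              - sum(wb * rb for _, wb, _, rb in sectors.values()))

print(f"allocation: {allocation:+.4f}")  # -0.0060
print(f"selection:  {selection:+.4f}")   # +0.0120
print(f"excess:     {excess:+.4f}")      # +0.0050 (allocation + selection + interaction)
```

The output is a tidy decomposition of last period's excess return, which is exactly the point: it describes what happened, not what would happen if the weights were changed next period.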

The strength of these approaches is clarity. They create tidy pie charts and narratives of performance.

But clarity is not causality. Attribution explains the past. It does not tell you what would happen if you acted differently tomorrow.

The counterfactual turn

That’s where causal inference enters.

Instead of decomposing outcomes, causal inference asks the counterfactual: what if things had been different?

  • What if interest rates had stayed flat?

  • What if the company hadn’t diversified into renewables?

  • What if the share of women in management increased from 10% to 30%?

These are not historical decompositions. They are interventions. Causal inference is forward-looking in a way attribution can never be.

Example: women in management

Consider the link between women in top management and share prices.

  • Attribution analysis might report that companies with more women had higher returns, and assign a percentage of “impact.”

  • Causal inference reframes it: if the same company increased women in management, would its share price actually change?

One provides a story. The other tests the cause.

Why attribution misleads

Attribution misleads because correlations masquerade as causes.

Companies with more women may also have higher profits, enjoy better governance, or operate in stable industries. Attribution doesn’t separate these factors.

Causal inference forces assumptions into the open. We use Directed Acyclic Graphs (DAGs) to map possible paths:

  • Women in management → oversight → higher share price.

  • Profitability → ability to hire inclusively → women in management.

This mapping clarifies what’s cause, what’s effect, and what’s confounding.
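
A minimal way to make those assumptions explicit in code, using networkx and hypothetical variable names (the profitability → share price arrow is an added assumption; it is what makes profitability a confounder):

```python
import networkx as nx

# Hypothetical DAG for the women-in-management example.
# Edges encode assumptions, nothing more.
dag = nx.DiGraph([
    ("women_in_management", "oversight"),
    ("oversight", "share_price"),
    ("profitability", "women_in_management"),  # ability to hire inclusively
    ("profitability", "share_price"),          # assumed: profits also move the price
])

assert nx.is_directed_acyclic_graph(dag)

# Causal path(s) from treatment to outcome:
print(list(nx.all_simple_paths(dag, "women_in_management", "share_price")))
# [['women_in_management', 'oversight', 'share_price']]

# The common cause that opens a backdoor path and must be adjusted for:
print(list(dag.predecessors("women_in_management")))
# ['profitability']
```

Drawing the graph first is what tells us that profitability should be adjusted for, while oversight, sitting on the causal path itself, should not.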

Regression vs. causal thinking

Many analysts rely on regressions, “controlling for” profit, size, or volatility. That feels causal.

But without a causal graph, it’s easy to control for the wrong thing. You might block a causal channel or leave a confounder unaddressed.
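
A toy simulation makes the first trap visible. Assume a made-up chain in which women in management improve oversight, and oversight lifts the share price; “controlling for” the mediator then erases the very effect we want to measure:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5_000

# Simulated chain X -> M -> Y with made-up coefficients.
x = rng.normal(size=n)              # treatment, e.g. share of women in management
m = 0.8 * x + rng.normal(size=n)    # mediator, e.g. quality of oversight
y = 0.5 * m + rng.normal(size=n)    # outcome, e.g. share price return

def ols(y, *regressors):
    """Ordinary least squares; returns coefficients with the intercept first."""
    X = np.column_stack([np.ones(len(y)), *regressors])
    return np.linalg.lstsq(X, y, rcond=None)[0]

print(ols(y, x)[1])      # ~0.40: the total causal effect (0.8 * 0.5)
print(ols(y, x, m)[1])   # ~0.00: adding the mediator blocks the causal channel
```

The second regression is not “more controlled”; it is quietly answering a different question.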

That’s why tools like DoWhy are powerful. They align estimation with assumptions, so we’re testing real causal questions instead of running blind regressions.
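
Here is a minimal sketch of that workflow with DoWhy, on simulated data with hypothetical column names and effect sizes (assuming dowhy is installed and accepts the DOT-format graph string):

```python
import numpy as np
import pandas as pd
from dowhy import CausalModel

# Simulated firm-level data; column names and coefficients are made up.
rng = np.random.default_rng(1)
n = 500
profitability = rng.normal(size=n)
women_share = 0.3 * profitability + rng.normal(size=n)
share_price_return = 0.2 * women_share + 0.5 * profitability + rng.normal(size=n)
df = pd.DataFrame({
    "women_share": women_share,
    "profitability": profitability,
    "share_price_return": share_price_return,
})

# The DAG is written down explicitly, so the estimate is tied to stated assumptions.
model = CausalModel(
    data=df,
    treatment="women_share",
    outcome="share_price_return",
    graph=("digraph { profitability -> women_share; "
           "profitability -> share_price_return; "
           "women_share -> share_price_return; }"),
)

estimand = model.identify_effect()           # is the effect identifiable from this graph?
estimate = model.estimate_effect(            # backdoor adjustment for profitability
    estimand, method_name="backdoor.linear_regression"
)
print(estimate.value)                        # should recover roughly 0.2

refutation = model.refute_estimate(          # placebo check: the effect should vanish
    estimand, estimate, method_name="placebo_treatment_refuter"
)
print(refutation)
```

The identify / estimate / refute sequence is the discipline: every step is checked against the graph we committed to, rather than against whatever controls happened to be at hand.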

Small data, big clarity

Counterintuitively, causal inference can thrive in small samples.

ESG and sustainability datasets are often limited — a few dozen firms, a decade of history. Not enough for machine learning, but enough for causal modeling.

Attribution shines for dashboards. Causal inference shines for decisions.

The mind’s stories

This distinction extends beyond finance.

Our minds practice attribution all the time. We explain outcomes to ourselves: “I succeeded because I’m smart,” or “I failed because the market was unfair.”

But those explanations are often just stories. They comfort us but rarely reveal cause.

Causal inference, like Vipassana practice, asks us to look more deeply. It resists easy stories. It examines what really causes what — and what is illusion.

That’s less comfortable, but more liberating.

Dashboards vs. decisions

So where does this leave us?

  • Attribution and driver analysis belong in dashboards, reports, and post-mortems. They help explain outcomes in accessible language.

  • Causal inference belongs in decision-making. It tells us whether changing a lever tomorrow will alter the outcome.

Confusing one for the other means confusing storytelling with strategy.

Closing thought

Explaining the past is not the same as predicting the future.

In markets and in our own minds, attribution gives us stories — tidy, seductive, backward-looking. Causal inference forces us to face the harder truth: only counterfactuals can guide us forward.

And that’s the difference between a dashboard and a compass.
