INVESTIGATE TOOLS

You don’t need to use all of these.

You just need to know they exist.

Most bad conclusions aren’t evil.

They’re rushed.

This section is how you slow them down.


1. Slow It Down

If something feels explosive, urgent, outrageous — pause.

Strong reaction + no real deadline = structural risk. The urgency is coming from you, not the situation.

Ask:

What exactly was said?
What exactly happened?
What interpretation is layered on top?

There is a reason courts separate testimony from argument.

Slowing down is not weakness.

It’s control.

📎 Related thinking:
Daniel Kahneman – System 1 / System 2 processing
(Thinking, Fast and Slow)


2. Go Upstream

Where did this come from?

Original speech?
Original report?
Edited clip?
Commentary on commentary?

Most distortion happens between source and summary.

If you can’t reach the primary material, at least identify how many layers sit between you and it.

📎 Related principle:
Primary vs Secondary Sources (basic research literacy standard)


3. Expand the Slogan

If the claim fits on a T-shirt, it’s probably compressed.

Try expanding it.

What time frame?
What variables?
What trade-offs?
What uncertainty?

If someone can’t expand their position beyond a chant, the chant is doing emotional work.

📎 Related concept:
Complexity Reduction & Cognitive Load (John Sweller)


4. Follow Incentives

Before assuming conspiracy, map incentives.

Who benefits if this spreads?
Who benefits if it fails?
Who benefits if nothing changes?

Financial incentive is obvious.

Status, power, identity and attention are less obvious — but just as strong.

📎 Related thinking:
Incentive Structures in Behavioural Economics
(Steven Levitt / Chicago School tradition)


5. Separate Event from Story

Is this event strong on its own?

Or is it being pulled into an existing narrative?

Ask:

If the actors were reversed, would I interpret it the same way?

If not, narrative gravity is working.

📎 Related idea:
Framing Effects (Tversky & Kahneman)


6. Reset the Language

Replace loaded words with neutral ones.

Instead of:
“They attacked democracy”

Try:
“What specific action occurred?”

Instead of:
“This is corruption”

Try:
“What rule was broken?”

Clarity often returns instantly when language is neutralised.

📎 Related field:
Linguistic Framing (George Lakoff – though read critically)


7. Calibrate Confidence

Not:

“Do I agree?”

Instead:

“How certain am I?”

Confidence should rise and fall with evidence.

Not volume.
Not repetition.
Not tribe size.

📎 Related discipline:
Bayesian Updating
Epistemic Humility (Philosophy of Science)
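
The Bayesian updating mentioned above has a precise form. As a minimal sketch (the probabilities below are illustrative, not from any real case), here is how a belief moves up with supporting evidence and back down with contradicting evidence:

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Return posterior P(H | E) via Bayes' rule.

    prior           -- current confidence that the claim is true, P(H)
    p_e_given_h     -- how likely this evidence is if the claim IS true
    p_e_given_not_h -- how likely this evidence is if the claim is NOT true
    """
    numerator = prior * p_e_given_h
    evidence = numerator + (1 - prior) * p_e_given_not_h
    return numerator / evidence

# Start undecided (50/50). A report appears that is twice as likely
# to exist if the claim is true than if it is false.
p = bayes_update(0.5, 0.8, 0.4)   # confidence rises to about 0.67

# Then a contradicting detail appears: twice as likely if the claim is false.
p = bayes_update(p, 0.3, 0.6)     # confidence falls back to 0.5
```

Note what does not appear in the function: how loudly the evidence was stated, how often it was repeated, or how many people shared it.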


8. Delay Public Reaction

Ask one question before responding:

“Have I checked this?”

Even 60 seconds improves reasoning stability.

Speed is often rewarded.

Accuracy is often not.

Choose accuracy.

📎 Related research:
Digital Misinformation Spread & Reaction Speed
(MIT Media Lab – Vosoughi et al., 2018)


Important

These tools are not weapons.

If you use them to humiliate someone, you are not using Tradecraft.

You are performing it.

Use them on yourself first.

If something survives these checks, confidence increases.

If it fails, uncertainty increases.

Both are progress.
