Morgan Holland

San Francisco, CA

Superforecasting

By Philip Tetlock and Dan Gardner

Good Judgment Project (GJP) - IARPA-funded

  1. An Optimistic Skeptic
    1. Do your homework, look one year out max, unpack the question into components, Fermi-ize it (case studies; see the sketch after this chapter's notes), take the outside view first (anchoring), then the inside view, leave no assumption unscrutinized (first principles - Munger, Musk), be precise, update frequently
    2. Ignore false dichotomies - set goals, measure, learn, discover how to do better
    3. Numerate, news junkies, open-minded, curious, self-critical, focused, committed to self-improvement
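
A minimal sketch of the Fermi-izing step from 1.1: break one hard estimate into easier components and multiply them through. Every number below is an illustrative guess (the classic piano-tuner puzzle), not a figure from the book.

```python
# Fermi-izing: decompose one hard estimate into easier components, then multiply.
# All component values are rough, made-up guesses for illustration.

components = {
    "people in Chicago": 2_500_000,
    "households per person": 1 / 2.5,               # ~2.5 people per household
    "fraction of households with a piano": 1 / 20,
    "tunings per piano per year": 1,
    "tuners needed per tuning per year": 1 / 1000,  # one tuner does ~1,000 tunings/year
}

estimate = 1.0
for name, value in components.items():
    estimate *= value
    print(f"after '{name}': {estimate:,.1f}")

print(f"rough answer: about {estimate:.0f} piano tuners in Chicago")
```
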
  2. Illusions of Knowledge
    1. Always doubt
    2. Always invert
    3. Introspection and self-criticism; System 1 (unconscious, reflexive, tip-of-the-nose pattern recognition) vs. System 2 (conscious, interrogative thinking) - use both
    4. Kahneman - Thinking Fast and Slow
    5. Availability Heuristic - What You See Is All There Is (WYSIATI) - insensitive to the quality of evidence, like seeing a shadow in the grass and assuming it's a lion, because a false positive (fleeing a shadow) costs far less than a false negative (ignoring a real lion)
    6. Always ask 'what would convince me I am wrong?'
    7. Avoid Confirmation Bias (only seeing evidence that supports your initial opinion)
    8. Attribute Substitution - Bait & Switch - unconsciously replace the hard question with an easy one
    9. Intuition vs. Analysis - Blink vs. Think - another false dichotomy - blend them
  3. Keeping Score
    1. Be precise
    2. People avoid precision because of the Wrong-Side-of-Maybe fallacy (you say there's an 80% chance, the event doesn't happen, and people call you wrong) - this pushes forecasters toward elastic language and 50/50 hedging, which is worse
    3. Having a track record helps you calibrate
    4. Calibration: perfect calibration means events you call at X% actually happen X% of the time (see the calibration sketch at the end of these notes)
      1. Underconfident = forecast % lower than the actual frequency
      2. Overconfident = forecast % higher than the actual frequency
    5. When we think of forecasting accuracy we don't think of calibration, we think of resolution
    6. Resolution: decisiveness - pushing forecasts away from 50/50, toward 0% or 100%, and being right
      1. Mid-line (always saying 50/50 or 60/40 and getting it right) is well calibrated but ultra-cautious (low resolution)
      2. Low/high on the line (always saying 80/20 or 20/80 and getting it right) is well calibrated and decisive (high resolution)
    7. Must be a risk taker and be decisive - you get rewarded more for calling 90% than 70%, just as in betting, where odds = p/(1-p): 80% = 4:1, 90% = 9:1
    8. Brier Score - 0 is perfect, 0.5 is what you get for saying 50/50 on everything, 2.0 is the worst (saying 100% odds of things that never happen); see the sketch at the end of these notes
      1. Glenn W. Brier
      2. Have to benchmark the score against how difficult the question is - often just saying 'no change' will produce a score close to 0 (e.g. predicting Phoenix's weather, vs. weather prediction elsewhere, which is much harder)
    9. Foxes know many things - eclectic experts, comfortable with uncertainty - they synthesize and aggregate
    10. Hedgehogs know One Big Thing - a single Big Idea that governs everything, like the green-tinted glasses in Oz; incorrect first principles become fatal
    11. Foxes have better calibration and resolution
    12. Hedgehog accuracy is actually worse in their field of expertise!
    13. Fox/hedgehog is a spectrum, no false dichotomies
    14. Dragonfly Eye - Wisdom of Crowds - all valid information points in the same direction, while errors point in different directions and cancel each other out (see the aggregation sketch at the end of these notes)
    15. Richard Thaler - problem: contestants each pick a number between 0 and 100; guess 2/3 of the average pick (see the sketch at the end of these notes)
      1. Assume average = 50, so guess 2/3 * 50 = 33
      2. Guess 2/3 * 33 = 22
      3. Guess 2/3 * 22 = ~15, and so on toward 0; in the actual contest the average was 18.91, so the winning answer was ~13
    16. Aggregate, and be precise
    17. Empathy - consider putting yourself in your opponent's shoes (like in poker)
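
Calibration (3.4) can be checked numerically: bucket your forecasts, then compare the average stated probability in each bucket with how often the event actually happened. A minimal sketch; the forecasts and outcomes are made up for illustration.

```python
from collections import defaultdict

# Made-up forecasts (stated P(event)) and outcomes (1 = event happened, 0 = it didn't).
forecasts = [0.9, 0.8, 0.8, 0.6, 0.6, 0.3, 0.2, 0.1, 0.9, 0.7]
outcomes  = [1,   1,   0,   1,   0,   0,   0,   0,   1,   1]

# Group forecasts into 10% buckets and compare average forecast with hit rate:
# equal -> well calibrated; forecast above hit rate -> overconfident; below -> underconfident.
buckets = defaultdict(list)
for p, y in zip(forecasts, outcomes):
    buckets[round(p, 1)].append((p, y))

for b in sorted(buckets):
    pairs = buckets[b]
    avg_forecast = sum(p for p, _ in pairs) / len(pairs)
    hit_rate = sum(y for _, y in pairs) / len(pairs)
    print(f"bucket {b:.1f}: forecast {avg_forecast:.2f}, actual {hit_rate:.2f}, "
          f"gap {avg_forecast - hit_rate:+.2f}")
```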
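
The Brier score (3.8), in the two-outcome form the book uses, is the squared error over both outcomes, averaged across questions. A small sketch reproducing the 0 / 0.5 / 2.0 anchor points from the notes:

```python
def brier(forecasts: list[float], outcomes: list[int]) -> float:
    """Two-outcome Brier score: 0 = perfect, 0.5 = always 50/50, 2.0 = maximally wrong.

    forecasts: stated P(event) per question; outcomes: 1 if the event happened, else 0.
    """
    total = 0.0
    for p, y in zip(forecasts, outcomes):
        total += (p - y) ** 2 + ((1 - p) - (1 - y)) ** 2
    return total / len(forecasts)

print(brier([1.0, 1.0], [1, 1]))  # 0.0 - perfect
print(brier([0.5, 0.5], [1, 0]))  # 0.5 - hedging everything at 50/50
print(brier([1.0, 1.0], [0, 0]))  # 2.0 - saying 100% for things that never happen
```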
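
The dragonfly-eye / wisdom-of-crowds point (3.14) is easy to simulate: give every forecaster the true value plus independent noise, and the crowd average lands much closer to the truth than a typical individual. The true value and noise level below are arbitrary assumptions.

```python
import random

random.seed(42)
TRUE_VALUE = 0.70   # the probability the crowd is trying to estimate (arbitrary)
NOISE = 0.15        # per-forecaster error, also arbitrary

# Each forecaster = truth + independent noise, clamped to [0, 1].
crowd = [min(1.0, max(0.0, random.gauss(TRUE_VALUE, NOISE))) for _ in range(1000)]

avg_individual_error = sum(abs(f - TRUE_VALUE) for f in crowd) / len(crowd)
crowd_error = abs(sum(crowd) / len(crowd) - TRUE_VALUE)

print(f"average individual error: {avg_individual_error:.3f}")
print(f"error of the crowd average: {crowd_error:.3f}")  # much smaller: errors cancel
```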
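
The 2/3-of-the-average game (3.15) rewards guessing how many levels of reasoning the other players will actually do. The loop below just iterates the arithmetic from the notes; the 18.91 figure is the contest average cited above.

```python
# Each level of reasoning multiplies the previous guess by 2/3; fully iterated
# reasoning converges to 0, but the winning answer depends on how far real
# players actually iterate.
guess = 50.0  # level 0: assume everyone else picks at random, averaging ~50
for level in range(1, 6):
    guess *= 2 / 3
    print(f"level {level}: guess {guess:.1f}")  # 33.3, 22.2, 14.8, 9.9, 6.6

actual_average = 18.91  # average guess in Thaler's contest (from the notes)
print(f"2/3 of the actual average: {actual_average * 2 / 3:.1f}")  # ~12.6, so ~13 won
```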