Think Like a Freak: The Authors of Freakonomics Offer to Retrain Your Brain, by Stephen Dubner and Steven Levitt

After two ‘Freak’ books full of captivating stories with often counter-intuitive conclusions illustrating an unconventional use of economics, this third book in the series contains many similarly fascinating stories, but is geared more toward teaching us how to think more rationally and effectively. The stories—from Nathan’s hot dog eating contest, to a scientist ingesting bacteria to test the cause of ulcers, to why penalty kicks in soccer are rarely aimed at the center of the goal—are again memorable and didactic. But the tips on how to think more clearly are the true gems and takeaways.


  • If an important problem still exists, it is probably hard. Easy problems get solved; hard, complex problems linger despite many attempts to solve them. Complex problems are multi-dimensional, intractable, may involve misaligned incentives, and lack clear cause and effect.
  • In soccer, a penalty kick to the center is significantly more likely to score than a kick to the corner. Nevertheless, kickers seldom kick to the center, because private incentives (fear of shame if the goalkeeper simply stays put) greatly outweigh public incentives (scoring for the team). Though few would admit it, people tend to favor private benefit over the public good.
  • The “economic approach”—relying on data, not hunches or ideology, to solve problems.
  • The health care problem: when people don’t pay the true economic cost of care, they tend to overuse it or consume it inefficiently. Overuse crowds out the truly sick and raises costs for the whole system.
  • It takes a lot of courage for people to admit: “I don’t know”. However, true learning cannot begin until you acknowledge you do not know.
    • Even historical cause and effect is often difficult to understand. Predictions—from economists, weathermen, stock market pundits, political analysts, and other gurus—are on average worthless.
    • Tetlock’s study of roughly 300 political experts across 20 years found their predictions no better than chance, and worse than a simple computer extrapolation algorithm. Dogmatism, or overconfidence, was the common attribute of the worst predictors.
    • Predictors are almost never held accountable for their failed predictions. The reputational cost of a failed prediction is lower than that of admitting ignorance.
    • Most people don’t know much about the world; studies show they don’t know themselves very well either—about 80% of people think they are above-average drivers.
    • Experts in one domain are no more likely to be knowledgeable about a different domain.
  • The key to learning is feedback. Experimentation (randomized controlled trials are best) and feedback give us a chance to cut through the complexity and discover cause-and-effect relationships. Resistance to experimentation may be due to culture, lack of experience, excessive complexity, or the belief that one already knows the answer.
    • Wine experiments have shown that, on average, people enjoy expensive wines slightly less than cheaper ones. Even wine experts often could not distinguish cheap wines from expensive ones.
  • Ask the “right” question: properly frame or re-define the problem at its core.
    • Ex: the American education problem is not necessarily a school problem, but a teacher-quality problem (U.S. teachers are not recruited from among the brightest) and a parent-quality problem (the home environment has a disproportionate influence on childhood education).
    • Ex: hot dog eating: focus not on how to eat more hot dogs, but on how to make each hot dog easier to eat.
    • Ex: ulcers are caused by bacteria that survive in the stomach, not by stress.
  • Think like a kid: think small and simply; don’t automatically dismiss the obvious solutions; avoid preconceptions; tackle problems you enjoy.
  • Incentives drive the world. But they are not always obvious.
    • People don’t always disclose, and may not even know, their own incentives (declared preferences often differ from revealed preferences). Heed not what people say, or what is “right”, but what they actually do—behavior signals what they really care about. Incentivize them on dimensions valuable to them but cheap for you to provide.
    • Incentives may backfire: people may change their behavior, fail to behave as expected, or game the system.
    • Relationship frameworks govern behavior: financial; competitive; collaborative; loved ones; authority figure. Problems may be solved by shifting frames. Treating people with decency and respect is powerful—it shifts them into the collaborative frame.
    • “Separating equilibrium”: a game theory-based incentive that helps distinguish good actors from bad ones. Ex: King Solomon offering to split the baby; Van Halen’s brown M&M’s clause; Zappos’s $2,000 quit offer.
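The quit-offer example can be sketched as a simple screening mechanism: an employee takes the cash only if it exceeds their perceived value of keeping the job, so the offer separates uncommitted workers from committed ones. The names and dollar figures below are hypothetical, chosen only to illustrate the idea.

```python
# Minimal sketch of a separating equilibrium, in the spirit of the
# Zappos quit offer. All employees and valuations are hypothetical.

def takes_offer(value_of_staying, payout=2000):
    """An employee quits iff the cash now beats their perceived value of staying."""
    return payout > value_of_staying

# Perceived long-term value of keeping the job, per employee.
employees = {
    "committed_alice": 50_000,
    "committed_bob": 30_000,
    "uncommitted_carol": 1_500,
    "uncommitted_dan": 500,
}

quitters = [name for name, v in employees.items() if takes_offer(v)]
stayers = [name for name, v in employees.items() if not takes_offer(v)]
```

Because the payout is worth more to someone who dislikes the job than to someone who values it, each type self-selects a different action, which is exactly what makes the incentive "separating".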
  • How to persuade: your reasons need to resonate with your opponent, not with yourself; acknowledge both the strengths of your opponent’s argument and the major weaknesses of your own; don’t insult them—try to make them an ally; use effective storytelling.
  • Quit strategically: ignore sunk costs; consider opportunity costs; experimentation and failure provide valuable feedback.
    • “Failing” well: fail quickly and cheaply; conduct a “pre-mortem” to think through how something could fail.
    • Facing a difficult decision: flip a coin—how you feel about the outcome reveals what you actually want.
  • The “Think Like a Freak” method: think rationally; admit to not knowing; redefine or reframe the problem at its core; experiment; ignore artificial limits; attack the root cause, not the symptoms (even legacy or repugnant issues); eliminate biases; think like a kid; master the power of incentives and frames; quit judiciously.

Finished: 21-Aug-2017. Rating: 8/10.