I really enjoyed this book. I had Antifragile recommended to me many times, but wanted to read Taleb’s books in order. (I don’t recommend this; just skip to Antifragile and read Fooled by Randomness and The Black Swan later if you’re still interested.) Still, Taleb presents many interesting ideas about how to examine, and particularly how not to examine, events involving randomness.
Those things that came with the help of luck can be taken away by luck. Things that come with little help of luck are more resistant to randomness.
It doesn’t matter how often something succeeds if the cost of failure is too great to bear.
Randomness can be a catalyst for a feedback cycle. Monkey anecdote: it has been shown that monkeys injected with serotonin rise in the pecking order, which in turn raises their serotonin, until the virtuous cycle breaks and a vicious one begins: failure causes a slide in the pecking order, which produces behavior that brings about further slides.
Instead of examining current success alone, it is more pertinent to ask whether you would be rich on average across the alternative lives you could have led.
Alternative histories: you must judge a performance not only by its results but by the costs of the alternatives (had history played out another way). E.g. if enough players play Russian roulette, some will survive and look “successful”.
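The Russian roulette point is easy to make concrete with a toy Monte Carlo simulation. This is a sketch of the idea, not anything from the book: the payoff per round, number of rounds, and player count are all made-up illustrative numbers.

```python
import random

random.seed(42)

def russian_roulette_career(rounds=10, chambers=6):
    """Play repeated rounds of Russian roulette for money.
    Returns total winnings if the player survives every round, else None."""
    winnings = 0
    for _ in range(rounds):
        if random.randrange(chambers) == 0:  # the bullet
            return None
        winnings += 1_000_000  # illustrative payoff per survived round
    return winnings

players = [russian_roulette_career() for _ in range(10_000)]
survivors = [w for w in players if w is not None]

# Roughly (5/6)**10 ≈ 16% of players survive all ten rounds and, judged on
# results alone, look like ten-time millionaires rather than lucky gamblers.
print(f"{len(survivors)} of {len(players)} players look 'successful'")
```

Judging only the realized history, the survivors look brilliant; judging across the alternative histories, every player took the same terrible bet.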
People are heroes not because they won or lost, but because of heroic behavior.
People are bad at considering abstract risk. They are much better at focusing on vivid risk.
Rational thinking often seems to rationalize one’s actions by fitting some logic to them, instead of the reverse.
A mistake is not something to be determined after the fact, but in light of the information available up to that point.
Valuable things tend to stay around for a long time. If it’s been around for a while, it’s likely to stay. The opportunity cost of missing a new thing is minuscule compared to the garbage one has to wade through to get the few winners.
Decision making under uncertainty is better performed by minimal exposure to the media because it’s hard to separate out the few useful items from the noise.
Paying attention to the news has minimal returns on the time investment. People often think that it will surely be the next batch of news that will really make a difference in their understanding of things.
By observing less frequently, we improve the signal-to-noise ratio, since much of the noise washes out in the meantime. This applies to news, portfolio returns, etc.
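The portfolio case can be worked out directly. Taleb’s example uses a strategy with a 15% expected annual return and 10% annual volatility; under the idealized assumption of i.i.d. normal returns (a standard textbook model, not an endorsement of it), the probability that any single observation is positive depends on the observation window:

```python
from math import erf, sqrt

def p_positive(mu=0.15, sigma=0.10, dt=1.0):
    """Probability that one observation over a window of dt years is positive,
    assuming returns are i.i.d. normal with annual mean mu and volatility sigma.
    Over dt years: mean = mu*dt, std dev = sigma*sqrt(dt)."""
    z = mu * dt / (sigma * sqrt(dt))
    return 0.5 * (1 + erf(z / sqrt(2)))  # standard normal CDF at z

for label, dt in [("yearly", 1.0), ("monthly", 1 / 12),
                  ("daily", 1 / 252), ("hourly", 1 / (252 * 8))]:
    print(f"{label:8s} P(positive observation) = {p_positive(dt=dt):.3f}")
```

Checked yearly, about 93% of observations are gains; checked daily, barely 54% are. The strategy hasn’t changed, only the ratio of signal to noise in each glance at it.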
The longer something goes without exposure to a rare event, the more vulnerable it will be to the event.
Data can only be used to disprove a proposition, never to prove one.
You usually have to discover things for yourself. You are rarely affected in behavior (in any durable manner) by things you read. The impression tends to wane over time (as you read newer things and the impressions are replaced by fresh ones).
The problem of induction: moving from plenty of particulars to the general.
Social treadmill effect: you get rich, move to a rich neighborhood, and feel poor again next to your new neighbors. You get used to wealth and revert to a set point of satisfaction.
Losers often do not show up in analyses where they must be considered (survivorship bias).
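Survivorship bias is another one that a toy simulation makes vivid. Here is a sketch under made-up assumptions: funds with zero skill flip between a +20% and a −15% year, and a fund “dies” (drops out of the reported database) after any losing year.

```python
import random

random.seed(0)

def fund_path(years=5):
    """Simulate one zero-skill fund: each year returns +20% or -15%
    with equal odds. A fund 'survives' only if it never has a down year."""
    returns = [random.choice([0.20, -0.15]) for _ in range(years)]
    value = 1.0
    for r in returns:
        value *= 1 + r
    survived = all(r > 0 for r in returns)  # losers vanish from the database
    return value, survived

finals = [fund_path() for _ in range(10_000)]
avg_all = sum(v for v, _ in finals) / len(finals)
survivors = [v for v, s in finals if s]
avg_survivors = sum(survivors) / len(survivors)

# The full population barely beats break-even, but the survivors-only
# average looks like consistent outperformance.
print(f"average final value, all funds:      {avg_all:.2f}")
print(f"average final value, survivors only: {avg_survivors:.2f}")
```

No fund here has any edge, yet an analysis built only on the surviving funds would conclude the managers are skilled: the losers simply aren’t in the sample to be counted.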
People don’t accept randomness as the cause of their success, only their failure.