Black Swans, or why all predictions about the future are screwed
The Black Swan: The Impact of the Highly Improbable
What do 9/11, Google, the Great Recession, Covid-19, and the election of Donald Trump all have in common? All of these surprises and shocks were black swans: unforeseen events that changed history. Everybody knows that swans are white; you see white ones almost everywhere, but once in a while nature throws us a curve with a black one.
The black swan dilemma is a term that has come into popular use to try to make sense of the unpredictable and unknowable. After a tragic event such as 9/11, which no one imagined could ever happen, we try to rewrite history to find reasons why we might have missed the events leading up to it. In a world ruled by algorithms and probabilities, the existence of black swans drives people crazy. How can you plan for a future that is ultimately unknowable?
This book, by Nassim Nicholas Taleb, takes a deep dive into why black swans exist and how they should force us to adjust our risk profiles and plans for the future. The author distinguishes between positive black swans, like Google and Netflix, that have grown enormously in size and influence, and negative black swans, like natural disasters, wars, and pandemics, that come about randomly, disrupting entire economic systems. The goal, he states, is to avoid exposing yourself too much to the negative ones while trying to find the positive ones before everyone else does.
This is a long, wandering book, and I wish Taleb had organized it better and included summaries once in a while. But it's still worth wading through if only to gain a bit more humility about what to expect from the future. The author tears into economists, politicians, and statisticians who worship data as the one and only way to predict the future. Economists didn't predict the 2008 recession, pollsters didn't predict Donald Trump, and the medical community didn't predict the Covid-19 pandemic.
Taleb creates two models of our reality. One he calls Mediocristan, where everything varies in minor ways and things are fairly predictable. This includes such things as height, weight, or income for people in the service sector. The other is Extremistan, where things can get out of whack very quickly and extremes dominate. In this model, some professions and people can build huge scales of success and dominate their field. A small group of athletes, entertainers, businessmen, and internet pioneers can accumulate most of the money and fame through something called the Matthew effect, where the rich get richer. In the predictable Mediocristan, it's highly unlikely that anything extreme or exciting will ever happen. But in Extremistan, scalable wealth, monopolistic companies, and huge concentrations of information create fertile ground for black swans.
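The Mediocristan/Extremistan split maps loosely onto thin-tailed versus fat-tailed probability distributions. As a rough illustration (my own sketch, not Taleb's numbers: heights modeled as a normal distribution, wealth as a Pareto distribution with an assumed tail exponent of 1.1), here's how much of the total the single largest sample accounts for in each world:

```python
import random

random.seed(0)

# Mediocristan: heights drawn from a normal distribution.
# No single observation can dominate the total.
heights = [random.gauss(175, 7) for _ in range(100_000)]
tallest_share = max(heights) / sum(heights)

# Extremistan: wealth drawn from a fat-tailed Pareto distribution.
# A single observation can account for a big chunk of the total.
wealths = [random.paretovariate(1.1) for _ in range(100_000)]
richest_share = max(wealths) / sum(wealths)

print(f"Tallest person's share of total height: {tallest_share:.4%}")
print(f"Richest person's share of total wealth: {richest_share:.2%}")
```

In the height sample the tallest person holds a vanishing fraction of the total, while in the wealth sample one individual routinely holds a few percent of everything, which is why averages and bell-curve intuitions mislead you in Extremistan.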
One of my favorite analogies from the book is the turkey problem. For all of its life, a farmed turkey has it pretty easy: the farmer comes and feeds it, and then he leaves. From the turkey's viewpoint, the farmer is a pretty nice guy. But then one day, the farmer comes, chops off the turkey's head, and eats him. How was that predictable, if you're the turkey? This leads Taleb to the narrative fallacy, in which we create stories about how things are based on a string of events we observe. Many times these stories are inaccurate, and history is full of faulty, incomplete stories that turned out disastrously.
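The turkey's mistake is pure induction: every data point it has ever seen supports the wrong conclusion. A toy sketch of that reasoning (my own illustration, with a made-up 1,000-day history) shows why confidence built this way peaks at exactly the wrong moment:

```python
# The turkey's model of the world: 1 = a good day (fed), built from observation.
history = [1] * 1000  # 1,000 consecutive days of evidence that the farmer is friendly

# Inductive confidence: fraction of observed days that were good.
confidence = sum(history) / len(history)
print(f"Confidence after {len(history)} good days: {confidence:.0%}")

# Day 1,001 lies entirely outside the observed data: the axe falls.
# The dataset was 100% consistent, and 100% wrong about what mattered most.
day_1001 = "axe"
assert day_1001 not in history
```

The point isn't the arithmetic, it's that no amount of well-behaved past data can warn you about an event the data-generating process has never shown you.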
My favorite story from the book concerns casinos, which make it their business to play the odds and come out on top. If anybody could beat a black swan of bad luck, you would think it would be them. But one of the biggest casinos in Las Vegas lost its biggest sums not from gambling, but from a freak tiger attack that claimed its star attraction (Siegfried and Roy). Another casino had a contractor who attempted to dynamite the building over unpaid bills, another faced huge fines because one of its employees stashed IRS forms under her desk, and a third had to pay out after the owner's daughter was kidnapped. It reminds me of the movie Jurassic Park, where they think they have everything under control but don't count on one of their employees double-crossing them.
Taleb goes through a laundry list of cognitive biases (many covered earlier in this blog), each one of which helps to conceal future black swans.
- Confirmation bias leads us to a naïve belief that the available evidence is all that we need.
- Survivorship bias covers up tons of silent evidence that comes from the fact that losers don't write history.
- The ludic fallacy shows that you can't use sterile statistical models to predict messier real-world results.
- The Dunning-Kruger effect leads to epistemic arrogance: the belief that we know things when really we don't have nearly enough information. Accountants may be reliable experts on current number-crunching, but economists are unreliable in most of their assumptions and predictions.
Though the book came out in 2007, before the financial crisis, Taleb wrote an interesting follow-up on that very black swan, which he includes in later editions of the book.
Taleb points out the famous Philip Tetlock study that shows how wrong the experts were when asked to predict things. Tetlock's book Superforecasting is a great companion to this one, as is The Fifth Risk by Michael Lewis. Predicting the future may be harder now than it's ever been, and there are too many large assumptions we rely on that could go wrong. Black swans grow more likely as our world becomes more interdependent and interconnected, and we need to expect and prepare for them as best we can. The Covid-19 pandemic was definitely something no one expected in 2020, and it shows how vulnerable we are in many ways.
How do we prepare for things we can't anticipate? How can we know the unknowable? How do we avoid preparing for things that will never happen, while limiting our exposure to cataclysmic things that are more likely to happen? That's the trillion-dollar question.
We prepare ourselves with humility and curiosity, looking at all the evidence along the way, and not fooling ourselves with cognitive biases, distracting stories, and faulty assumptions.