Among other things, Nassim Nicholas Taleb is a flâneur, mathematician, probabilist, systems thinker extraordinaire, philosopher, elitist, and most certainly a curmudgeon. I ran into him years ago while doing research in financial engineering. He had just quit a successful career as a Wall Street trader and taken a job as a college professor, exploring and disseminating (mostly his own) ideas about “decision making under opacity.” The more I read him, the more I wanted to be like him when I grew up – so I became his student.
Readers may have first heard of Taleb through the now well-worn notion of a ‘black swan’ event. Besides numerous technical papers on things like low-probability events and infrequent happenings, he has written three books for the intelligent reader – the man is also extremely well read, well traveled, speaks several languages, and neither readily suffers fools nor writes for the thinking impaired. Fooled by Randomness (2001) was his initial foray, in which he introduced the black swan to a wider audience. Because the book also cast real-world risk taking in a new and revealing light, it was immediately picked up by the investment and banking communities, and we began seeing ‘black swan’ in the popular press.
His next book – unsurprisingly titled The Black Swan (2007) – expanded his treatment of risk into public policy making and embraced readers familiar with behavioral economics. Somewhere in there he caught the attention of Nobelist Daniel Kahneman, who, along with the late Amos Tversky, founded behavioral economics and documented the foibles and fables of human decision making in his recent Thinking, Fast and Slow (2011), a must-read in its own right.
And after letting a few more years pass and his popularity (notoriety?) grow, Taleb gathered his near and far thinking into one volume that he offers as his summa, Antifragile – Things That Gain from Disorder (2012). The book runs over 570 pages, with diagrams and squiggle-laden technical appendixes. In extending his ideas to what he calls antifragility, Taleb takes the reader on a wonderful journey through Western philosophy, going back past the Greeks. Heavy emphasis falls on Mediterranean civilizations, since Taleb is an immigrant from Christian Lebanon and all things Levantine (including things Arabic) are firmly embedded in his double helix. But the reader also learns about the fine foods and wines available in hideaways all over Europe while being regaled with the inner workings of complex systems intelligent enough to adapt and even thrive in hostile environments – Man happens to be one of them.
Taleb teaches that antifragility is a property of a system (living, hardware, social, …) that allows it to take good-sized hits from its environment and continue functioning – sort of like a Timex watch: ‘takes a licking and keeps on ticking.’ In fact, adaptive systems become more antifragile the more they are exposed to survivable stresses and strains. And it works the other way around as well: the more adaptive systems are sheltered or coddled, the more fragile they become, unable to withstand even relatively mild shocks without breaking.
Examples of such systems in our civilized environment illustrate the point. Readers may recognize versions of antifragility that have been practiced and taught everywhere from athletics to AIs that can now learn from their environment and experience. Those who have studied the system sciences will respond with an ‘of course, what other kind of response would you expect?’ But Taleb ties it all together and makes a strong case for indicting the politically correct, risk-hysterical policies that have molded modern societies. He labels ‘fragilistas’ those who specialize in fashioning processes and policies shielded from the survivable encounters that would make them stronger.
Also intertwined in the dissertation are examples of iatrogenics (another new word for me), which is the “harm done by the healer, as when the doctor’s interventions do more harm than good.” All of us are familiar with the kinds of iatrogenics profusely proliferated by governments and their minions.
Fascinating also were the examples of hormesis that humans have purposely practiced to increase their antifragility. Hormesis is the term for “a bit of harmful substance, or stressor, in the right dose and with the right intensity that stimulates the organism and makes it better, stronger, healthier, and prepared for a stronger dose at the next exposure.” And, you guessed it, Taleb recommends inserting a touch of hormesis into your daily round to prepare you for what can really take you down.
Taking a step back, Taleb introduces his reader early on to the “Three Types of Exposure” that form the “Central Triad” framing his ideas on antifragility: Fragile, Robust, and Antifragile. Look at them as the axes of the coordinate system into which Taleb invites us to put things once we adopt his new perspective on the world.
As you may have guessed, fragile things (systems) need “tranquility” – no random bumps or jerks, everything encountered in the recognized, stable, and anticipated order for which the system was designed. Centralized governments are examples of fragile systems. In opposition, antifragile beings and systems not only tolerate a good bit of disorder but actually thrive on it and, therefore, may even seek it out (remember hormesis?). Distributed systems, those with decentralized control, are nature’s champion antifragilistas. In governance, confederations of semi-independent city-states and provinces (think Swiss cantons) have demonstrated historical resilience. And in systems engineering we do our best to design such mechanical, computational, and procedural organisms. But as you may have noticed, the notion has been slow on the uptake within large bureaucracies.
The remaining dimension is robustness, which lies between fragility and antifragility. It is a property of systems designed to take punishment from a narrower, more expected domain of stresses. Its makeup is such that even if you change some parts of the process or structure, it will continue reacting in the same manner as before (i.e., as designed). A robust system tolerates a certain level of change in its construction, or even in the performance of its components, while seeking to retain its input/output relationships (its transfer function). Finally, robustness does not increase with hormetic applications.
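For the systems-minded reader, the triad can even be caricatured in a few lines of code. This sketch is mine, not Taleb's – the classes, the starting capacity of 10, and the 0.1 hormesis factor are all invented for illustration: a fragile system erodes under any stress, a robust one shrugs off stress within its design limit but never changes, and an antifragile one gains capacity from each survivable dose.

```python
import random

class System:
    """Toy model of Taleb's triad: how capacity responds to a stressor.
    All numbers here are illustrative inventions, not from the book."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.broken = False
    def shock(self, stress):
        raise NotImplementedError

class Fragile(System):
    def shock(self, stress):
        # Needs "tranquility": every stress erodes capacity.
        self.capacity -= stress
        if self.capacity <= 0:
            self.broken = True

class Robust(System):
    def shock(self, stress):
        # Absorbs stress up to its design limit; otherwise unchanged.
        if stress > self.capacity:
            self.broken = True

class Antifragile(System):
    def shock(self, stress):
        if stress > self.capacity:
            self.broken = True
        else:
            # Hormesis: a survivable dose raises capacity for next time.
            self.capacity += 0.1 * stress

random.seed(1)
systems = {"fragile": Fragile(10.0), "robust": Robust(10.0),
           "antifragile": Antifragile(10.0)}
for _ in range(20):
    stress = random.uniform(0, 5)   # survivable shocks only
    for s in systems.values():
        if not s.broken:
            s.shock(stress)

for name, s in systems.items():
    print(name, round(s.capacity, 1), "broken" if s.broken else "ok")
```

Run it and the fragile system breaks under the accumulated shocks, the robust one ends exactly where it started, and the antifragile one ends with more capacity than it began with – which is the whole point of the triad.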
Anyway, I hope you get a little flavor of the very large idea that Taleb introduces, details, explores, and proselytizes in Antifragile. The abundant intellectual candy he throws at the reader comes under fascinating labels, a few of which I list below:
Conflation of Event and Exposure,
Green Lumber Fallacy,
The Robert Rubin violation, …
Read the book; it's a fun way to get smarter.