In the English-speaking world, Nassim Nicholas Taleb is currently causing a stir with his idea that chaos cannot be mastered, not even statistically. Instead, Taleb argues, it is better to structure the chaos in such a way that the occasionally occurring, unpredictable events he calls Black Swan events (events such as the financial crisis of 2008 and the years that followed) can no longer do the kind of damage they have done.
His argument rests on the interdependencies of complex systems and on the uncontrollability, and simultaneous productivity, that comes with them, a combination Hayek already described with his concept of catallaxy:
“Complex systems are full of interdependencies – hard to detect – and nonlinear responses. ‘Nonlinear’ means that when you double the dose of, say, a medication, or when you double the number of employees in a factory, you don’t get twice the initial effect, but rather a lot more or a lot less. Two weekends in Philadelphia are not twice as pleasant as a single one – I’ve tried. When the response is plotted on a graph, it does not show as a straight line (‘linear’), rather as a curve. In such an environment, simple causal associations are misplaced; it is hard to see how things work by looking at single parts.
Man-made complex systems tend to develop cascades and runaway chains of reactions that decrease, even eliminate, predictability and cause outsized events. So the modern world may be increasing in technological knowledge, but, paradoxically, it is making things a lot more unpredictable. Now for reasons that have to do with the increase of the artificial, the move away from ancestral and natural models, and the loss of robustness owing to complications in the design of everything, the role of Black Swan events is increasing. Further, we are victims of a new disease, called in this book neomania, that makes us build Black Swan-vulnerable systems – ‘progress’.
An annoying aspect of the Black Swan problem – in fact the central, and largely missed, point – is that the odds of rare events are simply not computable. We know a lot less about hundred-year floods than five-year floods – model error swells when it comes to small probabilities. The rarer the event, the less tractable, and the less we know about how frequent its occurrence – yet the rarer the event, the more confident these ‘scientists’ involved in predicting, modeling, and using PowerPoint in conferences with equations in multicolor background have become.
The antifragile gains from prediction errors, in the long run. If you follow this idea to its conclusion, the many things that gain from randomness should be dominating the world today – and things that are hurt by it should be gone. Well, this turns out to be the case. We have the illusion that the world functions thanks to programmed design, university research, and bureaucratic funding, but there is compelling – very compelling – evidence to show that this is an illusion, the illusion I call lecturing birds how to fly. Technology is the result of antifragility, exploited by risk-takers in the form of tinkering and trial and error, with nerd-driven design confined to the backstage. Engineers and tinkerers develop things while history books are written by academics; we will have to refine historical interpretations of growth, innovation, and many such things” (Taleb, Antifragile, pp. 6-8).
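Taleb's closing point, that some things gain from randomness, can be read as an instance of Jensen's inequality: for a convex payoff, the average outcome over random shocks exceeds the outcome of the average shock, so variability itself adds value. A minimal Python sketch, assuming a quadratic payoff and Gaussian shocks purely for illustration (this is not Taleb's code or data):

```python
import random

def expected_payoff(payoff, shocks):
    """Average the payoff function over a list of random shocks."""
    return sum(payoff(s) for s in shocks) / len(shocks)

random.seed(0)
shocks = [random.gauss(0.0, 1.0) for _ in range(100_000)]

convex = lambda x: x * x   # convex payoff: benefits from variability
mean_shock = sum(shocks) / len(shocks)

# Jensen's inequality: E[f(X)] >= f(E[X]) for convex f.
# The average payoff (~1.0, the variance of the shocks) beats the
# payoff of the average shock (~0.0): the randomness itself pays.
print(f"average payoff:          {expected_payoff(convex, shocks):.3f}")
print(f"payoff of average shock: {convex(mean_shock):.3f}")
```

A concave payoff (e.g. `lambda x: -x * x`) shows the mirror image, losing from the same variability, which is the fragile side of Taleb's distinction.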