" In his seminal book Antifragile, Nassim Nicholas Taleb shows how the linear model is wrong (or, at best, misleading) in everything from cybernetics, to derivatives, to medicine, to the jet engine. In each case history reveals that these innovations emerged as a consequence of a similar process utilized by the biologists at Unilever, and became encoded in heuristics (rules of thumb) and practical know-how. The problems were often too complex to solve theoretically, or via a blueprint, or in the seminar room. They were solved by failing, learning, and failing again. "
― Matthew Syed, Black Box Thinking: Why Some People Never Learn from Their Mistakes - But Some Do
" Much of the literature on creativity focuses on how to trigger these moments of innovative synthesis; how to drive the problem phase toward its resolution. And it turns out that epiphanies often happen when we are in one of two types of environment. The first is when we are switching off: having a shower, going for a walk, sipping a cold beer, daydreaming. When we are too focused, when we are thinking too literally, we can’t spot the obscure associations that are so important to creativity. We have to take a step back for the “associative state” to emerge. As the poet Julia Cameron put it: “I learned to get out of the way and let that creative force work through me.” The other type of environment where creative moments often happen, as we have seen, is when we are being sparked by the dissent of others. When Kevin Dunbar, a psychologist at McGill University, went to look at how scientific breakthroughs actually happen, for example (he took cameras into four molecular biology labs and recorded pretty much everything that took place), he assumed that it would involve scientists beavering away in isolated contemplation. In fact, the breakthroughs happened at lab meetings, where groups of researchers would gather around a desk to talk through their work. Why here? Because they were forced to respond to challenges and critiques from their fellow researchers. They were jarred into seeing new associations. "
― Matthew Syed, Black Box Thinking: Why Some People Never Learn from Their Mistakes - But Some Do
" In 2013 a study published in the Journal of Patient Safety put the number of premature deaths associated with preventable harm at more than 400,000 per year. (Categories of avoidable harm include misdiagnosis, dispensing the wrong drugs, injuring the patient during surgery, operating on the wrong part of the body, improper transfusions, falls, burns, pressure ulcers, and postoperative complications.) Testifying to a Senate hearing in the summer of 2014, Peter J. Pronovost, MD, professor at the Johns Hopkins University School of Medicine and one of the most respected clinicians in the world, pointed out that this is the equivalent of two jumbo jets falling out of the sky every twenty-four hours. “What these numbers say is that every day, a 747, two of them are crashing. Every two months, 9/11 is occurring,” he said. “We would not tolerate that degree of preventable harm in any other forum.” These figures place preventable medical error in hospitals as the third biggest killer in the United States—behind only heart disease and cancer. "
― Matthew Syed, Black Box Thinking: Why Some People Never Learn from Their Mistakes - But Some Do