" These failures are inevitable because the world is complex and we will never fully understand its subtleties. The model, as social scientists often remind us, is not the system. Failure is thus a signpost. It reveals a feature of our world we hadn’t grasped fully and offers vital clues about how to update our models, strategies, and behaviors. From this perspective, the question often asked in the aftermath of an adverse event, namely, “Can we afford the time to investigate failure?,” seems the wrong way around. The real question is, “Can we afford not to? "
― Matthew Syed , Black Box Thinking: Why Some People Never Learn from Their Mistakes - But Some Do
" A pre-mortem typically starts with the leader asking everyone in the team to imagine that the project has gone horribly wrong and to write down the reasons why on a piece of paper. He or she then asks everyone to read a single reason from the list, starting with the project manager, before going around the table again. Klein cites examples where issues have surfaced that would otherwise have remained buried. ‘In a session held at one Fortune 50-size company, an executive suggested that a billion-dollar environmental sustainability project had “failed” because interest waned when the CEO retired,’ he writes. ‘Another pinned the failure on a dilution of the business case after a government agency revised its policies.’15 The purpose of the pre-mortem is not to kill off plans, but to strengthen them. It is also very easy to conduct. ‘My guess is that, in general, doing a pre-mortem on a plan that is about to be adopted won’t cause it to be abandoned,’ Kahneman has said. ‘But it will probably be tweaked in ways that everybody will recognize as beneficial. So the pre-mortem is a low-cost, high-pay-off kind of thing. "
― Matthew Syed , Black Box Thinking: Why Some People Never Learn from Their Mistakes - But Some Do
" All airplanes must carry two black boxes, one of which records instructions sent to all on-board electronic systems. The other is a cockpit voice recorder, enabling investigators to get into the minds of the pilots in the moments leading up to an accident. Instead of concealing failure, or skirting around it, aviation has a system where failure is data rich. In the event of an accident, investigators, who are independent of the airlines, the pilots’ union, and the regulators, are given full rein to explore the wreckage and to interrogate all other evidence. Mistakes are not stigmatized, but regarded as learning opportunities. The interested parties are given every reason to cooperate, since the evidence compiled by the accident investigation branch is inadmissible in court proceedings. This increases the likelihood of full disclosure. In the aftermath of the investigation the report is made available to everyone. Airlines have a legal responsibility to implement the recommendations. Every pilot in the world has free access to the data. This practice enables everyone—rather than just a single crew, or a single airline, or a single nation—to learn from the mistake. This turbocharges the power of learning. As Eleanor Roosevelt put it: “Learn from the mistakes of others. You can’t live long enough to make them all yourself.” And it is not just accidents that drive learning; so, too, do “small” errors. When pilots experience a near miss with another aircraft, or have been flying at the wrong altitude, they file a report. Providing that it is submitted within ten days, pilots enjoy immunity. Many planes are also fitted with data systems that automatically send reports when parameters have been exceeded. Once again, these reports are de-identified by the time they proceed through the report sequence.* "
― Matthew Syed , Black Box Thinking: Why Some People Never Learn from Their Mistakes - But Some Do