" Take science, a discipline where learning from failure is part of the method. This is a point that has been made by the philosopher Karl Popper, who suggested that science progresses through its vigilant response to its own mistakes. By making predictions that can be tested, a scientific theory is inherently vulnerable. This may seem like a weakness, but Popper realized that it is an incalculable strength. “The history of science, like the history of all human ideas, is a history of . . . error,” Popper wrote. “But science is one of the very few human activities—perhaps the only one—in which errors are systematically criticized and fairly often, in time, corrected. This is why we can say that, in science, we learn from our mistakes and why we can speak clearly and sensibly about making progress. "
― Matthew Syed, Black Box Thinking: Why Some People Never Learn from Their Mistakes - But Some Do
" In a simple world, blame, as a management technique, made sense. When you are on a one-dimensional production line, for example, mistakes are obvious, transparent, and are often caused by a lack of focus. Management can reduce them by increasing the penalties for noncompliance. They can also send a motivational message by getting heavy once in a while. People rarely lose concentration when their jobs are on the line. But in a complex world this analysis flips on its head. In the worlds of business, politics, aviation, and health care, people often make mistakes for subtle, situational reasons. The problem is often not a lack of focus, it is a consequence of complexity. Increasing punishment, in this context, doesn’t reduce mistakes, it reduces openness. It drives the mistakes underground. The more unfair the culture, the greater the punishment for honest mistakes and the faster the rush to judgment, the deeper this information is buried. This means that lessons are not learned, so the same mistakes are made again and again, leading to more punitive punishment, and even deeper concealment and back-covering. "
― Matthew Syed, Black Box Thinking: Why Some People Never Learn from Their Mistakes - But Some Do
" Toyota has a rather unusual production process. If anybody on the production line is having a problem or observes an error, that person pulls a cord that halts production across the plant. Senior executives rush over to see what has gone wrong and, if an employee is having difficulty performing her job, she is helped as needed by executives. The error is then assessed, lessons learned, and the system adapted. It is called the Toyota Production System, or TPS, and is one of the most successful techniques in industrial history. “The system was about cars, which are very different from people,” Kaplan says when we meet for an interview. “But the underlying principle is transferable. If a culture is open and honest about mistakes, the entire system can learn from them. That is the way you gain improvements. "
― Matthew Syed, Black Box Thinking: Why Some People Never Learn from Their Mistakes - But Some Do