For discussion by ASQ’s Influential Voices this month, Paul Borawski looks at Risk, Failure & Careers in Quality.
There is a bias toward avoiding the possibility of failure by avoiding actions that might lead to it, or even by avoiding action altogether. This is a problem. The need in so many organizations to avoid failure means wise actions are not taken because they carry a risk of failure.
Many times, however, the criticism of such cultures gets a bit sloppy, in my opinion, and treats any effort to avoid failure as bad. Reducing the impact of failure is very wise and sensible. We don’t want to sub-optimize the whole system in order to avoid as much failure as possible. But we don’t want to sub-optimize the whole system by treating failure as something to welcome either.
Part of the problem is sloppy thinking about what failure is. Running an experiment and getting results that are not as positive as you might have hoped is not failure. That is going to happen when you run experiments. The reason you run PDSAs on a small scale is to learn. It is to minimize the cost of running the experiments and to minimize the impact of disappointments.
Running an experiment and having results that negatively impact customers, or that create unplanned costs, may well be failure. Though even in that case, calling it failure may be less than useful. I have often seen a new process that eliminated 10 problems for customers but added 2 get attacked for the 2 new problems. While those new problems are not good, a net gain of 8 fewer problems should be seen as success, I would argue, not failure. Often, however, this is not how it is seen. And an attitude that blames any new problem on those making a change, regardless of the overall system impact, definitely hampers improvement.
As I said in a previous post, Learn by Seeking Knowledge, Not Just from Mistakes:
It isn’t an absence of people making mistakes (including carrying out processes based on faulty theories) that is slowing learning. People are very reluctant to make errors of commission (and errors of commission due to a change are avoided even more). This reluctance obviously makes learning (and improvement) more difficult. And the reluctance is often enhanced by fear created by the management system.
The culture I want to develop is one where systems thinking leads to optimizing the overall system. To the extent that doing so makes it wise to take risks that may include some failures, taking those risks is good. But we also need to use the long-known practices that reduce the costs of adverse results.