Tag Archives: evidence based management

Evidence-based Management

Bob Sutton’s writing includes the excellent article “Management Advice: Which 90% is Crap?” (which we discussed in: Management Advice Failures) and The Knowing-Doing Gap. I just discovered his blog today, Work Matters, which is quite good. A recent post, Hand Washing and Evidence-based Management, includes some good advice on data and process improvement:

I’ve written before about how handwashing by medical care workers is one of the most well-documented preventable causes of death and disease in health care settings.

Self-report data can be worse than useless. The post describes an Australian study in which 73% of doctors reported washing their hands, but when the doctors were observed by a researcher, only 9% were seen washing their hands.

The way they finally got compliance up to nearly 100% was to have a group of the hospital’s more influential doctors each press their palms on plates that were cultured and photographed, which resulted in images that “were disgusting and striking, with gobs of colonies of bacteria.”


The State of Lean Implementation

An interesting survey by the Lean Enterprise Institute notes the following as the major obstacles to transforming into a lean organization:

1. Lack of implementation know-how: 48%
2. Backsliding to the old ways of working: 48%
3. Middle management resistance: 40%
4. Traditional cost accounting system doesn’t recognize the value of lean: 38%

I wonder if asking “why?” several more times would help [broken link replaced with a new link]. It is not as though the difficulty in adopting lean thinking is unique. I don’t know of a management improvement method (TQM, BPR, Six Sigma, ToC…) that easily leads to changing the way a company operates.

Related: Why Use Lean if So Many Fail To Do So Effectively

via: lean blog

Problems Caused by Performance Appraisal

I ran across a great article today on the problems created by our common use of performance appraisal: Unjust Deserts (pdf format) by Mary Poppendieck:

As Sue’s team instinctively realized, ranking people for merit raises pits individual employees against each other and strongly discourages collaboration, a cornerstone of Agile practices.

There is no greater de-motivator than a reward system that is perceived to be unfair.

The article does a good job of explaining these, and several more, problems caused by performance appraisal. It also provides some good thoughts on how to manage effectively.


The Customer Knows Best?

The Customer Knows Best? Better Think Again by Anthony W. Ulwick

It’s important to listen to customers – but not to follow their words without skepticism. Ask them to design your next product and you’re likely to miss the mark, suggests this Harvard Business Review excerpt.

Excellent point. Some management ideas are pretty easy and straightforward. But many management practices require knowledge and judgment to apply them successfully. Easy solutions may be desired, but often you must choose between easy and effective (hint: I suggest effective is the better target).

Listening to customers is important, but it is not sufficient. W. Edwards Deming made this point emphatically on page 7 of The New Economics:

Does the customer invent new product or service? The customer generates nothing. No customer asked for electric lights… No customer asked for photography… No customer asked for an automobile… No customer asked for an integrated circuit.


Data Based Decision Making

Acumen visits Google:

As a first step, we hope to collaborate with interested Googlers to find better ways to learn what works around the world. Identifying powerful solutions to poverty that are useful to people in different settings, and that are market-driven, scalable, and sustainable, is our greatest challenge. Second, we’re hoping to strengthen how the world measures both social and financial returns to investments in delivering critical goods and services to the poor. Like Google, we hold a deep belief in the power of measuring everything we can.

Google has done a fantastic job of using data to make decisions. In fact, so much so that some think they may go overboard trying to find an algorithm for everything; see My dinner with Sergey [the broken link was removed].

Measuring and Managing Performance in Organizations

Foreword [the broken link was removed] (by Tom DeMarco and Timothy Lister) to Measuring and Managing Performance in Organizations:

measurement is almost always part of an effort to achieve some goal. You can’t always measure all aspects of progress against the goal, so you settle for some surrogate parameter, one that seems to represent the goal closely and is simple enough to measure. So, for example, if the goal is long-term profitability, you may seek to achieve that goal by measuring and tracking productivity. What you’re doing, in the abstract, is this:

measure [parameter] in the hopes of improving [goal]

When dysfunction occurs, the values of [parameter] go up comfortingly, but the values of [goal] get worse.

Previous post on this topic:

Tom DeMarco and Timothy Lister are the authors of the excellent Peopleware: Productive Projects and Teams.

Operational Definitions and Data Collection

Americans’ Dirty Secret Revealed by Bjorn Carey
See also: Google News on washing hands [broken link was removed] – Soap and Detergent Association press release [another broken link was removed]

A study released recently spawned a flurry of articles on washing hands. I have seen such reporting before and again I find it interesting (as sad as that might be). The stories repeatedly say things like: “Men’s hands dirtier than women’s.” The study actually focused on the percentage of people who washed their hands. While there is likely a correlation, making such leaps in reporting data is not wise. This example is often mirrored in how organizations use data, where interpretations of the data are presented as facts instead of the data itself.

However, that is not what I find most interesting. Instead, I find the lack of an operational definition interesting. Many of the articles include quotes like:

In a recent telephone survey, 91 percent of the subjects claimed they always washed their hands after using public restrooms. But, when researchers observed people leaving public restrooms, only 83 percent actually did so.

Only 75 percent of men washed their hands compared to 90 percent of women, the observations revealed.

Claims are often made about results that are only justified by unstated assumptions about the real-world conditions the data are meant to represent. Those claims are undermined when no evidence is provided that the assumptions are valid. Without operational definitions for the data, there is a significant risk of making claims about what the data mean that are not valid.
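To make the distinction concrete, here is a minimal sketch (in Python, with hypothetical type and field names of my own; the figures are simply the observed percentages quoted above) of keeping the operational definition attached to each reported number, so the data travel with a statement of what was actually measured rather than with someone’s interpretation of it:

```python
from dataclasses import dataclass

@dataclass
class Measurement:
    """A reported figure plus the operational definition that produced it."""
    value: float                 # the number being reported
    operational_definition: str  # what was counted, and how
    population: str              # who was observed or surveyed
    method: str                  # direct observation, self-report survey, etc.

# Hypothetical records mirroring the hand-washing figures quoted above.
observed_men = Measurement(
    value=0.75,
    operational_definition="seen washing hands before leaving a public restroom",
    population="men leaving observed public restrooms",
    method="direct observation",
)
observed_women = Measurement(
    value=0.90,
    operational_definition="seen washing hands before leaving a public restroom",
    population="women leaving observed public restrooms",
    method="direct observation",
)

# "Men's hands are dirtier than women's" is an interpretation; the data only
# support a claim stated within the operational definition:
print(f"{observed_men.value:.0%} of observed men and {observed_women.value:.0%} of "
      f"observed women were {observed_men.operational_definition}.")
```

Carrying the definition along with the number makes it harder for a summary to drift into a claim the measurement never supported.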


Measurement and Data Collection

This is my response to the Deming Electronic Network message (the site is dead, so I removed the link) on measurement.

I find it useful to ask what will be done with the results of data collection efforts (in order to confirm that the effort is a wise use of resources). If you don’t have an answer for how you will use the data once you get it, then you probably shouldn’t waste resources collecting it (and I find there is frequently no plan for using the results).

I have found it helpful to ask: what will you do if the data you collect is 30? What will you do if it is 3? The answer does not need to be a formula (if 30, then x), but rather that the results would be used to help inform a decision process to make improvements (possibly the decision to focus resources in that area). I find that asking that question often helps reach a better understanding of what data is actually needed, so you then collect better data.

I believe it is better to focus on less data and really focus on it. My father, Bill Hunter, and Brian Joiner believed in the value of actually plotting the data yourself by hand. In this day and age that is almost never done (especially in an office environment). I think doing so does add value. For one thing, it makes you select the vital few measures important to your job.

But it is very difficult for anyone to actually suggest plotting data by hand: they must be very secure in their reputation (or maybe a bit crazy), because it seems to be a hopelessly outdated idea that paints you as the same. My appeal, within the Deming context, is that the psychology of plotting the points yourself is qualitatively different from letting the computer do it. Plotting the data yourself lifts the data you plot out of the sea of data we find ourselves inundated with and gives you a deeper connection to it. You would not plot all the data that you use by hand; just the most important items.

John Hunter
Curious Cat Management Improvement Connections