Tag Archives: critical thinking

Poorly Stratified Data Leads to Mistakes in Analysis

Getting an organization to think of data as critical to making effective decisions is often a challenge. But the next problem follows quickly: even when data is used, it is more often misused than used well.

How Not to be Wrong (book cover)

What is important is not just having numbers mentioned when decisions are being made. Or even having numbers mentioned when those decisions are evaluated after they have been implemented (of course many organizations don’t even evaluate the results of many changes they adopt, but that is a different problem). What is important for “evidence based decision making” is that what those numbers actually mean must be understood. It is easy to be misled if you don’t think critically about what the numbers tell you and what they do not.

Poorly stratified data is one problem that leads to mistakes in analysis.

How ZIP codes nearly masked the lead problem in Flint

As I ran the addresses through a precise parcel-level geocoding process and visually inspected individual blood lead levels, I was immediately struck by the disparity in the spatial pattern. It was obvious Flint children had become far more likely than out-county children to experience elevated blood lead when compared to two years prior.

How had the state so blatantly and callously disregarded such information? To me – a geographer trained extensively in geographic information science, or computer mapping – the answer was obvious upon hearing their unit of analysis: the ZIP code.

Their ZIP code data included people who appeared to live in Flint and receive Flint water but actually didn’t, making the data much less accurate than it appeared [emphasis added].

This type of assumption about data leading to mistakes in analysis is common. The act of using data doesn’t provide benefits if the data isn’t used properly. The more I see of the misuse of data the more importance I place on the skill of thinking critically. We must challenge assumptions and challenge what the data we look at actually means.
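The Flint example can be illustrated with a minimal sketch. All of the numbers, ZIP codes and record values below are invented for illustration only; the point is simply that aggregating by the wrong unit (ZIP code) mixes two populations and dilutes the signal that stratifying by the relevant variable (water source) reveals:

```python
from statistics import mean

# Hypothetical records: (zip_code, receives_flint_water, pct_elevated_bll).
# Every value here is invented for illustration; no real data is used.
records = [
    ("48507", True, 10.6), ("48507", True, 9.8),   # Flint water customers
    ("48507", False, 2.1), ("48507", False, 2.4),  # same ZIP, different water system
    ("48532", True, 11.2), ("48532", False, 1.9),
]

# Stratifying by ZIP code mixes the two populations and dilutes the signal.
by_zip = {}
for zip_code, _, pct in records:
    by_zip.setdefault(zip_code, []).append(pct)
zip_rates = {z: mean(v) for z, v in by_zip.items()}

# Stratifying by the variable that matters (water source) reveals the gap.
flint = mean(pct for _, flint_water, pct in records if flint_water)
other = mean(pct for _, flint_water, pct in records if not flint_water)

print(zip_rates)     # ZIP-level averages land in the middle, masking the problem
print(flint, other)  # water-source stratification shows the large disparity
```

The same few lines of arithmetic produce an unremarkable picture or an alarming one depending entirely on how the data is stratified, which is the point of the Flint story.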

Continue reading

The Importance of Critical Thinking and Challenging Assumptions

There are many factors that are important to effectively practice the management improvement ideas I have discussed in this blog for over a decade. One of the most important is a culture that encourages critical thinking as well as challenging claims, decisions and assumptions.

I discussed this idea some in: Customers Are Often Irrational. There is a difference between saying that people wish to have their desires met and saying that people act in a manner that maximizes the benefits they wish to receive.

It is important to study customers’ choices and learn from them. But being deceived by what their choices mean is easier than is usually appreciated. Often the decision made is contrary to the ideal choice given their own beliefs. That is often poor decision making, not an indication that they really want a different result than the one they express (as revealed versus stated preference can show). People that ignore the evidence behind climate change and condemn coastal areas to severe consequences don’t necessarily prefer the consequences their decision leads to. It may well be that the decision to ignore the evidence is not based on a desire to suffer long term consequences in order to get short term benefits. It may well be just an inability to evaluate evidence in an effective way (fear of challenging ourselves to learn about matters we find difficult often provides a strong incentive to avoid doing so).

Knowing the difference between choosing short term benefits over long term consequences and a failure to comprehend the long term consequences is important. Just as in this example, many business decisions have at the root a desire to pretend we can ignore the consequences of our decisions and a desire to accept falsehoods that let us avoid trying to cope with the difficult problems.

photo of me with a blackboard in my father's office

Photo of me and my artwork in my father’s office by Bill Hunter

It is important to clearly articulate the details of the decision making process. We need to note the actual criticism (faulty logic, incorrect beliefs/assumptions…) that results in what some feel is a poor conclusion. But we seem to shy away from questioning faulty claims (beliefs that are factually incorrect – that vaccines don’t save people from harm, for example), lack of evidence (no data) or poor reasoning (drawing unsupported conclusions from a well defined set of facts).

Critical thinking is important to applying management improvement methods effectively. It is important to know when decisions are based on evidence and when decisions are not based on evidence. It can be fine to base some decisions on principles that are not subject to rational criticism. But it is important to understand the thought process that is taken to make each decision. If we are not clear on the basis (evidence or opinion regardless of evidence) we cannot be as effective in targeting our efforts to evaluate the results and continually improve the processes in our organizations.

Describing a decision as “irrational” is so imprecise that it isn’t easy to evaluate how much merit the criticism has. Calling specific facts into question, or explaining the logical fallacies within the decision making process, is much more effective: it provides specific items to explore in evaluating whether the criticism has merit.

When specific criticisms are made clear then those supporting such a decision can respond to the specific issues raised. And in cases where the merits of one course of action cannot be agreed to then such critical thought can often be used to create measures to be used to evaluate the effectiveness of the decision based on the results. Far too often the results are not examined to determine if they actually achieved what was intended. And even less often is care taken to examine the unintended consequences of the actions that were taken.

Continue reading

Customers Are Often Irrational

Penney Pinching

“The first rule is that there are no irrational customers,” Drucker wrote in Management: Tasks, Responsibilities, Practices. “Customers almost without exception behave rationally in terms of their own realities and their own situation.”

“in terms of their own realities and their own situation.” is a huge caveat. Essentially plenty of customers behave irrationally – by any sensible definition of rational. I agree, to make them customers and keep them as customers you need to develop theories that can make sense of their behavior. And it doesn’t make sense to think if they behave irrationally that means randomly (chaotically, unpredictably, uncontrollably). Customers can be predictably irrational (as a group).

Seeing that people will choose* to fly lousy airlines because the initial price quoted is a little bit cheaper than an alternative (or because they are in a frequent flyer program), you can say the customer is behaving rationally if you want. But coming up with some convoluted way to make a decision that, based solely on their desired outcomes (and cost factors etc.), is not rational appear rational seems like a bad idea to me. Instead, figure out the models for how they fail to behave rationally.

They consistently choose an option they shouldn’t rationally want: in order to save an amount of money they care about far less than the pain they will experience, and then complain about having to suffer because they chose to deal with the badly run airline. That isn’t rational. It is a common choice though.

The problem is not in thinking the customers are being irrational for not buying what you are selling. The problem is in thinking the customers will behave rationally. Your theory should not expect rational behavior.

There are plenty of other examples where customers make irrational decisions. I don’t think calling them rational (within the irrationality of their “own realities”) makes sense. People will buy the identical item, for more money, because they think it is a better bargain: the store originally charged more and now it is “on sale.” Anchoring isn’t an understanding of how people are rational. It is an understanding of how psychology influences people in ways that are not rational.

Continue reading

Stratification and Systemic Thinking

I am reading a fascinating book by Jessica Snyder Sachs: Good Germs, Bad Germs. From page 108:

At New York Hospital, Eichenwald and infectious disease specialist Henry Shinefield conceived and developed a controversial program that entailed deliberately inoculating a newborn’s nostrils and umbilical stump with a comparatively harmless strain of staph before 80/81 could move in. Shinefield had found the protective strain – dubbed 502A – in the nostrils of a New York Hospital baby nurse. Like a benign Typhoid Mary, Nurse Lasky had been spreading her staph to many of the newborns in her care. Her babies remained remarkably healthy, while those under the care of other nurses were falling ill.

This is a great example of a positive special cause. How would you identify it? First you would have to stratify the data. It also shows that sometimes looking at who is important. The problem is just that we far too often look at who instead of the system, so at times some get the idea that it is never ok to stratify data based on who. It can be; just be careful, because we often do it when it is not the right approach and we can be fooled by random variation into thinking there is a cause (see the red bead experiment for an example). Here, stratifying the data by person worked to good effect.
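A minimal sketch of this kind of stratification, with invented outcome counts (the nurse labels and illness numbers are hypothetical, chosen only to echo the story):

```python
from collections import defaultdict

# Hypothetical newborn outcomes stratified by caregiver: (nurse, fell_ill).
# The counts are invented for illustration; no real data is used.
outcomes = ([("Lasky", False)] * 29 + [("Lasky", True)] * 1
            + [("Other", False)] * 18 + [("Other", True)] * 12)

counts = defaultdict(lambda: [0, 0])  # nurse -> [ill, total]
for nurse, ill in outcomes:
    counts[nurse][0] += ill  # True adds 1, False adds 0
    counts[nurse][1] += 1

# Illness rate per nurse; the aggregate rate alone would hide this contrast.
rates = {nurse: ill / total for nurse, (ill, total) in counts.items()}
print(rates)
```

The aggregate illness rate across all newborns looks like one number; only stratifying by caregiver surfaces the special cause worth investigating (and, as noted above, a control chart or the red bead experiment reminds us to check that such a gap exceeds what random variation would produce).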

The following 20 pages in the book are littered with very interesting details many of which tie to thinking systemically and the perils of optimizing part of the system (both when considering the system to be one person and also when viewing it as society).

I have recently taken to reading more and more about viruses, bacteria, cells, microbiology etc.: it is fascinating stuff.

Related: Science Books by topic - Data Can’t Lie - Understanding Data

Errors in Thinking

photo of Jerome Groopman

The Doctor’s In, But Is He Listening?, text and podcast from NPR:

Jerome Groopman (photo) is a doctor who discovered that he needed a doctor. When his hand was hurt, he went to six prominent surgeons and got four different opinions about what was wrong. Groopman was advised to have unnecessary surgery and got a seemingly made-up diagnosis for a nonexistent condition. Groopman, who holds a chair in medicine at Harvard Medical School, eventually found a doctor who helped…

“Usually doctors are right, but conservatively about 15 percent of all people are misdiagnosed. Some experts think it’s as high as 20 to 25 percent,” Groopman tells Steve Inskeep. “And in half of those cases, there is serious injury or even death to the patient.”

Errors in thinking: We use shortcuts. Most doctors, within the first 18 seconds of seeing a patient, will interrupt him telling his story and also generate an idea in his mind [of] what’s wrong. And too often, we make what’s called an anchoring mistake – we fix on that snap judgment.

An understanding of the theory of knowledge is helpful to counteract errors in thinking. How we think is not perfect, and an understanding of the weaknesses and faulty conclusions we are susceptible to making is helpful. That can help us avoid jumping to faulty conclusions and design systems that counteract such behavior.

Related: Epidemic of Diagnoses - Write it Down - The Illusion of Understanding - Illusions – Optical and Other - health care improvement posts

Read an excerpt [the broken link was removed] from the book: How Doctors Think by Jerome Groopman.

All Models Are Wrong But Some Are Useful

“All Models Are Wrong But Some Are Useful” -George Box

A great quote. Here is the source: George E.P. Box, Robustness in the strategy of scientific model building, page 202 of Robustness in Statistics, R.L. Launer and G.N. Wilkinson, Editors. 1979.

See more quotes by George Box.

Related: Dangers of Forgetting the Proxy Nature of Data - articles by George Box - Quotes by Dr. W. Edwards Deming

Making Better Decisions

Comment on: When Times Are Tough, Do You Make Better Decisions? [the broken link was removed]

When times are tough you are more likely to do something – take some action, make some decision. When times are good, many are content to let things go: not make any tough decisions or any that might upset someone… When in a bind it is accepted that something has to be done, so you can often get past the “we are doing ok, why make us change…” objections.

Similarly, it can encourage people to question a decision they don’t agree with (instead of, when times are good, thinking: well, I disagree but I will just go along…). So it is possible that in a dysfunctional management system (which is a lot of them) it can seem that better decisions are made when times are tough.

In addition, when times are bad any decision might seem good when things improve due to regression to the mean. Peter Scholtes illustrated this with a boss who yelled at his workers when performance became too bad. His belief that this helped was reinforced as performance improved after the “tough talk.” Of course, the perceived increase in performance may not have had anything to do with the “tough talk.”
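Regression to the mean is easy to demonstrate with a small simulation. The process below is stable (pure noise around a constant mean, with an invented mean, spread and “too bad” threshold), yet the periods following a “tough talk” show a sizable apparent improvement even though nothing changed:

```python
import random

random.seed(1)

# A stable process: performance is just noise around a constant mean,
# so nothing the boss does changes it. All parameters here are invented.
def performance():
    return random.gauss(100, 10)

gains_after_tough_talk = []
prev = performance()
for _ in range(10_000):
    cur = performance()
    if prev < 85:  # a "too bad" period triggers the tough talk
        gains_after_tough_talk.append(cur - prev)
    prev = cur

avg_gain = sum(gains_after_tough_talk) / len(gains_after_tough_talk)
print(round(avg_gain, 1))  # a large average "improvement" from doing nothing
```

Because the tough talk is only triggered after unusually bad results, the next period is almost guaranteed to look better; the boss’s theory gets “confirmed” by selection alone.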
Continue reading

The Illusion of Understanding

The “Illusion of Explanatory Depth”: How Much Do We Know About What We Know? (broken link 🙁 was removed) is an interesting post that touches on psychology and theory of knowledge.

Often (more often than I’d like to admit), my son… will ask me a question about how something works, or why something happens the way it does, and I’ll begin to answer, initially confident in my knowledge, only to discover that I’m entirely clueless. I’m then embarrassed by my ignorance of my own ignorance.

I wouldn’t be surprised, however, if it turns out that the illusion of explanatory depth leads many researchers down the wrong path, because they think they understand something that lies outside of their expertise when they don’t.

I really like the title – it is more vivid than theory of knowledge. It is important to understand the systemic weaknesses in how we think in order to improve our thought process. We must question (more often than we believe we need to) especially when looking to improve on how things are done.

If we question our beliefs and attempt to provide evidence supporting them, we will find it difficult to do for many things we believe. That should give us pause. We should realize the risk of relying on beliefs without evidence and, when warranted, look into getting evidence of what is actually happening.

I commented on this for the Science and Engineering blog.

Related: Management is Prediction - Tom Nolan’s talk - Innovate or Avoid Risk - Management: Geeks and Deming - Theory in Practice

Distort the System

From our post: Targets Distorting the System, Dr. Brian Joiner:

spoke of 3 ways to improve the figures: distort the data, distort the system and improve the system. Improving the system is the most difficult.

Another example of this in practice: Recount helps one university rise in the rankings [the broken link was removed]:

Behnke, who says he’s no fan of rankings, said he recently spoke to a provost at another institution who was capping class sizes at 19 to boost the “Classes Under 20” number.

I am sure “classes under 20” is a proxy for an intimate learning environment and interaction with knowledgeable professors that can teach well. You can’t directly measure the benefit of interaction with a professor in a small group on learning to create data to be used in ranking schools (Deming on unknown and unknowable figures). So classes with under 20 students and % of faculty with PhDs… are used as proxies for this idea.

If the proxy is the focus (as in school rankings) then distorting the system to create better looking data is a likely result. The purpose behind the action has great significance. If an institution desired to create a better learning environment, and it used, say, a cause and effect diagram to find a group of problems and then determined that one appropriate improvement step was to reduce class size (and perhaps another was to reduce the importance of tests, and another was to provide professors training on effective teaching strategies), that is a sensible path to improving the system.

Continue reading

The Customer Knows Best?

The Customer Knows Best? Better Think Again by Anthony W. Ulwick

It’s important to listen to customers – but not follow their words without skepticism. Ask them to design your next product and you’re likely to miss the mark, suggests this Harvard Business Review excerpt.

Excellent point. Some management ideas are pretty easy and straightforward. But many management practices require knowledge and judgment to apply successfully. Easy solutions may be desired, but often you must choose between easy and effective (hint: I suggest effective is the better target).

Listening to customers is important but it is not sufficient. W. Edwards Deming made this point emphatically on page 7 of The New Economics:

Does the customer invent new product or service? The customer generates nothing. No customer asked for electric lights… No customer asked for photography… No customer asked for an automobile… No customer asked for an integrated circuit.

Continue reading