Complexity Models + Hard Data = Solutions
to Hard Problems in Health Care
Digest of a presentation by Kevin Dooley, Ph.D.,
Tuesday, April 27, 1999
Answers the question:
- How can a complexity approach lead to a more relevant
means of collecting and analyzing data?
The Black Hole of Data: A Case Study
Kevin Dooley shares the story of an HMO he consulted with in Minneapolis.
The company was trying to get its subsidiary companies up to speed on
quality concepts, so it hired Dooley to spend 18 months collecting
reams of data on its quality information systems. The sheer volume
of data he collected was staggering.
Then it was time to present the data. Dooley was encouraged
to boil all of the data down to an 8.5 x 11 piece of paper, put it on
an overhead... and then communicate it to the executive committee in
about 10 minutes.
This was the culmination of 18 months of work.
Later, Dooley checked back to see if anything had actually
changed as a result of the data collected. It turned out that nothing
had.
Contrast that with this scenario: for another organization,
Dooley surveyed some customers in one business unit. It
took a day to develop the survey, a week to implement it, and just two
months to review it. In a fraction of the time, they generated much
more relevant information.
So what’s the key to doing this effectively? How can
data collection aid us in solving problems? What does a complexity approach
have to say about this?
In short, we are finding that we’re spending a lot of
time measuring the wrong stuff. And we’re not telling stories about
the data.
The Inverse Power Law: A Pervasive Model in Complex
Systems
Traditional statistics have been useful in machine-like environments...
and even for machines themselves. But traditional statistics begin to
fail us in systems with a strong human element.
Here’s a common scenario: You go out and collect a
lot of data, you plot it in a histogram, show it around and then stop
because there is nothing else to say. But really, there is something
else to say.
The
Inverse Power Law pops up a lot in complex systems — and presents a
story worth telling. Simply put, the power law states that frequency and
magnitude are inversely related. In even simpler terms, small stuff
tends to happen a lot, big stuff happens less frequently. This holds
true for earthquakes, for example: There are lots of small tremors,
and only a few big disasters. It’s true in economic behavior: markets
are mostly stable with small fluctuations and rare big crashes. It’s
true in evolution: The number of species going extinct is usually very
few... but occasionally it’s a lot. It’s also true in earth temperature
changes, the growth of cities, energy dissipation and waves. The frequency
of different notes in Bach’s Brandenburg Concertos and in Mozart’s concertos
also follows this pattern… so there is something aesthetically pleasing
about this distribution.
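To make the pattern concrete, here is a minimal sketch (our illustration,
not the presenter’s) that samples event magnitudes from an assumed
power-law distribution and tallies how often events of each size occur.
The exponent is an illustrative assumption, not a fitted value.

```python
# A minimal sketch of the inverse power law: the frequency of an event
# falls off as a power of its magnitude, f(m) ~ m**(-ALPHA). The
# exponent below is an illustrative assumption.
import math
import random
from collections import Counter

ALPHA = 2.0          # assumed power-law exponent
N_EVENTS = 100_000   # number of simulated events

# Inverse-transform sampling: if u ~ Uniform(0, 1), then
# m = (1 - u)**(-1 / (ALPHA - 1)) follows a Pareto distribution whose
# density is proportional to m**(-ALPHA) for m >= 1.
magnitudes = [(1 - random.random()) ** (-1 / (ALPHA - 1))
              for _ in range(N_EVENTS)]

# Bucket magnitudes into powers of two and count events per bucket:
# counts drop by a constant factor with each doubling of magnitude.
# Small stuff happens a lot; big stuff happens rarely.
counts = Counter(int(math.log2(m)) for m in magnitudes)
for bucket in sorted(counts):
    print(f"magnitude ~2^{bucket:<2d}  events: {counts[bucket]:,}")
```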
Organizations and Sandpiles
Now imagine that you are dropping sand, one grain at a time, onto a
table top. After dropping many, many grains, they will begin to collect
in a cone-shaped sand pile. As you continue to drop individual grains,
the sand pile becomes bigger and bigger.
But soon, you will reach a state of self-organized
criticality, in which the sand pile can’t get any higher. It stays the
same height and eventually starts having avalanches. Imagine that you
continue to drop individual grains. Most will have no effect; many will
displace just a few other grains of sand, causing them to slide down the
side of the pile. But sometimes a single grain will set off a huge
avalanche. You may recognize this dynamic again as the inverse power law.
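The dynamic Dooley describes is the classic Bak-Tang-Wiesenfeld sandpile
from physics. Below is a minimal simulation sketch, assuming a 20 x 20
grid and the standard rule that a cell holding four grains topples; the
code is our illustration, not the presenter’s.

```python
# A minimal Bak-Tang-Wiesenfeld sandpile sketch (our illustration).
# Grains drop one at a time; any cell holding 4 or more grains topples,
# sending one grain to each of its 4 neighbors. Grains pushed off the
# edge of the table are lost.
import random
from collections import Counter

SIZE = 20                       # table is a SIZE x SIZE grid (assumed)
grid = [[0] * SIZE for _ in range(SIZE)]

def drop_grain():
    """Drop one grain at a random cell; return the avalanche size."""
    r, c = random.randrange(SIZE), random.randrange(SIZE)
    grid[r][c] += 1
    topples = 0
    unstable = [(r, c)]
    while unstable:
        r, c = unstable.pop()
        if grid[r][c] < 4:      # skip cells that are (or became) stable
            continue
        grid[r][c] -= 4
        topples += 1
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < SIZE and 0 <= nc < SIZE:
                grid[nr][nc] += 1
                unstable.append((nr, nc))
    return topples

# Most drops cause no topples at all; a rare drop triggers a huge
# avalanche once the pile reaches self-organized criticality.
avalanche_sizes = Counter(drop_grain() for _ in range(50_000))
for size in sorted(avalanche_sizes)[:10]:
    print(f"avalanche of {size:3d} topples: {avalanche_sizes[size]:,} times")
```

Run long enough, the avalanche-size counts fall onto the same kind of
inverse power law as the event tallies above: countless tiny slides,
and a handful of giant ones.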
There is a moral to this scenario: the same kind of event
that usually makes nothing happen is the kind that makes huge catastrophes
happen. When a catastrophe strikes, we start scurrying
about looking for a catastrophic cause. And maybe there is one. But
it’s also likely that it “just happens.” It’s just that single extra
grain of sand. Thus, by eliminating the causes of small problems, we
can also eliminate the causes of severe “avalanches.”
Now think of the avalanches as human behavior. Given the
massive linkages that information technology has created between people
in our organizations, we’re now zipping past the point of criticality,
like the sandpile that cannot grow in height any longer.
This is neither natural nor healthy. We don’t need to be superconnected.
Not everyone needs to know everything. Intelligence in the future is
knowing when not to connect.
Think of a group of bureaucrats in your office who
love to push paper. They make reports and pass them on to their neighbors.
Those reports then become summary pages in yet other reports. Each time
a piece of information is dropped, it creates a ripple effect into the
efforts of others who have their own reports to compile. Imagine if
everyone was just on the verge of finishing their report, and someone
dropped one more piece of information. Suddenly we have an avalanche.
No one will see the cause. They’ll just wonder “what went wrong this
month?”
Here’s another example of the inverse power law in
the sandpile organization. Let’s say we rank order the mistakes in our
work. The least severe errors happen frequently; the most severe happen
rarely. And when a severe error does occur, it’s usually because a
preexisting set of small problems compounded.
This was the case in a VHA Medication Error study. Every
time a medication error occurred, the type and severity were recorded
on a scale, where “0 = potential error” and “6 = patient death.”
Since “deaths” occurred so infrequently (or perhaps
were not reported), it took a long time to generate statistically meaningful
data. In retrospect, the “near misses” were happening more frequently,
and could have been assessed within just a few weeks.
Had the “near misses” been the focus of analysis instead
of the “patient deaths,” the study could have been completed much faster.
Remember: the dynamics that make frequent, small events take place are
the same ones that cause catastrophic events. If we measure only deaths
and use that as our data, we’re out of luck because that data arrives
much too slowly. So we need to study the less critical events, which are
coming much more often. We can learn a lot about catastrophes by studying
more mundane events.
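That logic can be sketched as a simple extrapolation: fit the falloff to
the frequent, low-severity events, then project it out to the rare,
severe ones. The weekly counts below are hypothetical, invented for
illustration; only the 0-to-6 severity scale comes from the study.

```python
# A sketch of the near-miss idea: fit the falloff to frequent, low-
# severity events, then extrapolate to rare, severe ones. The weekly
# counts below are hypothetical; only the 0-6 severity scale
# (0 = potential error, 6 = patient death) comes from the study.
import math

# Hypothetical counts of near misses (severities 0-3) observed in one
# week. These accumulate fast enough to analyze within weeks.
observed = {0: 480, 1: 120, 2: 30, 3: 8}

# Least-squares fit of log(count) = a + b * severity. A pure power law
# relates frequency to magnitude; on an ordinal 0-6 scale, a geometric
# falloff per severity step is the natural analogue.
xs = list(observed)
ys = [math.log(observed[s]) for s in xs]
n = len(xs)
mean_x, mean_y = sum(xs) / n, sum(ys) / n
b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
     / sum((x - mean_x) ** 2 for x in xs))
a = mean_y - b * mean_x

# If near misses and deaths share the same underlying dynamics, the
# frequent events predict the rate of the rare ones.
for severity in range(7):
    print(f"severity {severity}: ~{math.exp(a + b * severity):.2f} per week")
```

The projection is only as good as the assumption that small and large
events share the same underlying dynamics, which is exactly the moral
of the sandpile.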
Colin Powell was asked how he makes such complex decisions
so quickly. He said, “If I have about 60% of the information, I make
a decision. If I wait for more, it takes too long.”
About the Presenter
Kevin Dooley, Ph.D., is professor of management and industrial
engineering at Arizona State University. He teaches and conducts research
in quality management, innovation and complex systems. He holds a doctorate
in mechanical engineering from the University of Illinois and has consulted
with numerous organizations on issues of business and engineering. Dr.
Dooley also serves as editor of the journal Nonlinear Dynamics, Psychology
and the Life Sciences.