A few years ago, I read Blink:
The Power of Thinking Without Thinking, by Malcolm Gladwell (Little, Brown
& Company, 2005). At the time, the
book struck a chord with me. Its main concept is that people, especially experts, can tell at a glance what is going on and make accurate decisions.
Gladwell shares examples of this, including art experts who immediately
know a statue is a forgery, psychologists who can tell if a marriage will break
up in three minutes, and another psychologist who can understand
microexpressions. Experts do this by
“thin slicing” the available information: quickly using the relevant data and discarding the useless rest of the picture. This ability is shaped, Gladwell explains, by other brain hacks such as priming.
The main point is that clear thinking can be overwhelmed by irrelevant
information, and experts learn to ignore irrelevant information and trust their
intuition.
This appealed to me because I had seen this way of thinking in action. I’ve seen paramedics glance at a patient and know what was wrong within 5 seconds. I’ve seen firefighters subconsciously know a roof was unstable, without being able to explain specifically why. I’ve seen doctors know when a patient was hiding relevant parts of their story. As an expert in EMS, it appealed to my self-image to imagine I could consistently do this.
The problem with Blink
is that expert-level knowledge is required for successful thin slicing. John Gottman, the psychologist who could predict divorce at a 90% success rate after watching three minutes of video of a couple’s interactions, relied on extensive science, methodology built up over years, and reasoning to pull off the trick. Paul Ekman’s microexpression Facial Action Coding System doesn’t use thin slicing; the system fills a 500-page
binder. Another problem is that
incorrect thin slicing happens all the time and leads to dreadful
outcomes. It could be argued that some
police shootings of unarmed people are examples of horrifyingly incorrect thin
slicing. The third issue with the book
is Gladwell doesn’t explain how this works or how to improve the reader’s
ability to perform this skill. How does
an expert mind accomplish what a novice mind cannot?
Finally, I have a “Moneyball problem” with the ability of
experts to consistently make accurate intuitive decisions. If experts were especially accurate, then
baseball scouts (experts) wouldn’t need statisticians (non-experts relying on
data) in baseball player evaluations.
Slower decisions based on reasoning and data can be as strong as
blinking out an intuitive answer. When I
first read the book, I intuitively loved it.
After further reflection, I think Gladwell’s Blink is probably an example of reporting bias – include the hits
and don’t mention the misses.
Blink popped back into
my mind because I finished reading Thinking,
Fast and Slow by Daniel Kahneman (Farrar, Straus and Giroux, 2011). Kahneman and his late collaborator, Amos Tversky, were psychologists who specialized in investigating why people don’t always act the way rationality dictates they should. There is an imaginary creature, Homo economicus, used as a model in economic
discussions. This creature always acts perfectly to advance its own interests, and economists can model its behavior. People don’t act perfectly in their own interests, however. Studying why they don’t led Kahneman to win a Nobel Prize in 2002. In economics.
He is a psychologist.
Badass. (Tversky would probably
have won too, but you have to be alive to win a Nobel.)
Thinking, Fast and
Slow explains how the brain works with two systems. System 1* is the intuitive cognitive
system. It makes snap judgments. System 1 is effortless and automatic. 2+2=___.
Black and ______. System 1 gave
you those answers. System 1 is also shaped by emotion and by association.
When you associate a song with a specific time in your life, or a smell
with a specific location, it is probably System 1 at work. System 1 is the most common use of the brain. Expert use of System 1 is what Blink was talking about.
System 2, in contrast, is the conscious, thinking, rational
mind. It is the system used to figure
out 24x37=_____. System 2 can read a
map. System 2 solves big problems in a
rational, logical way. The problem is System
2 is slow and lazy. If you don’t force
System 2 to work, it won’t. As proof, I
offer the fact that most of you didn’t expend the effort to find out 24x37=888.
Too much data! System 2 is out!
Here is the problem.
System 1 is often wrong. System 1
makes bad decisions. System 2 is pretty much consistently right, but it is lazy and hard to turn on. Think about buying a car: You can figure out
the prices, spreadsheet payments, work out pros and cons of models, and work
out which is the best car. But System 1 deeply
and truly likes the red one. People end
up buying the red one. Corporations prey
on System 1 all the time in advertising.
Las Vegas is built on System 1 thinking (“This machine is about to get
HOT!”).
Why is this on an EMS blog?
Because System 1 has problems providing consistently good medical
care. When a drip rate calculation is
needed, you turn to System 2. System 2
yawns at you and rolls its eyes. System
1 waves its arms above its head and says, “Oooh, oooh! I know!
I know!” System 1’s idea is to simply
turn the drip on for a while to get the desired effect; System 2 can figure out the drip rate later, when we are writing the PCR.
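For context, drip-rate math is exactly the kind of System 2 arithmetic in question. A minimal sketch, using the standard gravity-set formula (the volume, time, and drop factor below are illustrative, not from any protocol):

```python
# Standard drip-rate formula:
# drops per minute = (volume in mL x drop factor in gtt/mL) / time in minutes
def drip_rate(volume_ml, drop_factor_gtt_per_ml, time_min):
    """Return the infusion rate in drops (gtt) per minute."""
    return volume_ml * drop_factor_gtt_per_ml / time_min

# Example: 500 mL over 4 hours (240 min) with a 10 gtt/mL set.
print(round(drip_rate(500, 10, 240), 1))  # about 20.8 gtt/min
```

Simple enough on paper, and exactly the kind of work a lazy System 2 refuses to do in the back of a moving ambulance.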
System 1 freaks out at MCIs.
System 1 causes us to work dead people who should be pronounced. System 1 causes us to skip the secondary
because the patient smells bad. System 1
finds a homeless guy in an alley and “thin slices” that he is just drunk,
rather than checking other reasons for altered mental status. System 1 is where errors happen. So what can we do to strengthen System 2 in
our professional lives? It doesn’t work
to say: “I will try harder.”
To me, the trick to activating System 2 is to turn 2 into
1. Make your rational, reasoned,
calculated decisions into intuitive ones.
We do this by working them out in advance. This is the strength behind preplans, SOPs,
and protocols. Exercises prime System 1
actions through advance System 2 work, as well.
Practicing rare events until our actions become instinctive is switching
System 2 thinking into System 1 thinking.
It is also important for protocols and SOPs to not rely on
System 2 thinking. Protocol authors
should strive to get rid of the potential for error. Require less thought. Don’t let System 2 yawn and roll its lazy
eyes when you need it to work. For example,
my treatment protocols don’t call for drip rate calculations anymore. Take epi: my protocols call for 1 milligram
of any concentration of epinephrine into a liter bag of saline, run wide open
and titrated to effect. In advance I
know the concentration of epi in the liter bag is 1 microgram per milliliter
(System 2). But I don’t need to think
through it when with a sick patient on a drip.
I need to inject the med and open the line (System 1).
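The advance System 2 arithmetic behind that protocol is a one-liner. A sketch, assuming the few milliliters of drug volume are negligible next to the liter of saline:

```python
# 1 mg of epinephrine (any concentration) diluted into a 1 L bag of saline.
epi_mcg = 1 * 1000       # 1 mg = 1000 micrograms
bag_ml = 1000            # 1 liter = 1000 mL (added drug volume ignored)
print(epi_mcg / bag_ml)  # 1.0 microgram per milliliter
```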
Airline pilots don’t rely on their memories when using
checklists and emergency procedures.
They pull out a book. Trying to
rack your memory for the emergency procedure while in the emergency is calling
on System 2. Having your copilot read
the pertinent list of actions is using System 1. This is also the idea behind the Handtevy pediatric treatment system – figure it
all out in advance and then look it up. Don’t
change your treatments between adult patients and pediatric patients; simply open
a book and look up the smaller doses. Switch
System 2 thought processes to System 1 thought processes. We know adult treatments so well that we are
working in System 1 when we are treating adult patients. Pediatric patients, especially sick ones,
freak us out and we didn’t memorize all the dosages well enough to cut through
the fog. The need to recall pediatric treatments falls on System 2. Don’t depend on System
2 for your success. Switch 2 for 1 and
get to work.
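The look-it-up idea amounts to a precomputed table. A minimal sketch of the concept (the drug names and per-kilogram numbers are placeholders for illustration only, not clinical guidance and not the actual Handtevy system):

```python
# System 2 work, done in advance: build a dose table keyed by weight,
# so at the bedside the only task left is a lookup (System 1).
PLACEHOLDER_MG_PER_KG = {   # illustrative numbers only, not clinical doses
    "drug_a": 0.1,
    "drug_b": 1.0,
}

def precompute_table(weights_kg):
    """Build a weight -> {drug: total dose in mg} table ahead of time."""
    return {
        w: {drug: per_kg * w for drug, per_kg in PLACEHOLDER_MG_PER_KG.items()}
        for w in weights_kg
    }

TABLE = precompute_table([5, 10, 15, 20])  # done calmly, in advance

# At the bedside: no arithmetic, just a lookup.
print(TABLE[10]["drug_a"])  # 1.0
```

The multiplication happens once, ahead of time; under pressure, nothing is calculated.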
Take the time to recognize where system mistakes occur and
fix them in advance. Make your job
simpler, so you don’t need to rely on System 2 while under pressure. Don’t get me wrong: System 2 is not the problem. System 2 is what allows us to calculate the
trajectory of a probe we’re shooting at Pluto, after all. But take the strengths of System 1,
especially the speed of System 1, and combine that with the strengths of System
2. You do that by working out your
actions in advance.
*Yeah, I wish he had come up with more descriptive system
names, too…