Navigating Through Ambiguity – Surgeons and Submarine Training
Posted by danspira
In a recent commencement address, “Failure and Rescue,” (ht RF) author Atul Gawande discusses moments when people are confronted with something going terribly and unavoidably wrong. Borrowing from his experience as a physician, he tells the story of a patient whose surgery led to an unexpected complication, and how that case was emblematic of what is sometimes done well — and sometimes done not so well — in the field of medicine.
Gawande says what separates the best health care institutions from the rest is not a lower track record of post-surgery complications. Rather, it is their ability to recognize and deal with those complications — a lower incidence of “failure to rescue” — that distinguishes them.
He writes (and I emphasize in bold) —
This may in fact be the real story of human and societal improvement. We talk a lot about “risk management”—a nice hygienic phrase. But in the end, risk is necessary. Things can and will go wrong. Yet some have a better capacity to prepare for the possibility, to limit the damage, and to sometimes even retrieve success from failure.
When things go wrong, there seem to be three main pitfalls to avoid, three ways to fail to rescue. You could choose a wrong plan, an inadequate plan, or no plan at all. Say you’re cooking and you inadvertently set a grease pan on fire. Throwing gasoline on the fire would be a completely wrong plan. Trying to blow the fire out would be inadequate. And ignoring it—“Fire? What fire?”—would be no plan at all.
The sooner you’re able to see clearly that your best hopes and intentions have gone awry, the better. You have more room to pivot and adjust. You have more of a chance to rescue.
But recognizing that your expectations are proving wrong—accepting that you need a new plan—is commonly the hardest thing to do. We have this problem called confidence. To take a risk, you must have confidence in yourself. In surgery, you learn early how essential that is. You are imperfect. Your knowledge is never complete. The science is never certain. Your skills are never infallible. Yet you must act. You cannot let yourself become paralyzed by fear.
Yet you cannot blind yourself to failure, either. Indeed, you must prepare for it. For, strangely enough, only then is success possible.
So you will take risks, and you will have failures. But it’s what happens afterward that is defining. A failure often does not have to be a failure at all. However, you have to be ready for it—will you admit when things go wrong? Will you take steps to set them right?—because the difference between triumph and defeat, you’ll find, isn’t about willingness to take risks. It’s about mastery of rescue.
Actually, what Gawande is describing here is risk management — hygienically phrased or not.
Risk is merely deviation from expectation, and risk management is the art and science of dealing with that deviation. Gawande wants us to…
- anticipate potential risks
- listen for (and be open to) data that challenges our assumptions
- take swift action to determine the best possible course of action
- consult the advice of specialists and other experts
- maintain a sense of urgency
This connects nicely to something that Fred G once mentioned to me back in 2008 (when I started this post… yes, this is a longstanding draft that has been gathering digital dust… thank you Atul Gawande and RF for the mental kick). Fred recommended I look into how submariners are trained, and how their thinking and practice differ from other areas of the military. The big idea there was that some people have a knack for learning on the fly and for being successful in unfamiliar territory.
Rime of the Modern Submariner
Submarine training is all about navigating unfamiliar territory and dealing with things as they come. It’s not about being so smart that you know everything that’s going on and what to do about it. Rather, it’s about being smart enough to deal with the fact that you don’t know everything… and have the skills to do something about it.
Could certain kinds of education enhance that ability?
Following Fred’s suggestion, I went off and scoured the Interwebs for what I could find on this topic. Here are a couple of articles I found of interest:
Learning C++ “Submarine Style”: a case study
Information Systems & Decision Sciences Dept., University of South Florida, Tampa, FL, USA
IEEE Transactions on Education, Feb. 2005, Vol. 48, Issue 1, pp. 150–156
Digital Object Identifier: 10.1109/TE.2004.837044
This case study describes a successful introductory course in C++ with a design that draws extensively upon techniques used in the training of nuclear submarine personnel. Techniques adopted include emphasis on completion of practical exercises as opposed to concept mastery, self-paced learning based on extensive materials prepared for the course, use of oral examinations to validate student achievement, use of undergraduate teaching assistants to assist and examine students, and a strong peer-learning focus with group collaboration being actively encouraged. Over the two-year period during which the course evolved, substantial increases in completion rates and in the amount of material covered have been experienced. In addition, certain elements of the course design — particularly the emphasis on group work, use of online support, and use of “state-of-the-art” tools — seem more consistent with current programming practice than the conventional programming course, emphasizing lectures and completion of individual assignments.
Finding a funding fit
Three psychologists garner off-the-beaten-track research support.
By Sadie F. Dingfelder
APA – Monitor on Psychology, June 2005, Vol. 36, No. 6, print version page 22
Psychologist Wayne Gray, PhD, also seeks to solve practical problems while furthering psychological science. In search of such a question, Gray phoned Susan S. Kirschenbaum, PhD, an engineering psychologist at the Naval Undersea Warfare Center Division in Rhode Island, and learned that the Navy is designing a new fleet of submarines. The new submarines may include a computer workstation for the commander, which means the Navy needs to know how commanders seek out and process information, Gray says. So the psychologist proposed to find out by running Navy officers through a submarine simulation and analyzing their decision-making processes.
“As an effort to do cognitive modeling in a complex, Navy-relevant task domain, the proposal was obviously attractive,” says Susan Chipman, PhD, manager of the cognitive science program at ONR.
The task? How submarine commanders solve the common but thorny problem of locating enemy submarines.
“What they are doing is like playing hide-and-seek in a dark warehouse,” Gray says. “You don’t turn on your flashlight to find the other players because then they will see where you are.”
Instead, says Gray, commanders work with their crew to “listen” quietly to the ocean’s sounds through passive sonar, attempting to distinguish vibrations of an enemy ship’s engine from those of passing merchant vessels.
Gray and his collaborators sought to create a cognitive model for submarine commanders’ problem-solving by putting experienced commanders through a simulation and asking them to think out loud. The researchers found that the commanders develop a mental map of what information they are missing — for example, they might know an enemy ship’s position but not its speed or direction of movement. The commanders then attempt to fill in that information using small strategies, such as turning their own submarine to better hear enemy-ship noise.
“It sounds simple, but they approached the problem unlike any experts I have ever studied,” says Gray. Chess players–a common subject for such research–make decisions in a world of perfect knowledge, he says. This allows them to think several steps ahead and develop long-term plans. In contrast, submarine commanders spend most of their time trying to find basic information, Gray notes. And they frequently discard strategies and start over if they sense that the state of the world has changed–that the enemy submarine has moved, for example.
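Gray’s description of the commanders’ approach (a map of what is known and unknown, small actions chosen to fill the biggest gaps, and a willingness to discard the whole picture when the world changes) reads almost like an algorithm. Here is a minimal toy sketch of that loop; every attribute and action name below is a hypothetical illustration, not anything from the actual study:

```python
# Toy sketch of a gap-filling decision loop: track what you know about a
# contact, choose the action that resolves the most unknowns, and throw
# the whole picture away when the world appears to have changed.

ATTRIBUTES = ["bearing", "range", "speed", "heading"]

# Hypothetical actions, each able to resolve certain unknown attributes.
ACTIONS = {
    "turn_to_listen": {"bearing"},
    "track_over_time": {"range", "speed"},
    "compare_bearings": {"heading"},
}

def pick_action(known):
    """Choose the action that resolves the most still-unknown attributes."""
    unknown = {a for a in ATTRIBUTES if a not in known}
    if not unknown:
        return None  # full picture: ready to plan several steps ahead
    return max(ACTIONS, key=lambda act: len(ACTIONS[act] & unknown))

def decision_loop(observations):
    """Fill the mental map from (attribute, value, world_changed) tuples."""
    known = {}
    trace = []
    for attribute, value, world_changed in observations:
        if world_changed:
            known.clear()  # discard the old picture and start over
        trace.append(pick_action(known))
        known[attribute] = value
    return known, trace

contact, actions_taken = decision_loop([
    ("bearing", 45.0, False),
    ("range", 9000, False),
    ("heading", 270, True),   # the contact maneuvered: start over
])
```

The `known.clear()` on a changed world is the point of the sketch: unlike a chess engine, the loop never assumes its earlier picture still holds, mirroring the commanders’ habit of discarding strategies mid-stream.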
Navy officials won’t comment on how, specifically, they used the findings of the research. However, Gray notes that in addition to developing a new cognitive model, he also gained insight into how to study an unusual group of experts. Like the commanders, Gray found that he had to drop old theories of decision-making as data accumulated that didn’t fit his schema.
If we merge the above excerpts into the Gawande piece, we start to develop the following competency framework for people who have to deal with ambiguous, high-stakes situations:
- anticipates potential risks
- listens for (and is open to) data that challenges previous assumptions
- is willing to be wrong
- takes swift action to determine the best possible course of action
- consults the advice of specialists and other experts
- maintains a sense of urgency
- retains multiple variables in working memory while filling information gaps
As for the part about having a willingness to trust experts, there is also the skill of knowing how to speak to experts, as well as having a broad enough knowledge of their various specialized fields and tools to be able to consult with them as efficiently as possible.
Few of us know what kinds of problems we’ll be working on 5 to 10 years from now… and we’ll likely be working on those problems using tools and expertise that currently don’t exist. Given that the pace of change in all fields isn’t slowing down, it seems like we could all use some submarine training.
About danspira
My blog is at: http://danspira.com. My face in real life appears at a higher resolution, although I do feel pixelated sometimes.
Posted on June 7, 2012, in Business, Career, Learning, Management, Metaphors, Risk Management and tagged Business, career, choices, decision making, learning, life, Management, metaphors, navy, Psychology.