Navigating Through Ambiguity – Surgeons and Submarine Training


In a recent commencement address, “Failure and Rescue” (ht RF), author Atul Gawande discusses what happens when people are confronted with a situation going terribly and unavoidably wrong. Borrowing from his experience as a physician, he tells the story of a patient whose surgery led to an unexpected complication, and how that case was emblematic of what is sometimes done well — and sometimes done not so well — in the field of medicine.

Gawande says what separates the best health care institutions from the rest is not a lower rate of post-surgery complications. Rather, it is their ability to recognize and deal with those complications — a lower incidence of “failure to rescue” — that distinguishes them.

He writes (and I emphasize in bold):

This may in fact be the real story of human and societal improvement. We talk a lot about “risk management”—a nice hygienic phrase. But in the end, risk is necessary. Things can and will go wrong. Yet some have a better capacity to prepare for the possibility, to limit the damage, and to sometimes even retrieve success from failure.

When things go wrong, there seem to be three main pitfalls to avoid, three ways to fail to rescue. You could choose a wrong plan, an inadequate plan, or no plan at all. Say you’re cooking and you inadvertently set a grease pan on fire. Throwing gasoline on the fire would be a completely wrong plan. Trying to blow the fire out would be inadequate. And ignoring it—“Fire? What fire?”—would be no plan at all.

(…)

The sooner you’re able to see clearly that your best hopes and intentions have gone awry, the better. You have more room to pivot and adjust. You have more of a chance to rescue.

But recognizing that your expectations are proving wrong—accepting that you need a new plan—is commonly the hardest thing to do. We have this problem called confidence. To take a risk, you must have confidence in yourself. In surgery, you learn early how essential that is. You are imperfect. Your knowledge is never complete. The science is never certain. Your skills are never infallible. Yet you must act. You cannot let yourself become paralyzed by fear.

Yet you cannot blind yourself to failure, either. Indeed, you must prepare for it. For, strangely enough, only then is success possible.

(…)

So you will take risks, and you will have failures. But it’s what happens afterward that is defining. A failure often does not have to be a failure at all. However, you have to be ready for it—will you admit when things go wrong? Will you take steps to set them right?—because the difference between triumph and defeat, you’ll find, isn’t about willingness to take risks. It’s about mastery of rescue.

Actually, what Gawande is describing here is risk management — hygienically phrased or not.

Risk is merely deviation from expectation, and risk management is the art and science of dealing with that deviation. Gawande wants us to…

  • anticipate potential risks
  • listen for (and be open to) data that challenges our assumptions
  • take swift action to determine the best possible course of action
  • consult the advice of specialists and other experts
  • maintain a sense of urgency
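
A quick aside on that definition of risk as deviation from expectation: here is a minimal, purely hypothetical sketch in Python (the plans and numbers below are invented for illustration and are not from Gawande’s talk) showing how two plans can share the same expected outcome while differing wildly in how far results can stray from it.

    # Toy, purely hypothetical illustration of "risk as deviation from expectation".
    # The plans and numbers below are invented for this sketch, not taken from Gawande.
    from statistics import mean, pstdev

    plan_a = [78, 80, 82, 79, 81]    # steady outcomes: small deviation from expectation
    plan_b = [100, 95, 40, 90, 75]   # same average outcome, with an occasional disaster

    for name, outcomes in [("Plan A", plan_a), ("Plan B", plan_b)]:
        expectation = mean(outcomes)     # what we expect, on average
        deviation = pstdev(outcomes)     # "risk" in this toy framing
        print(f"{name}: expectation = {expectation:.0f}, deviation = {deviation:.1f}")

    # Prints (approximately):
    #   Plan A: expectation = 80, deviation = 1.4
    #   Plan B: expectation = 80, deviation = 21.7

Both plans promise the same result on average; what distinguishes them is how badly things can deviate from that promise, and that gap is exactly where the capacity to rescue (limiting the damage when the bad case shows up) earns its keep.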

This connects nicely to something that Fred G once mentioned to me back in 2008 (when I started this post… yes, this is a longstanding draft that has been gathering digital dust… thank you Atul Gawande and RF for the mental kick). Fred recommended I look into how submariners are trained, and how their thinking and practice differ from those of other areas of the military. The big idea there was that some people have a knack for learning on the fly and for being successful in unfamiliar territory.

Rime of the Modern Submariner

Submarine training is all about navigating unfamiliar territory and dealing with things as they come. It’s not about being so smart that you know everything that’s going on and what to do about it. Rather, it’s about being smart enough to deal with the fact that you don’t know everything… and have the skills to do something about it.

Could certain kinds of education enhance that ability?

Following Fred’s suggestion, I went off and scoured the Interwebs for what I could find on this topic. Here are a couple of articles I found of interest:



Learning C++ “Submarine Style”: a case study
T. G. Gill, Information Systems & Decision Sciences Dept., University of South Florida, Tampa, FL, USA
IEEE Transactions on Education, Vol. 48, No. 1 (Feb. 2005), pp. 150–156
ISSN: 0018-9359. DOI: 10.1109/TE.2004.837044

Abstract
This case study describes a successful introductory course in C++ with a design that draws extensively upon techniques used in the training of nuclear submarine personnel. Techniques adopted include emphasis on completion of practical exercises as opposed to concept mastery, self-paced learning based on extensive materials prepared for the course, use of oral examinations to validate student achievement, use of undergraduate teaching assistants to assist and examine students, and a strong peer-learning focus with group collaboration being actively encouraged. Over the two-year period during which the course evolved, substantial increases in completion rates and the amount of material that is covered have been experienced. In addition, certain elements of the course design – particularly the emphasis on group work, use of online support, and use of “state-of-the-art” tools – seem more consistent with current programming practice than the conventional programming course, emphasizing lectures and completion of individual assignments.

http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?tp=&arnumber=1393116&isnumber=30321



Finding a funding fit
Three psychologists garner off-the-beaten-track research support
Sadie F. Dingfelder, APA Monitor on Psychology, June 2005, Vol. 36, No. 6 (print version: page 22)

(…)

Psychologist Wayne Gray, PhD, also seeks to solve practical problems while furthering psychological science. In search of such a question, Gray phoned Susan S. Kirschenbaum, PhD, an engineering psychologist at the Naval Undersea Warfare Center Division in Rhode Island, and learned that the Navy is designing a new fleet of submarines. The new submarines may include a computer workstation for the commander, which means the Navy needs to know how commanders seek out and process information, Gray says. So the psychologist proposed to find out by running Navy officers through a submarine simulation and analyzing their decision-making processes.

“As an effort to do cognitive modeling in a complex, Navy-relevant task domain, the proposal was obviously attractive,” says Susan Chipman, PhD, manager of the cognitive science program at ONR [the Office of Naval Research].

The task? How submarine commanders solve the common, but thorny problem of locating enemy submarines.

“What they are doing is like playing hide-and-seek in a dark warehouse,” Gray says. “You don’t turn on your flashlight to find the other players because then they will see where you are.”

Instead, says Gray, commanders work with their crew to “listen” quietly to the ocean’s sounds through passive sonar, attempting to distinguish vibrations of an enemy ship’s engine from those of passing merchant vessels.

Gray and his collaborators sought to create a cognitive model for submarine commanders’ problem-solving by putting experienced commanders through a simulation and asking them to think out loud. The researchers found that the commanders develop a mental map of what information they are missing – for example, they might know an enemy ship’s position but not speed or direction of movement. The commanders then attempt to fill in that information using small strategies, such as turning their own submarine to better hear enemy-ship noise.

“It sounds simple, but they approached the problem unlike any experts I have ever studied,” says Gray. Chess players – a common subject for such research – make decisions in a world of perfect knowledge, he says. This allows them to think several steps ahead and develop long-term plans. In contrast, submarine commanders spend most of their time trying to find basic information, Gray notes. And they frequently discard strategies and start over if they sense that the state of the world has changed – that the enemy submarine has moved, for example.

Navy officials won’t comment on how, specifically, they used the findings of the research. However, Gray notes that in addition to developing a new cognitive model, he also gained insight into how to study an unusual group of experts. Like the commanders, Gray found that he had to drop old theories of decision-making as data accumulated that didn’t fit his schema.

http://www.apa.org/monitor/jun05/fit.html



If we merge the above excerpts into the Gawande piece, we start to develop the following competency framework for people who have to deal with ambiguous, high-stakes situations:

  • anticipates potential risks
  • listens for (and is open to) data that challenges previous assumptions
  • is willing to be wrong
  • takes swift action to determine the best possible course of action
  • consults the advice of specialists and other experts
  • maintains a sense of urgency
  • retains multiple variables in working memory while filling information gaps

As for the part about having a willingness to trust experts, there’s also the skill of knowing how to speak to experts, as well as having a broad enough knowledge of the experts’ various specialized fields and tools to be able to consult with them as efficiently as possible.

Few of us know what kinds of problems we’ll be working on 5 to 10 years from now… and we’ll likely be working on those problems using tools and expertise that currently don’t exist. Given that the pace of change in all fields isn’t slowing down, it seems like we could all use some submarine training.



Posted on June 7, 2012, in Business, Career, Learning, Management, Metaphors, Risk Management.

Comments:

  1. See: http://query.nytimes.com/gst/fullpage.html?res=9806E7D71330F932A35753C1A9639C8B63

    Game Theory: POKER; Bluffing and the Royal Flush of Cold Warfare

    By James McManus

    Published: October 1, 2005

    Nicholas D. Kristof recently wrote on the Op-Ed page of this newspaper about North Korean power plants that may be capable of producing weapons-grade nuclear materials: “It’s possible that North Korea is bluffing or is resuming construction only to have one more card to negotiate away.”

    Kristof naturally assumed his readers would understand what a bluff was, and how bluffs might fail or succeed. For thousands of years before poker was invented, bluffs, counterbluffs and the ability to deduce opponents’ intentions and strengths from contradictory signals were at the heart of most tribes’ and countries’ defense tactics. (According to the Random House Dictionary of the English Language, the German word bluffen, to bluster or frighten, is related to the Dutch bluffen, to make a trick at cards. The English version, apparently combining both meanings, seems to have first appeared around 1665.)

    Beginning in the early 19th century, the importance of bluffing in military and other affairs spurred the new game’s development and popularity. No other pastime so perfectly captured the essence of this deceitful yet often lifesaving tactic — of making someone believe you will fight to the death, for example, without having to shed any blood.

    By the middle of the 20th century, with the nuclear arms race neck and neck, two brilliant Princeton professors helped the United States pull ahead of the Soviet Union. The economist Oskar Morgenstern served as a close adviser to President Dwight D. Eisenhower, and a math whiz named John von Neumann made vital contributions to the Manhattan Project, information theory and computer technology. Perhaps most important, both men provided deep mathematical insight into the nature of bluffing when they wrote “Theory of Games and Economic Behavior” in 1944.

    Their 648-page magnum opus was a groundbreaking model of economic and social organization, based on a theory of games of strategy. Almost overnight, it revolutionized not only economics and defense strategy but also created out of whole cloth a new field of scientific inquiry — game theory — that was eventually used to analyze presidential candidates, baseball salaries, poker hands and nuclear stalemates.

    As Morgenstern wrote later: “The cold war is sometimes compared to a giant chess game between the United States and the Soviet Union, and Russia’s frequent successes are sometimes attributed to the national preoccupation with chess. The analogy, however, is quite false, for while chess is a formidable game of almost unbelievable complexity, it lacks salient features of the political and military struggles with which it is compared.”

    Since chess is a game of complete information, it offers no opportunities to bluff, which leaves it “far removed from political reality where the threatening nation has to weigh the cost not only to its enemies, but to itself, where deceit is certainly not unheard of, and where chance intervenes.”

    Such elements were basic to poker, which Morgenstern called “a game of wile and artifice” and used all caps to emphasize that “THE BEST HAND NEED NOT WIN.” Consistent winners, he wrote, “rely on their ability to perceive opportunities offered by each changing situation, and on artful deception through bluffing.”

    He conceded that chess might be a more moral game but insisted that poker tactics were more useful when “countries with opposing aims and ideals watch each other’s move with unveiled suspicion.” He concluded, “If chess is the Russian national pastime and poker is ours, we ought to be more skillful than they in applying its precepts.”

    Inspired by his poker games with the military brass during the Manhattan Project, von Neumann fine-tuned game theory as a model of how potentially deceitful countries interact when they have opposing interests. Zero-sum games of complete information no longer interested him. He found poker more lifelike, its tactics gratifyingly similar to those deployed by generals and presidents. Indeed, this was probably what gave the six-tenths-of-a-gram plastic cards their uncanny weight in his hand in the first place — not unlike plutonium 239, the royal flush of cold warfare elements and the one we may still have to answer for.

    In spite of its name, which sounds like fun, game theory is an unplayful branch of mathematics in which naked self-interest determines every decision. It was used to determine how much to spend on conventional forces versus ICBM’s, which city to nuke in retaliation for a blitz of West Berlin, whether to bomb Cuban missile sites or blockade the whole island. The discipline’s ruthless expediency also made it the perfect tool for understanding poker strategy, and vice versa.

    Von Neumann intuitively understood that poker was distilled competition, a less deadly version of warfare. The best strategy involves probability, psychology, luck and budgetary acumen but is never transparent; it depends on the counterstrategies deployed by the enemy.

    Expert players misrepresent their hands, simulate irrational behavior, use surprise to intimidate and deploy other mind games to confuse their opponents. When Kim Jong Il rattles his rickety but nuclear-tipped saber, he’s trying to convince his enemies that he’s willing to set them all-in in the ultimate no-limit showdown.

    Photos: Kim Jong Il, left, of North Korea seems to have taken to heart lessons in John von Neumann and Oskar Morgenstern’s work on game theory. (Photo by Korea News Service via Reuters)

  2. i’ve purposely been taking my time to read this one.. as it touches on a subject dear to me.

    gawande makes a very good point re: confidence. it’s a tricky situation: confidence is normally understood as the conscious (or more likely subconscious?) underestimation of potential risks. it’s why optimists do well in life. (this is well referenced)

    however, cognitive sciences are telling us we fall into quite a few cognitive pitfalls when we are confident. one of them is our inability to “see” failure. (confirmation bias)

    it takes a very delicate balance of accepting failure and understanding probability theory to be able to risk manage effectively without losing confidence. it’s easier said than done. it turns out our cognitive biases are not fixable. at least not by ourselves. maybe a solution is pairing up and mutual criticism (in a surgery setting) where one would keep track of the other’s cognitive pitfalls.

    submarine training is not really a good comparison with surgery. the submariner is constantly developing his intuition by immediate feedback. you turn.. the sound changes.. and eventually, you figure out if there was really a ship or not, within a fairly short period of time.

    the surgeons, dealing with acute surprises, have to make quick decisions, and act like submariners.. and train their instincts.. however, it becomes a whole different issue when it’s about long-term outcome.

    “which course of action would have had the better long-term outcome?” is something that we can’t train our instincts to answer. we fool ourselves instead. this lack of quick feedback messes surgeons up (or anyone who is supposed to be doing outcome-sensitive procedures), because sometimes (i would argue often) our determinants of short-term success are also our determinants of long-term failure. it takes a damn good surgeon to live with this.
