Monday, April 20, 2015

Coping With Failures

Bridge collapse, 1940, Tacoma, Washington
(University of Washington Libraries, Special Collections)


Lucy Suchman opens her book, Plans and Situated Actions: The Problem of Human-Machine Communication, by comparing the different navigation methods employed by European and Trukese navigators. “The European navigator begins with a plan – a course – which he has charted according to certain universal principles, and he carries out his voyage by relating his every move to that plan. His effort throughout his voyage is directed to remaining ‘on course.’ If unexpected events occur, he must first alter the plan, then respond accordingly.”

The Trukese navigator, on the other hand, “…begins with an objective rather than a plan. He sets off towards the objective and responds to conditions as they arise in an ad hoc fashion.” The Trukese navigator steers according to information provided by the wind, tide, stars, clouds, and the sound of water on the side of his boat. While his objective is clear from the outset, his actual course is contingent on unique circumstances that he cannot anticipate in advance. “His effort is directed to doing whatever is needed to reach the objective. If asked, he can point to his objective at any moment, but he cannot describe his course.”

Suchman asserts that the European navigator exemplifies the prevailing cognitive science model of purposeful action and “… the view that purposeful action is determined by plans is deeply rooted in the Western human sciences as the correct model of the rational actor… My own contention, however, is that as students of human action, we ignore the Trukese navigator at our peril.”1

The following two stories demonstrate how adopting a “navigation philosophy” (either the Trukese navigator’s or the European’s) affects how one copes with failures during project implementation. The first story is told by David Panhorst, an R&D manager at the Armament Research, Development and Engineering Center (ARDEC):

…I managed the development of the Army’s first smart tank bullet. The projectile’s built-in sensor searches the ground for targets. Upon locating a target, the sensor aims and triggers a warhead to hit and defeat the target from above, where it is most vulnerable. This fly-over, shoot-down technique differs from the hit-to-defeat method used by current tank ammunition.

The first two iterations of fin design tests failed miserably. Since David had budgeted for only two iterations, any further tests would drive up the project’s cost. To contain cost growth, David authorized additional test objectives as part of the fin retest, so that each retest would also serve other planned tests. The problem with stacking so many test objectives, however, was that each time the fin failed, the data for all subsequent test objectives were lost. Nevertheless, the team pressed on, adding bells and whistles to each successive redesign test in the hope of gathering enough data on the other subsystems.

It wasn’t until we stopped and regrouped that we were able to refocus on the original objective and concentrate on the fin deployment problem, which, after all, is the first thing that needs to happen. By refocusing our efforts on the primary objective, we fixed the problem and developed a fin that opened reliably and predictably.  

Following this project, David reflected: As children, the first lesson we learned when we were introduced to ball sports, such as baseball or tennis, was “Keep your eye on the ball.” In project management, it is no different: you must maintain project focus. In our case, we should have focused on validating the project concept rather than on minimizing test cost.2

Terry Little from the U.S. Air Force suggests a completely different way of coping with such failures during project implementation:

We had a test where a warhead didn’t go off. The flight of the cruise missile was perfect, but the warhead didn’t go off. You could look at this and say, “Well that’s just one little piece of it.” But if the warhead doesn’t go off, you don’t have a weapon.

In this case, we quickly determined that there was a design problem with the fuse, and we moved on to have a successful retest—but not before we mined every bit of information that we possibly could out of that failure.

Whenever there is a failure, the first thing to do is to go through a short grieving period. On JASSM [Joint Air-to-Surface Standoff Missile], whenever we had a failure, I allowed grieving for one day. We could grieve and mope, get drunk, wring our hands, say “ain’t it awful.” We could do that for one day, and then it was time to put it behind us. That’s a Terry Little rule.

… When you have a problem on a project, all of a sudden people want to know, “What’s going on? What’s the problem?” Most project managers want those questions to go away, so there is a tendency to jump straight to a solution and respond, “Don’t worry, we’ve got this in hand. It was just one of those things.”

What you need to do is dig until you get to the root cause of the problem, until you are certain that you understand why this failure happened. Yes, it takes a little longer. Yes, it costs more money. In my case, a week to get to the root cause of a problem is $4 or $5 million. But you’ve got to do it, so that when you move forward you know that you have learned from the failure. A lot of times what caused it will be a gnat, but you should still kill it with a sledgehammer and smash it to bits until you are convinced that, though you may have other failures in the future, that particular one will never occur again—ever. You can’t move forward from a failure if you’re worried about repeating the same mistake.3

These two stories present two strikingly different management philosophies. Following the European navigator’s approach, David Panhorst was too concerned with the original test plan (which called for only two iterations of design tests), rather than focusing on the objective of the project (“to develop a fin that opened reliably and predictably”). More importantly, he was reluctant to modify the original plan. In contrast, Terry Little quickly modified his plan in order to spend the necessary time mining “every bit of information” from the failure. Like the Trukese navigator, Little’s primary focus was on the objective: “… if the warhead doesn’t go off, you don’t have a weapon.” In so doing, Little demonstrated that a focus on objectives requires a readiness to modify the plans leading to those objectives!
1 L. Suchman. 1987. Plans and Situated Actions: The Problem of Human-Machine Communication. Cambridge, UK: Cambridge University Press, vii. Suchman refers to a story, quoted by G. Berreman, which was written by T. Gladwin in 1964. G.D. Berreman. 1966. Anemic and Emetic Analyses in Social Anthropology. American Anthropologist 68, 2: 346-54. T. Gladwin. 1964. Culture and Logical Process. In Explorations in Cultural Anthropology: Essays in Honor of George Peter Murdock, ed. W.H. Goodenough. New York, NY: McGraw-Hill.
2 “Keep Your Eye on the Ball,” David Panhorst, ARDEC. 2000. In A. Laufer and E.J. Hoffman, Project Management Success Stories, 131-33. New York, NY: John Wiley & Sons.
3 “A Terry Little Rule,” Terry Little, U.S. Air Force. 2005. In A. Laufer, T. Post, and E.J. Hoffman, Shared Voyage: Learning and Unlearning from Remarkable Projects, 123-24. Washington, DC: The NASA History Series.
