Monday, May 18, 2015

Playing it safe -- or not




In his book Augustine’s Laws, Norman Augustine discusses the problems of process-oriented cultures, among them too many regulations and the dangers of “playing it safe.” One of the arguments that Augustine presents against the growth of regulations is: “The fallacy in using regulations to prevent problems is that if managers could ignore the old regulation, they can ignore the new one, too.”  Still, Augustine asserts that “Regulations grow at the same rate as weeds,” as demonstrated in the following example: “… in 1946 the US Atlantic fleet was comprised of 778 ships and sailed under regulations contained in a 72-page pamphlet. In contrast, today’s Atlantic Fleet may only have 297 ships, but it is well equipped with regulations—308 pages of them.”1
Terry Little, the Director of the Air Force Acquisition Center of Excellence, believes that in today’s dynamic environment we must unlearn the old way of thinking, and in particular, we must stop “playing it safe.”
A lot of processes that we have are built on lack of trust. When you hand somebody an 11-page specification rather than a 100-page document, however, you are sending a clear signal that you trust them to do the right thing. My belief is that, as an individual project manager, you can go a long way in that direction by starting not with the notion that someone has to earn your trust, but with the presumption that they’re trustworthy until proven otherwise. It allows things like an 11-page specification. My biggest disappointment in the past has been when I have given project managers the opportunity to innovate, and they don’t know what to do with it. They demand processes, rigidity, templates, and prescriptions. It is as if you give them a blank check and they write it for a dollar. What you’ve got to do is to “unlearn”… all of our processes that are not oriented toward speed or credibility, but are oriented toward not making a mistake, playing it safe.2
As demonstrated by Scott Cameron, the Global Process Owner of Project Management at Procter & Gamble, successful project managers do not “play it safe.”  
I have noted during my career that there is a never-ending number of rules and restrictions forced upon project managers under the guise of helping them “be successful” in managing their projects. It appears to be a one-way street; many regulations are added, but few (if any) are removed. I had the opportunity to help clean out such a closet [of standard procedures] as a member of a project management leadership team.
Scott reports that although the “cleaning out” required three consecutive review cycles, eventually they sharply cut the number of standard procedures.  Instead of 18 technical standards and 32 standard operating procedures, project managers at Procter & Gamble are now employing only four of each. 
The project management community was delighted with these reductions and felt empowered by them. It gave them more flexibility to manage their projects and develop their own personal management style. The streamlining process enabled us to reduce the effort, costs and time required to maintain these standards.3
Developing a successful project culture requires learning to trust and unlearning the “play it safe” approach.

1.  Augustine, N. (1986). Augustine’s Laws. New York, NY: Viking Penguin, 101-7.

2.  “Speed Merchants: A conversation with Scott Cameron and Terry Little” (2003). Ask Magazine, 11 (March): 26-9. http://appel.nasa.gov/ask/about/overview/index.html

3.  Cameron, W. S. (2004). “Cleaning Out the Closet,” Procter & Gamble, Ask Magazine, 20 (November): 19-21. http://appel.nasa.gov/ask/about/overview/index.html


Monday, April 20, 2015

Coping With Failures

Bridge collapse, 1940, Tacoma, Washington
(University of Washington Libraries, Special Collections)


Lucy Suchman opens her book, Plans and Situated Actions: The Problem of Human-machine Communication, by comparing different navigation methods employed by European and Trukese navigators. “The European navigator begins with a plan – a course – which he has charted according to certain universal principles, and he carries out his voyage by relating his every move to that plan. His effort throughout his voyage is directed to remaining ‘on course.’ If unexpected events occur, he must first alter the plan, then respond accordingly.”

The Trukese navigator, on the other hand, “…begins with an objective rather than a plan. He sets off towards the objective and responds to conditions as they arise in an ad hoc fashion.” The Trukese navigator steers according to information provided by the wind, tide, stars, clouds, and the sound of water on the side of his boat. While his objective is clear from the outset, his actual course is contingent on the unique circumstances that he cannot anticipate in advance. “His effort is directed to doing whatever needed to reach the objective. If asked, he can point to his objective at any moment, but he cannot describe his course.”

Suchman asserts that the European navigator exemplifies the prevailing cognitive science model of purposeful action and “… the view that purposeful action is determined by plans is deeply rooted in the Western human sciences as the correct model of the rational actor… My own contention, however, is that as students of human action, we ignore the Trukese navigator at our peril.”1

The following two stories demonstrate how adopting the “navigation philosophy” (either of the Trukese’s or the European’s) affects how one copes with failures during project implementation. The first story is told by David Panhorst, an R&D manager at Armament Research, Development and Engineering Center (ARDEC):    

…I managed the development of the Army’s first smart tank bullet. The projectile’s built-in sensor searches the ground for targets. Upon locating a target, the sensor aims and triggers a warhead to hit and defeat the target from above, where it is most vulnerable. This fly-over, shoot-down technique differs from the hit-to-defeat method used by current tank ammunition.

The first two iterations of fin design tests failed miserably. Since David had budgeted only for two iterations, more tests would have resulted in higher costs for the project. To contain cost growth, David authorized additional test objectives as part of the fin retest. The problem with having so many layers of test objectives, however, was that with each failure, all subsequent test data were lost. Nevertheless, the team pressed on and continued to add bells and whistles to each successive redesign test until they had gathered enough data on other subsystems to solve the problem.

It wasn’t until we stopped and regrouped that we were able to refocus on the original objective and concentrate on the fin deployment problem, which, after all, is the first thing that needs to happen. By refocusing our efforts on the primary objective, we fixed the problem and developed a fin that opened reliably and predictably.  

Following this project, David reflected: As children, the first lesson we learned when we were introduced to ball sports, such as baseball or tennis, was “Keep your eye on the ball.” In project management, it is no different: you must maintain project focus. In our case, we should have focused on validating the project concept rather than minimizing test cost.2
Terry Little from the U.S. Air Force suggests a completely different way of coping with such failures during project implementation:

We had a test where a warhead didn’t go off. The flight of the cruise missile was perfect, but the warhead didn’t go off. You could look at this and say, “Well that’s just one little piece of it.” But if the warhead doesn’t go off, you don’t have a weapon.

In this case, we quickly determined that there was a design problem with the fuse, and we moved on to have a successful retest—but not before we mined every bit of information that we possibly could out of that failure.

Whenever there is a failure, the first thing to do is to go through a short grieving period. On JASSM [Joint Air-to-Surface Standoff Missile], whenever we had a failure, I allowed grieving for one day. We could grieve and mope, get drunk, wring our hands, say “ain’t it awful.” We could do that for one day, and then it was time to put it behind us. That’s a Terry Little rule.

… When you have a problem on a project, all of a sudden people want to know, ‘What’s going on? What’s the problem?’ Most project managers want those questions to go away, so there is a tendency to want to jump to the solution and respond, “Don’t worry, we’ve got this in hand. It was just one of those things.”

What you need to do is dig until you get to the root cause of the problem, until you are certain that you understand why this failure happened. Yes, it takes a little longer. Yes, it costs more money. In my case, a week to get to the root cause of a problem is $4 or $5 million. But you’ve got to do it, so that when you move forward you know that you have learned from the failure. A lot of times what caused it will be a gnat, but you should still kill it with a sledgehammer and smash it to bits until you are convinced that, though you may have other failures in the future, that particular one will never occur again—ever. You can’t move forward from a failure if you’re worried about repeating the same mistake.3   
These two stories present two strikingly different management philosophies. Following the European navigator approach, David Panhorst was preoccupied with the original test plan (which called for only two iterations of design tests) rather than with the objective of the project (“to develop a fin that opened reliably and predictably”). More importantly, he was reluctant to modify the original plan. In contrast, Terry Little quickly modified the plan in order to spend the necessary time mining “every bit of information” from the failure. Like the Trukese navigator, Little kept his primary focus on the objective: “… if the warhead doesn’t go off, you don’t have a weapon.” In so doing, Little demonstrated that a focus on objectives requires a readiness to modify the plans leading to those objectives!
1 Suchman, L. (1987). Plans and Situated Actions: The Problem of Human-machine Communication. Cambridge, UK: Cambridge University Press, vii. Suchman refers to a story, quoted by G. Berreman, which was written by T. Gladwin in 1964. Berreman, G. (1966). “Anemic and Emetic Analyses in Social Anthropology.” American Anthropologist 68, 2: 346-54. Gladwin, T. (1964). “Culture and Logical Process.” In Explorations in Cultural Anthropology: Essays Presented to George Peter Murdock, ed. W. Goodenough. New York, NY: McGraw-Hill.
2 “Keep Your Eye on the Ball,” David Panhorst, ARDEC. 2000. In A. Laufer and E.J. Hoffman, Project Management Success Stories, 131-33. New York, NY: John Wiley & Sons.
3 “A Terry Little Rule,” Terry Little, U.S. Air Force. 2005.  In A. Laufer, T. Post, and E.J. Hoffman, Shared Voyage: Learning and Unlearning from Remarkable Projects, 123-24. Washington, DC: The NASA History Series.

Monday, March 23, 2015

The Impact of Project Collaboration



This month’s blog is adapted from a recent article entitled “What Successful Project Managers Do,” published in the March 2015 issue of the MIT Sloan Management Review. In addition to Alex Laufer and Jeff Russell, the co-authors are Edward J. Hoffman, NASA's Chief Knowledge Officer, and Scott W. Cameron, Global Project Management Technology Process Owner at Procter & Gamble.

In the article we demonstrated that the first role assumed by successful project managers in a variety of industries is to develop project collaboration. In today’s blog we present four examples that highlight the paramount impact of project collaboration.

Example A, The Power of Trust

NASA project manager Tim Flores examined three projects at the Jet Propulsion Laboratory (JPL): Mars Pathfinder, Mars Climate Orbiter, and Mars Polar Lander. Although all three projects had the same guiding principles, similar scope, and shared elements (including some team members), Pathfinder was a success while the other two failed.

Flores thought he’d find that the Pathfinder project stood out from the others in things like resources, constraints, and personnel. To some extent this was true, but what he found to be the primary difference was the level of collaboration. The Pathfinder team developed trusting relationships within an open culture. They felt free to make decisions without fearing severe punishment for mistakes. The other two project teams never developed such a high level of trust.

Example B, The Power of Openness

NASA’s Wide-Field Infrared Explorer (WIRE) mission was designed to study how galaxies formed and evolved. It carried a telescope so delicate that it was sealed inside a solid-hydrogen cryostat. Shortly after launch, the cryostat’s cover ejected prematurely and the spacecraft tumbled wildly through space. The entire mission failed.

Jim Watzin, a NASA project manager and a member of the WIRE team, commented in the official NASA after-action report: “WIRE failed because people could not or would not communicate well with each other…. Individuals simply were uncomfortable allowing others to see their work.” He added: “The real lesson of this loss is that any team member that does not participate as a true team player should be excused [from the project].”

Example C, The Power of Partnership

Allen, a payload manager in NASA’s Advanced Composition Explorer (ACE) project, developed trust between his team and 20 groups of instrument-developers using a three-stage plan. First, he picked team members who could operate in a university environment—they knew when to bend or even break the rules. Second, he permanently moved his team into a university environment, realizing it would provide a more open, flexible culture than the home-base environment. Third, he came up with an uncommon process for interacting with the scientist groups.

Allen anticipated that it would be difficult to get the scientists to regard his team as partners. Having dealt with NASA before, the scientists tended to believe Allen and his people would demand tons of paperwork and that things be done “just so.” Many of the scientists weren’t even sure they should share with Allen’s team the problems they were encountering.

Because Allen’s team had to review instrument development, he believed the best way to do this was to focus on building trust and convincing the scientists his team was there to help them problem-solve. So Allen and his team went to each university and stayed on-site for an extended period of time, spending days and nights working with the scientists, helping them solve their problems. They were there not as controllers but as partners.

Example D, The Power of Interdependence

Procter & Gamble (P&G) launched a large construction project at one of its European plants. Karl, the contractor’s project manager, had no interest in team-building efforts and kept brushing them off. Pierre, the P&G project manager, finally found an opportunity to change Karl’s attitude. Three months into construction, the contractor (Karl’s group) accidentally placed the foundations off the mark, pouring about 600 lineal feet of foundation concrete in the wrong place.

Pierre recognized the volatile and disruptive nature of the mistake. He could have ordered Karl to fix the mistake by tearing down the foundations and starting over, but he realized that would have damaged the contractor’s reputation and ego. He understood that the best plans evolve in waves as the project unfolds and more information becomes available and reliable. He thus chose a different approach. Pierre took several days to meet and negotiate with users and designers, modified the interior plant layout, and minimized the damage without tearing down the foundations and starting over.

While the mistake resulted in costly changes, Karl’s reputation was preserved and the project wasn’t delayed. Eventually Karl came around to embrace Pierre’s team approach—namely, “If they fail, we fail.” Realizing they are all interdependent led to the development of a collaborative relationship.  Developing mutual interdependence and mutual responsibility for project results is key to project success.

Tuesday, February 17, 2015

The Payoff: Working With Trust



Last month’s blog was entitled The Cost of Working Without Trust. This month we share a story1 which stands in marked contrast. It is from Jerry Madden, a successful NASA project manager:
It’s easier to sell a project if you can show it is based on a previous successful project. People feel safer with the tried and true. When we introduced the ISEE* project, we told NASA headquarters that we would build the spacecraft exactly like the IMP-H**. The structure would be the same and we would keep the same basic electronics as far as possible. Because I believed it could be done, I issued this edict to everyone in the project and even appointed one of the best mechanical engineers to be in charge of the structure.
A short while into the project, he showed up at my office and said, “When are you going to get down off your high horse and let us design a structure for ISEE that can fly? IMP-H just will not work.” I was taken aback. This meant taking a risk rather than going down the same road as a successful project. Had he been someone else, I would have asked to have him assigned to a different project. How could I trust someone to carry out directives in a project in which I totally believed?
However, I trusted him implicitly because I had worked with him for a long time. This was an opportunity to change direction, based on solid judgment. I swallowed my pride, looked him squarely in the face, and said, “If that is the way you truly feel, now is about as good a time as any. Will it still have 16 sides and will it look the same on the outside?”
He said that it would, but that it would be larger and would bear almost no resemblance on the inside. I said, “Fine. I will inform Headquarters that we will have to mod [modify] the structure and make additional changes to the internal systems.” Fortunately, the project was set up as a highly flexible organization with competent and experienced management, and it made the change to the new concept with very little static. In the end, the successful ISEE spacecraft looked from a distance the same as IMP-H, but internally, both in electronics and structure, there was no resemblance. You have to know when to eat your own words and enjoy the taste.
Practically the same sort of thing occurred with my counterpart in ESA, who was building the ISEE-B spacecraft that was mated to the ISEE-A for launch. In selling the program, their project manager had accepted overly stringent requirements on the spacecraft, partly so that it would fit within the IMP-H. The ESA method of management was slightly different from ours, so when they got to the execution phase, they brought in a new manager known for his competency and integrity, whose perspective was not jaded and who was able to see the project elements in a fresh light. He took one look at the system and, like my mechanical engineer, had only one thing to say: “This is rubbish!” Here too, the future of this important project now hung on the opinion of one man.
Fortunately, my counterpart did not have to eat his own words but only those of his predecessor, who had determined the spacecraft concept design. My counterpart was straightforward and honest and recognized the value of flexibility, which helped make the program change workable. As a true leader, he was able to effect the sharp change to the concept. Our two teams immediately formed one team to get the job done because we recognized not only the competence of our colleagues but also their integrity.
A key lesson from this story is that some of the most important decisions, even those made by people in a rigorous and “thing-based” profession (engineering) about some of the most sophisticated technological artifacts, are often made and accepted on the basis of two human concepts: intuition and trust. Soft is hard.
1 Source:  Madden, J. (2000). In A. Laufer & E. J. Hoffman, Project management success stories: Lessons of project leaders (pp. 104-105). New York, NY: Wiley.
*ISEE means International Sun-Earth Explorer, one of NASA’s Explorer spacecraft.
**IMP means Interplanetary Monitoring Platform; IMP-H was one in a series of such spacecraft.