Executive Seminar

Friday, June 19, 2015

Conduct Learning-Based Reviews (part 1 of 2)





A previous story on this blog, Saving Lives: Expecting Problems or Burying Them (June 2014), highlighted learning during the life of a project through a metaphor from the operating room. In this, the first of a two-part article, we demonstrate how successful project managers ensure that learning plays a central role during project reviews. This month we focus on practices employed at NASA; next month we will discuss examples from Procter & Gamble and other organizations.
Project reviews are often seen primarily as a means of control by the client and upper management. Brian Muirhead of NASA, who led the design, development, and launch of the flight system for the Pathfinder mission to Mars, describes the prevailing atmosphere during the review process:
“The routine is daunting. Members of the board sit at a horseshoe-shaped table, the chairman in the middle. A team member stands in front of them and launches his presentation. It usually isn’t long before one of the review board members interrupts the presenter with a question—rather like an attorney presenting oral arguments before the Supreme Court. The skeptical expressions, the intense looks, the scowls and smiles, are giveaways. And just as at the Supreme Court, the questions are generally polite, occasionally harsh, but all with a clear aim of probing for the truth.”1
Because project reviews are perceived as serving the needs of upper management, insufficient attention is paid to the overall needs of the project team, and in particular to the negative implications of the preparations required for the review. Brian Muirhead discusses the time leading up to a project review during the Pathfinder mission:
“Formal project reviews come with a clear, but unavoidable, downside. Done well, the preparations can take an enormous amount of time for the team. Preparations for a formal board review can take too many of us—me and the project’s top managers plus a number of key managers and engineers at the next level down—off the line for as much as six weeks. Necessary to the overall process, but a significant distraction; and even worse, a significant loss in momentum.”2
At NASA, two other project managers, dissatisfied with the gap between the intended role of project reviews and their actual practice, took steps to change the situation radically.
With his project up for another review, Marty Davis, a project manager at the Goddard Space Flight Center, developed ways for project reviews to benefit the person being reviewed more than the reviewer. Marty pushed for the creation of a review team, composed of internal technical staff and external specialists, that could provide feedback and engage in joint problem-solving. In addition, to provide consistency and eliminate the need to revisit issues, Marty requested that these same individuals participate in review milestones throughout the project lifecycle.
Marty was assigned an internal co-chair and recommended an external co-chair. He told both co-chairs that each could select seven members, and that the two groups could not duplicate technical specialties. Incorporating this approach into the review process, Marty’s next review lasted two days: one day of presentations and one day of one-on-one sessions, followed by a caucus with the review team. The independent experts identified areas of concern, many of which were resolved after one-on-one meetings between the specialized project staff and the review team’s technical specialists. Each issue that remained open was assigned a Request for Action (RFA). Eventually, Marty was left with just five RFAs.3
Susan Motil, another NASA project manager, adopted Marty Davis's model after a bad experience with a Concept Review. In seeking involvement in the selection of the review board, Susan wasn’t trying to undermine the panel’s independence or hide a problem, but rather to secure particular expertise. She did just that, assembling a panel with handpicked expertise and management approval. Her two sets of reviews – one for each subsystem, and one for the entire system – took the form of a dialogue with the engineers, who showed the reviewers the hardware and test data and remained open to their questions.
Susan compared the direct outcomes of the initial, unsuccessful Concept Review with those of the second review, based on Davis’s model. Davis’s model allowed the team to spend significantly less time and effort on the RFAs, and it cost the project about $200,000, compared with the $700,000 price tag for the initial review.4
Both Marty and Susan concluded that learning-based reviews are a must. They can help you identify problems in your project, which may make the difference between mission failure and mission success, and if implemented effectively, they can be accomplished without excessive interruption to project progress and with limited extra cost.
NEXT MONTH:  In part 2 of this two-part article we will look further at the role of learning in projects with examples from Procter & Gamble and others, concluding with a suggested practice for peer review.
IN THE MEANTIME:  Read more on integrating planning and review with learning in the recent article, “What Successful Project Managers Do,” published in the MIT Sloan Management Review.5

1.      B.K. Muirhead & W.L. Simon (1999). High Velocity Leadership: The Mars Pathfinder Approach to Faster, Better, Cheaper. New York, NY: Harper Collins Publishers, 23-4, 86-7.
2.      Ibid.
3.      Davis, M. (2001). “Tangled Up in Reviews.” Goddard Space Flight Center. Ask Magazine 4 (July): 8-11.  http://appel.nasa.gov/ask/about/overview/index.html
4.      Motil, S. (2003).  “So This Is Knowledge Sharing.” Glenn Research Center. Ask Magazine 10 (January): 6-9.  http://appel.nasa.gov/ask/about/overview/index.html
5.      Laufer, A., Hoffman, E. J., Russell, J.S., & Cameron, W. S. (2015). “What Successful Project Managers Do.” MIT Sloan Management Review, 56(3), 42-51. http://sloanreview.mit.edu/article/what-successful-project-managers-do 

Monday, May 18, 2015

Playing It Safe – or Not




In his book Augustine’s Laws, Norman Augustine discusses the problems of process-oriented cultures, among them too many regulations and the dangers of “playing it safe.” One of the arguments that Augustine presents against the growth of regulations is: “The fallacy in using regulations to prevent problems is that if managers could ignore the old regulation, they can ignore the new one, too.”  Still, Augustine asserts that “Regulations grow at the same rate as weeds,” as demonstrated in the following example: “… in 1946 the US Atlantic fleet was comprised of 778 ships and sailed under regulations contained in a 72-page pamphlet. In contrast, today’s Atlantic Fleet may only have 297 ships, but it is well equipped with regulations—308 pages of them.”1
Terry Little, the Director of the Air Force Acquisition Center of Excellence, believes that in today’s dynamic environment we must unlearn the old way of thinking, and in particular, we must stop “playing it safe.”
A lot of processes that we have are built on lack of trust. When you hand somebody an 11-page specification rather than a 100-page document, however, you are sending a clear signal that you trust them to do the right thing. My belief is that, as an individual project manager, you can go a long way in that direction by starting not with the notion that someone has to earn your trust, but starting with the presumption that they’re trustworthy until proven otherwise. It allows things like an 11-page specification. My biggest disappointment in the past has been when I have given project managers the opportunity to innovate, and they don’t know what to do with it. They demand processes, rigidity, templates, and prescriptions. It is as if you give them a blank check and they write it for a dollar. What you’ve got to do, is to “unlearn”… all of our processes that are not oriented toward speed or credibility, but are oriented toward not making a mistake, playing it safe.2 
As demonstrated by Scott Cameron, the Global Process Owner of Project Management at Procter & Gamble, successful project managers do not “play it safe.”  
I have noted during my career that there is a never-ending amount of rules and restrictions forced upon project managers under the guise of helping them “be successful” in managing their projects. It appears to be a one-way street; many regulations are added, but few (if any) are removed. I had the opportunity to help clean out such a closet [of standard procedures] as part of a project management leadership team I was part of.
Scott reports that although the “cleaning out” required three consecutive review cycles, the team eventually cut the number of standard procedures sharply. Instead of 18 technical standards and 32 standard operating procedures, project managers at Procter & Gamble now employ only four of each.
The project management community was delighted with these reductions and felt empowered by them. It gave them more flexibility to manage their projects and develop their own personal management style. The streamlining process enabled us to reduce the effort, costs and time required to maintain these standards. 3  
Developing a successful project culture requires learning to trust and unlearning the “play it safe” approach.

1.  Augustine, N. (1986). Augustine’s Laws. New York, NY: Viking Penguin, 101-7.

2.  “Speed Merchants: A conversation with Scott Cameron and Terry Little” (2003). Ask Magazine, 11 (March): 26-9. http://appel.nasa.gov/ask/about/overview/index.html

3.  Cameron, W. S. (2004). “Cleaning Out the Closet.” Procter & Gamble. Ask Magazine, 20 (November): 19-21. http://appel.nasa.gov/ask/about/overview/index.html


Monday, April 20, 2015

Coping With Failures

Bridge collapse, 1940, Tacoma, Washington
(University of Washington Libraries, Special Collections)


Lucy Suchman opens her book, Plans and Situated Actions: The Problem of Human-machine Communication, by comparing different navigation methods employed by European and Trukese navigators. “The European navigator begins with a plan – a course – which he has charted according to certain universal principles, and he carries out his voyage by relating his every move to that plan. His effort throughout his voyage is directed to remaining ‘on course.’ If unexpected events occur, he must first alter the plan, then respond accordingly.”

The Trukese navigator, on the other hand, “…begins with an objective rather than a plan. He sets off towards the objective and responds to conditions as they arise in an ad hoc fashion.” The Trukese navigator steers according to information provided by the wind, tide, stars, clouds, and the sound of water on the side of his boat. While his objective is clear from the outset, his actual course is contingent on the unique circumstances that he cannot anticipate in advance. “His effort is directed to doing whatever needed to reach the objective. If asked, he can point to his objective at any moment, but he cannot describe his course.”

Suchman asserts that the European navigator exemplifies the prevailing cognitive science model of purposeful action: “… the view that purposeful action is determined by plans is deeply rooted in the Western human sciences as the correct model of the rational actor… My own contention, however, is that as students of human action, we ignore the Trukese navigator at our peril.”1

The following two stories demonstrate how adopting a “navigation philosophy” (either the Trukese’s or the European’s) affects how one copes with failures during project implementation. The first story is told by David Panhorst, an R&D manager at the Armament Research, Development and Engineering Center (ARDEC):

…I managed the development of the Army’s first smart tank bullet. The projectile’s built-in sensor searches the ground for targets. Upon locating a target, the sensor aims and triggers a warhead to hit and defeat the target from above, where it is most vulnerable. This fly-over, shoot-down technique differs from the hit-to-defeat method used by current tank ammunition.

The first two iterations of fin design tests failed miserably. Since David had budgeted for only two iterations, further tests would have meant higher costs for the project. To contain cost growth, David authorized additional test objectives as part of the fin retest. The problem with having so many layers of test objectives, however, was that with each failure, all subsequent test data were lost. Nevertheless, the team pressed on and continued to add bells and whistles to each successive redesign test until they had gathered enough data on the other subsystems to solve the problem.

It wasn’t until we stopped and regrouped that we were able to refocus on the original objective and concentrate on the fin deployment problem, which, after all, is the first thing that needs to happen. By refocusing our efforts on the primary objective, we fixed the problem and developed a fin that opened reliably and predictably.  

Following this project, David reflected: As children, the first lesson we learned when we were introduced to ball sports, such as baseball or tennis, was “Keep your eye on the ball.” In project management it is no different: you must maintain project focus. In our case, we should have focused on validating the project concept rather than minimizing test cost.2
Terry Little from the U.S. Air Force suggests a completely different way of coping with such failures during project implementation:

We had a test where a warhead didn’t go off. The flight of the cruise missile was perfect, but the warhead didn’t go off. You could look at this and say, “Well that’s just one little piece of it.” But if the warhead doesn’t go off, you don’t have a weapon.

In this case, we quickly determined that there was a design problem with the fuse, and we moved on to have a successful retest—but not before we mined every bit of information that we possibly could out of that failure.

Whenever there is a failure, the first thing to do is to go through a short grieving period. On JASSM [Joint Air-to-Surface Standoff Missile], whenever we had a failure, I allowed grieving for one day. We could grieve and mope, get drunk, wring our hands, say “ain’t it awful.” We could do that for one day, and then it was time to put it behind us. That’s a Terry Little rule.

… When you have a problem on a project, all of a sudden people want to know, ‘What’s going on? What’s the problem?’ Most project managers want those questions to go away, so there is a tendency to want to jump to the solution and respond, “Don’t worry, we’ve got this in hand. It was just one of those things.”

What you need to do is dig until you get to the root cause of the problem, until you are certain that you understand why this failure happened. Yes, it takes a little longer. Yes, it costs more money. In my case, a week to get to the root cause of a problem is $4 or $5 million. But you’ve got to do it, so that when you move forward you know that you have learned from the failure. A lot of times what caused it will be a gnat, but you should still kill it with a sledgehammer and smash it to bits until you are convinced that, though you may have other failures in the future, that particular one will never occur again—ever. You can’t move forward from a failure if you’re worried about repeating the same mistake.3   
These two stories present two strikingly different management philosophies. Following the European navigator’s approach, David Panhorst was too concerned with the original test plan (which called for only two iterations of design tests), rather than focusing on the objective of the project (“to develop a fin that opened reliably and predictably”). More importantly, he was reluctant to modify the original plan. In contrast, Terry Little quickly modified the plan in order to spend the necessary time mining “every bit of information” from the failure. Like the Trukese navigator, Little’s primary focus was on the objective: “… if the warhead doesn’t go off, you don’t have a weapon.” In so doing, Little demonstrated that a focus on objectives requires a readiness to modify the plans leading to those objectives.
1 Suchman. 1987. Plans and Situated Actions: The Problem of Human-machine Communication. Cambridge UK: Cambridge University Press, vii. Suchman refers to a story, quoted by G. Berreman, which was written by T. Gladwin in 1964. L.G. Berreman. 1966. Anemic and Emetic Analyses in Social Anthropology. American Anthropologist 86, 2: 346-54. T. Gladwin. 1964. Culture and logical process. In Explorations in Cultural Anthropology: Essays Presented to George Peter Murdock, ed. W. Goodenough.  New York, NY: McGraw Hill.
2 “Keep Your Eye on the Ball,” David Panhorst, ARDEC. 2000. In A. Laufer and E.J. Hoffman, Project Management Success Stories, 131-33. New York, NY: John Wiley & Sons.
3 “A Terry Little Rule,” Terry Little, U.S. Air Force. 2005.  In A. Laufer, T. Post, and E.J. Hoffman, Shared Voyage: Learning and Unlearning from Remarkable Projects, 123-24. Washington, DC: The NASA History Series.

Monday, March 23, 2015

The Impact of Project Collaboration



This month’s blog is adapted from a recent article entitled “What Successful Project Managers Do” published in the March 2015 issue of the MIT Sloan Management Review. In addition to Alex Laufer and Jeff Russell, the other co-authors are Edward J. Hoffman, NASA's Chief Knowledge Officer, and Scott W. Cameron, Global Project Management Technology Process Owner at Procter & Gamble.  

In the article we demonstrated that the first role assumed by successful project managers in a variety of industries is to develop project collaboration. In today’s blog we present four examples that highlight the paramount impact of project collaboration.

Example A, The Power of Trust

NASA project manager Tim Flores looked at three projects at the Jet Propulsion Laboratory (JPL): the Pathfinder, Climate Orbiter, and Polar Lander. Although all three projects had the same guiding principles, similar scope, and shared elements (including some team members), the Pathfinder was a success while the other two failed.

Flores thought he’d find that the Pathfinder project stood out from the others in things like resources, constraints, and personnel. To some extent this was true, but what he found to be the primary difference was the level of collaboration. The Pathfinder team developed trusting relationships within an open culture. They felt free to make decisions without fearing severe punishment for mistakes. The other two project teams never developed such a high level of trust.

Example B, The Power of Openness

NASA’s Wide-Field Infrared Explorer (WIRE) mission was designed to study how galaxies formed and evolved, using a telescope so delicate that it was sealed inside a solid hydrogen cryostat. Shortly after launch, the cryostat’s cover ejected prematurely and the Explorer craft tumbled wildly through space. The entire mission failed.

Jim Watzin, a NASA project manager and a member of the WIRE team, commented on the official NASA after-action report: “WIRE failed because people could not or would not communicate well with each other…. Individuals simply were uncomfortable allowing others to see their work.” He added: “The real lesson of this loss is that any team member that does not participate as a true team player should be excused (from the project).”

Example C, The Power of Partnership

Allen, a payload manager on NASA’s Advanced Composition Explorer (ACE) project, developed trust between his team and 20 groups of instrument developers using a three-stage plan. First, he picked team members who could operate in a university environment—they knew when to bend or even break the rules. Second, he permanently moved his team into a university environment, realizing it would provide a more open, flexible culture than the home-base environment. Third, he came up with an uncommon process for interacting with the scientist groups.

Allen anticipated that it would be difficult to get the scientists to regard his team as partners. Having dealt with NASA before, the scientists tended to believe Allen and his people would demand tons of paperwork and that things be done “just so.” Many of the scientists weren’t even sure they should share with Allen’s team the problems they were encountering.

Because Allen’s team had to review instrument development, he believed the best way to do this was to focus on building trust and convincing the scientists his team was there to help them problem-solve. So Allen and his team went to each university and stayed on-site for an extended period of time, spending days and nights working with the scientists, helping them solve their problems. They were there not as controllers but as partners.

Example D, The Power of Interdependence

Procter & Gamble (P&G) launched a large construction project at one of its European plants. Karl, the contractor’s project manager, had no interest in team-building efforts and kept brushing them off. Pierre, the P&G project manager, finally found an opportunity to change Karl’s attitude. Three months into construction, the contractor (Karl’s group) accidentally placed the foundations off the mark, pouring about 600 lineal feet of foundation concrete in the wrong place.

Pierre recognized the volatile and disruptive nature of the mistake. He could have ordered Karl to fix the mistake by tearing down the foundations and starting over, but he realized that would have damaged the contractor’s reputation and ego. He understood that the best plans evolve in waves as the project unfolds and more information becomes available and reliable. He thus chose a different approach. Pierre took several days to meet and negotiate with users and designers, modified the interior plant layout, and minimized the damage without tearing down the foundations and starting over.

While the mistake resulted in costly changes, Karl’s reputation was preserved and the project wasn’t delayed. Eventually Karl came around to embrace Pierre’s team approach—namely, “If they fail, we fail.” Realizing they are all interdependent led to the development of a collaborative relationship.  Developing mutual interdependence and mutual responsibility for project results is key to project success.