November 2 executive seminar

Friday, July 17, 2015

Conduct Learning-Based Reviews (part 2 of 2)

This is the second of a two-part story. (Read part 1 here.)
Changing one’s mindset to consider reviews as a vital learning opportunity is not easy. While the reviewing organization often refuses to abandon the “review as control” perspective, the subject of the review may be equally resistant to changing its approach, due to overconfidence or skepticism. In the following two examples, winning or losing a project was largely determined by the contractors’ willingness (or lack thereof) to learn from project reviews.
The first example comes from Jenny Baer-Riedhart, a NASA program manager. In 1994, NASA initiated the Environmental Research Aircraft and Sensor Technology (ERAST) program, which focused on converting high-altitude, long-endurance unmanned aerial vehicles (UAVs) into research platforms. Because the risks involved were so difficult to control, the UAV industry's development lagged behind the interest in, and knowledge of, how to improve the technology.
To mitigate the risks and stimulate the industry, NASA adopted a radically different approach and formed a joint sponsored research program with four of the main players in the industry. For the ERAST reviews, NASA brought in people with experience in a particular area of aircraft development and testing. The companies not open to NASA’s advice did not fare well. Although one of these companies was superb on paper, with genius employees, it crashed its UAV twice. Reflecting on the case, Jenny concluded that had the company been open to NASA’s advice during reviews, it might have prevented the crashes.1
Similarly, Terry Little, the U.S. Air Force's project manager of the Joint Air-to-Surface Standoff Missile (JASSM), concluded that learning-based reviews played a major role in distinguishing between the two finalists. Although the losing company had good engineers and disciplined processes, it failed to listen to feedback, which led to its downfall.2
Learning-based reviews are also highly regarded in industries that engage in more traditional projects, such as the design and construction of manufacturing facilities. One approach that naturally fosters a learning focus, rather than a control focus, is to establish review panels composed of peers, rather than senior managers or experts who are expected to report to senior managers following the review. In praise of the peer review practice employed at Procter & Gamble, Scott Cameron, Global Project Management Technology Process Owner at P&G, asserts: “The most successful method we have found to improve project performance is to conduct anywhere between 1-5 peer reviews throughout the life of a project.”3
Peer Review Practice
Purpose of the Peer Review
To gain the most valuable input in the shortest amount of time to improve the chances for a successful project and avoid disasters.
Whom to Invite
Just peers, no hierarchy. A diverse group of ten to twenty people consisting of technical engineers, project managers, construction managers, purchasing managers, finance managers, research and development personnel, and contractors. 
What Protocol to Use
The project team and the project manager concisely communicate their technical and execution strategies. They then invite peers to comment, critique, and ask clarifying questions. Pre-work can be sent out to the peers to review prior to the meeting. Peers should be open, honest, and engaged, or should not bother to attend.
How Long Should It Be?
A maximum of 6-8 hours, including lunch and breaks.
How to Summarize the Discussion
Take copious notes and display them on the wall. In the last peer review I attended, there must have been 30-40 pages of flip-chart paper capturing the ideas on a $50MM project. These were then typed and distributed to all the participants with a note thanking them for helping improve the success of the project.
What to Expect of a Peer Review
Out of the notes, there were only 5-10 “nuggets” that the project team used to improve the project. Implementing these nuggets more than made up for the cost of the peer review. As we have conducted more peer reviews, we’ve noted that the invited peers are taking “nuggets” they had not considered back to their projects and programs.4

1.    Laufer, A., Hoffman, E., and Cohen, D. (2012). “Flying Solar-Powered Airplanes: Soaring High on Spirit and Systems.” Chapter 3 in Laufer, A., Mastering the Leadership Role in Project Management: Practices that Deliver Remarkable Results. NJ: FT Press.
2.    Laufer, A., Ward, D., and Cockburn, A. (2012). “Developing a Missile: The Power of Autonomy and Learning.” Chapter 1 in Laufer, A., Mastering the Leadership Role in Project Management: Practices that Deliver Remarkable Results. NJ: FT Press.
3.    Cameron, W. S. (2001). “The Hour Glass and the Project Manager.” Procter & Gamble. Ask Magazine 4 (July): 27-8.
4.    Ibid.

Friday, June 19, 2015

Conduct Learning-Based Reviews (part 1 of 2)

A previous story on this blog, “Saving Lives: Expecting Problems or Burying Them” (June 2014), highlights learning during the life of a project through a metaphor from the operating room. In this, the first of a two-part article, we demonstrate how successful project managers ensure that learning plays a central role during project reviews. This month we focus on practices employed at NASA; next month we will discuss examples from Procter & Gamble and other organizations.
Project reviews are seen primarily as a means of control by the client and upper management. Brian Muirhead from NASA, who led the design, development and launch of the flight system for the Pathfinder mission to Mars, describes the prevailing atmosphere during the review process:
“The routine is daunting. Members of the board sit at a horseshoe-shaped table, the chairman in the middle. A team member stands in front of them and launches his presentation. It usually isn’t long before one of the review board members interrupts the presenter with a question—rather like an attorney presenting oral arguments before the Supreme Court. The skeptical expressions, the intense looks, the scowls and smiles, are giveaways. And just as at the Supreme Court, the questions are generally polite, occasionally harsh, but all with a clear aim of probing for the truth.”1
Because project reviews are perceived as serving the needs of upper management, insufficient attention is paid to the overall needs of the project team, and in particular to the negative implications of the preparations required for the review. Brian Muirhead discusses the time leading up to a project review during the Pathfinder mission:
“Formal project reviews come with a clear, but unavoidable, downside. Done well, the preparations can take an enormous amount of time for the team. Preparations for a formal board review can take too many of us—me and the project’s top managers plus a number of key managers and engineers at the next level down—off the line for as much as six weeks. Necessary to the overall process, but a significant distraction; and even worse, a significant loss in momentum.”2
At NASA, two other project managers, dissatisfied with the perceived role vs. actual practice of project reviews, took steps to radically change the situation.
With his project up for another review, Marty Davis, a project manager at Goddard Space Flight Center, developed ways through which project reviews could benefit the person being reviewed more than the reviewer. Marty pushed for the creation of a review team, composed of internal technical staff and external specialists, who could provide feedback and joint problem-solving. In addition, Marty requested that, in order to provide consistency and eliminate the need to revisit issues, these same individuals participate in review milestones throughout the project lifecycle.
Marty was assigned an internal co-chair and recommended an external co-chair. He told both co-chairs that each could have seven members, with no duplication of technical specialties between the two groups. Incorporating his approach into the review process, Marty’s next review lasted two days: one day of presentations and one day of one-on-one sessions, followed by a caucus with the review team. The independent experts identified areas of concern, many of which were resolved after one-on-one meetings between the specialized project staff and the review team’s technical specialists. The issues that remained open were assigned a Request for Action (RFA). Eventually, Marty was left with just five RFAs.3
Susan Motil, another project manager at NASA, adopted Marty Davis’s model after a bad experience with a Concept Review. In seeking involvement in the selection of the review board, Susan wasn’t trying to undermine the panel’s independence or hide a problem; rather, she was looking for particular expertise. She did just that, acquiring a handpicked panel with management approval. In her two sets of reviews – one for each subsystem, and one for the entire system – the reviewers held a dialogue with the engineers, who showed them the hardware and test data and remained open to the reviewers’ questions.
Susan compared the direct outcomes of the initial, unsuccessful Concept Review and the second review based on Davis’s model. Davis’s model allowed the team to spend significantly less time and effort on the RFAs, and it cost the project about $200,000, compared to the $700,000 price tag for the initial review.4
Both Marty and Susan concluded that learning-based reviews are a must. They can help you identify problems in your project, which may make the difference between mission failure and mission success, and if implemented effectively, they can be accomplished without excessive interruption to project progress and with limited extra cost.
NEXT MONTH:  In part 2 of this two-part article we will look further at the role of learning in projects with examples from Procter & Gamble and others, concluding with a suggested practice for peer review.
IN THE MEANTIME:  Read more on integrating planning and review with learning in the recent article, “What Successful Project Managers Do,” published in the MIT Sloan Management Review.5

1.      B.K. Muirhead & W.L. Simon (1999). High Velocity Leadership: The Mars Pathfinder Approach to Faster, Better, Cheaper. New York, NY: Harper Collins Publishers, 23-4, 86-7.
2.      Ibid.
3.      Davis, M. (2001). “Tangled Up in Reviews.” Goddard Space Flight Center. Ask Magazine 4 (July): 8-11.
4.      Motil, S. (2003).  “So This Is Knowledge Sharing.” Glenn Research Center. Ask Magazine 10 (January): 6-9.
5.      Laufer, A., Hoffman, E. J., Russell, J.S., & Cameron, W. S. (2015). “What Successful Project Managers Do.” MIT Sloan Management Review, 56(3), 42-51. 

Monday, May 18, 2015

Playing it safe -- or not

In his book Augustine’s Laws, Norman Augustine discusses the problems of process-oriented cultures, among them too many regulations and the dangers of “playing it safe.” One of the arguments that Augustine presents against the growth of regulations is: “The fallacy in using regulations to prevent problems is that if managers could ignore the old regulation, they can ignore the new one, too.”  Still, Augustine asserts that “Regulations grow at the same rate as weeds,” as demonstrated in the following example: “… in 1946 the US Atlantic fleet was comprised of 778 ships and sailed under regulations contained in a 72-page pamphlet. In contrast, today’s Atlantic Fleet may only have 297 ships, but it is well equipped with regulations—308 pages of them.”1
Terry Little, the Director of the Air Force Acquisition Center of Excellence, believes that in today’s dynamic environment we must unlearn the old way of thinking, and in particular, we must stop “playing it safe.”
A lot of processes that we have are built on lack of trust. When you hand somebody an 11-page specification rather than a 100-page document, however, you are sending a clear signal that you trust them to do the right thing. My belief is that, as an individual project manager, you can go a long way in that direction by starting not with the notion that someone has to earn your trust, but starting with the presumption that they’re trustworthy until proven otherwise. It allows things like an 11-page specification. My biggest disappointment in the past has been when I have given project managers the opportunity to innovate, and they don’t know what to do with it. They demand processes, rigidity, templates, and prescriptions. It is as if you give them a blank check and they write it for a dollar. What you’ve got to do is to “unlearn”… all of our processes that are not oriented toward speed or credibility, but are oriented toward not making a mistake, playing it safe.2
As demonstrated by Scott Cameron, the Global Process Owner of Project Management at Procter & Gamble, successful project managers do not “play it safe.”  
I have noted during my career that there is a never-ending amount of rules and restrictions forced upon project managers under the guise of helping them “be successful” in managing their projects. It appears to be a one-way street; many regulations are added, but few (if any) are removed. I had the opportunity to help clean out such a closet [of standard procedures] as part of a project management leadership team I was part of.
Scott reports that although the “cleaning out” required three consecutive review cycles, eventually they sharply cut the number of standard procedures.  Instead of 18 technical standards and 32 standard operating procedures, project managers at Procter & Gamble are now employing only four of each. 
The project management community was delighted with these reductions and felt empowered by them. It gave them more flexibility to manage their projects and develop their own personal management style. The streamlining process enabled us to reduce the effort, costs and time required to maintain these standards. 3  
Developing a successful project culture requires learning to trust and unlearning the “play it safe” approach.

1.  Augustine, N. (1986). Augustine’s Laws. New York, NY: Viking Penguin, 101-7.

2.  “Speed Merchants: A conversation with Scott Cameron and Terry Little” (2003). Ask Magazine, 11 (March): 26-9.

3.  Cameron, W. S. (2004). “Cleaning Out the Closet.” Procter & Gamble. Ask Magazine, 20 (November): 19-21.

Monday, April 20, 2015

Coping With Failures

Bridge collapse, 1940, Tacoma, Washington
(University of Washington Libraries, Special Collections)

Lucy Suchman opens her book, Plans and Situated Actions: The Problem of Human-machine Communication, by comparing different navigation methods employed by European and Trukese navigators. “The European navigator begins with a plan – a course – which he has charted according to certain universal principles, and he carries out his voyage by relating his every move to that plan. His effort throughout his voyage is directed to remaining ‘on course.’ If unexpected events occur, he must first alter the plan, then respond accordingly.”

The Trukese navigator, on the other hand, “…begins with an objective rather than a plan. He sets off towards the objective and responds to conditions as they arise in an ad hoc fashion.” The Trukese navigator steers according to information provided by the wind, tide, stars, clouds, and the sound of water on the side of his boat. While his objective is clear from the outset, his actual course is contingent on the unique circumstances that he cannot anticipate in advance. “His effort is directed to doing whatever needed to reach the objective. If asked, he can point to his objective at any moment, but he cannot describe his course.”

Suchman asserts that the European navigator exemplifies the prevailing cognitive science model of purposeful action: “… the view that purposeful action is determined by plans is deeply rooted in the Western human sciences as the correct model of the rational actor… My own contention, however, is that as students of human action, we ignore the Trukese navigator at our peril.”1

The following two stories demonstrate how adopting a “navigation philosophy” (either the Trukese’s or the European’s) affects how one copes with failures during project implementation. The first story is told by David Panhorst, an R&D manager at the Armament Research, Development and Engineering Center (ARDEC):

…I managed the development of the Army’s first smart tank bullet. The projectile’s built-in sensor searches the ground for targets. Upon locating a target, the sensor aims and triggers a warhead to hit and defeat the target from above, where it is most vulnerable. This fly-over, shoot-down technique differs from the hit-to-defeat method used by current tank ammunition.

The first two iterations of fin design tests failed miserably. Since David had budgeted only for two iterations, more tests would have resulted in higher costs for the project. To contain cost growth, David authorized additional test objectives as part of the fin retest. The problem with having so many layers of test objectives, however, was that with each failure, all subsequent test data were lost. Nevertheless, the team pressed on and continued to add bells and whistles to each successive redesign test until they had gathered enough data on other subsystems to solve the problem.

It wasn’t until we stopped and regrouped that we were able to refocus on the original objective and concentrate on the fin deployment problem, which, after all, is the first thing that needs to happen. By refocusing our efforts on the primary objective, we fixed the problem and developed a fin that opened reliably and predictably.  

Following this project, David reflected: As children, the first lesson we learned when we were introduced to ball sports, such as baseball or tennis, was “Keep your eye on the ball.” In project management, it is no different: you must maintain project focus. In our case, we should have focused on validating the project concept rather than minimizing test cost.2
Terry Little from the U.S. Air Force suggests a completely different way of coping with such failures during project implementation:

We had a test where a warhead didn’t go off. The flight of the cruise missile was perfect, but the warhead didn’t go off. You could look at this and say, “Well that’s just one little piece of it.” But if the warhead doesn’t go off, you don’t have a weapon.

In this case, we quickly determined that there was a design problem with the fuse, and we moved on to have a successful retest—but not before we mined every bit of information that we possibly could out of that failure.

Whenever there is a failure, the first thing to do is to go through a short grieving period. On JASSM [Joint Air-to-Surface Standoff Missile], whenever we had a failure, I allowed grieving for one day. We could grieve and mope, get drunk, wring our hands, say “ain’t it awful.” We could do that for one day, and then it was time to put it behind us. That’s a Terry Little rule.

… When you have a problem on a project, all of a sudden people want to know, ‘What’s going on? What’s the problem?’ Most project managers want those questions to go away, so there is a tendency to want to jump to the solution and respond, “Don’t worry, we’ve got this in hand. It was just one of those things.”

What you need to do is dig until you get to the root cause of the problem, until you are certain that you understand why this failure happened. Yes, it takes a little longer. Yes, it costs more money. In my case, a week to get to the root cause of a problem is $4 or $5 million. But you’ve got to do it, so that when you move forward you know that you have learned from the failure. A lot of times what caused it will be a gnat, but you should still kill it with a sledgehammer and smash it to bits until you are convinced that, though you may have other failures in the future, that particular one will never occur again—ever. You can’t move forward from a failure if you’re worried about repeating the same mistake.3   
These two stories present two strikingly different management philosophies. Following the European navigator approach, David Panhorst was too concerned with the original test plan (which called for only two iterations of design tests), rather than focusing on the objective of the project (“to develop a fin that opened reliably and predictably”). More importantly, he was reluctant to modify the original plan. In contrast, Terry Little quickly modified the plan in order to spend the necessary time mining “every bit of information” from the failure. Like the Trukese navigator, Little’s primary focus was on the objectives: “… if the warhead doesn’t go off, you don’t have a weapon.” In so doing, Little demonstrated that a focus on objectives requires a readiness to modify the plans leading to those objectives!
1 Suchman, L. (1987). Plans and Situated Actions: The Problem of Human-Machine Communication. Cambridge, UK: Cambridge University Press, vii. Suchman refers to a story, quoted by G. Berreman, which was written by T. Gladwin in 1964. Berreman, G. (1966). “Anemic and Emetic Analyses in Social Anthropology.” American Anthropologist 68 (2): 346-54. Gladwin, T. (1964). “Culture and Logical Process.” In Explorations in Cultural Anthropology: Essays Presented to George Peter Murdock, ed. W. Goodenough. New York, NY: McGraw Hill.
2 “Keep Your Eye on the Ball,” David Panhorst, ARDEC. 2000. In A. Laufer and E.J. Hoffman, Project Management Success Stories, 131-33. New York, NY: John Wiley & Sons.
3 “A Terry Little Rule,” Terry Little, U.S. Air Force. 2005.  In A. Laufer, T. Post, and E.J. Hoffman, Shared Voyage: Learning and Unlearning from Remarkable Projects, 123-24. Washington, DC: The NASA History Series.