Tuesday, February 18, 2014

Command and Control

Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety.  Eric Schlosser.  The Penguin Press, NY.  2013.

Command and Control, by Eric Schlosser, is not a project management book.  Quoting the inside cover of the dust jacket, “Command and Control interweaves the minute-by-minute story of an accident at a nuclear missile silo in rural Arkansas with a historical narrative that spans more than fifty years.”
The anecdotes, stories, and discussions in the book are about systems and practices in operations, maintenance and support.  But projects culminate in systems and practices in operations, maintenance and support.  Command and Control is a case study – or a collection of case studies – on topics near and dear to project management:  risk management and control, communications management, and lessons learned.  I can recommend the book as a fascinating read for any number of reasons, but this is a blog on project management best practices.  Let’s look at a few lessons for us:  the few, the proud, the PMs.

Excluding military veterans, few of us will have the opportunity (if you can call it that) over the course of our careers to participate in a true life-or-death project.  But the servicemen depicted in Command and Control worked perilously close to a nuclear warhead sitting on top of an unstable mixture of liquid fuel and oxidizer packaged in a paper-thin shell and buried in an underground containment vessel cum concentration chamber.  The book describes occasions when the fail-safes intended to protect them (and us) worked and, in minute-by-minute detail, one specific occasion when they didn’t.
Much of the book is about the history of super-secret military nuclear arms development and deployment during the Cold War.  It is worth reading this book to see how often it is one person, championing a cause against a tide of resistance, who is responsible for the implementation of a fail-safe or risk-reducing feature that ultimately saved lives.  I think this is an apt metaphor for my experience getting owners and sponsors to responsibly address both project and product risk.

I was also fascinated by the account of the unintended, unplanned testing of the limits of the launch complex blast doors – and the conflicted confidence, among the various players, in the structural integrity of those doors under catastrophic load.  Not to mention the decisions – comical in other circumstances – to override, bypass, and ignore the available protective features.  In our projects, how often, in our urgency to get a system out the door, do we face similar choices and make similar decisions?  For what our project deploys to production, do we ever plan for operational systems management in a disaster under realistic, degraded conditions?
Another relevant theme is organizational decision making under stress.  The US military operates in a hierarchical, command-and-control structure.  But when she was about to blow, that structure broke down under both communication limits and independent agency.  The fog of uncertainty – when radios don’t work, the most knowledgeable person can’t be reached, the tools don’t give the information needed, leadership is communicating conflicting priorities, and different experts are shouting contradictory advice – means that wrong decisions will be made, at all levels up and down the line.  Further, the guy in the bunker, face-to-face with the devil, may decide that he knows better what to do than the general sitting safely under a far-away mountain.  Or he may decide to follow orders anyway, despite knowing it’s the wrong decision.  The book tells the tale of a real event, so there is no magical fairy-tale ending.  We as PMs need to understand that, for all our expectations that the team will follow where we lead, those team members are each independent agents with their own motives, priorities, and interests that may – or may not – coincide with ours.

Another theme we can learn from is the lessons-learned outcome.  It’s disappointing that, more than fifty years later, we still haven’t learned to address the systemic root cause of a failure rather than blaming the grunt on the ground.  After all, fixing the root cause means addressing a capability that management and leadership are responsible for; blaming the grunt deflects attention away from the upper castes.  This is a topic I am preparing to take up at length in the near future.
We as project managers have two levels of learning from Command and Control.  One is that we can see how things really work when the fan is spinning – it isn’t clean, it isn’t pretty, and it’s not like they describe in the textbooks.  Just as importantly, we should remember that our projects, which end and which we walk away from, produce things that people have to live with and work in.  It was projects that built the Titan II missile, the launch complex, the operational processes, and all the complex and complicated pieces that came together and, well, blew up.  It was our professional ancestors who built these products and systems.  What will our professional descendants, a half century from now, be reading about us?

© 2014 Chuck Morton.  All Rights Reserved.