Crisis Management Lessons from a ‘Steely-Eyed Missile Man’

A NASA flight director’s down-to-earth wisdom for business leaders.

We’ve all had tough days at work.

But few of us have made decisions that risked billion-dollar spacecraft and astronauts’ lives.

As a NASA flight director and director of mission operations, Paul Sean Hill managed some pretty dicey situations. Like the time in 2001 when a cooling-system failure imperiled the shuttle Discovery, demanding fast action that ran counter to many experts’ opinions.

Along the way, Hill, author of Leadership, from the Mission Control Room to the Boardroom, picked up some down-to-earth wisdom that would be valuable for any leader — especially about making fast, informed decisions and training teams for the unexpected.

“Regardless of what you’re managing,” he said, “your team has the opportunity to screw up a project, cost a customer money, run off customers, or even bankrupt the company. While that doesn’t appear as impressive as blowing up a rocket and killing astronauts, you can still cause failures that the company can’t afford.”

But how do leaders stay as cool as the so-called “steely-eyed missile men” of NASA, whatever the crisis or industry?

For starters, there’s no substitute for relentless preparation.

“Take cyberattacks,” Hill said. “If you were to compare that to flying in space, the thing that you want the team to always have in front of them is: What is the worst that can happen, and what is our no-kidding fail-safe? What actions do we have to take to secure our data and our system? Even if it’s an overreaction.”

In situations with so much at stake, he added, there may be no downside to overreaction — or overpreparation.

“You have to go through all of those types of thought processes in the calm light of day,” he said, “when you’re not under attack. Then you have to practice the decision-making in simulations, like we do in space flight, to make sure that when it happens you really are going to decide that way and there’s not something that you missed.

“And then you have to hold people’s feet to the fire to do it.”

Follow the Playbook (or Toss It Aside)

At the same time, flexibility is critical. In the case of the 2001 shuttle incident, some of those best-laid plans went out the proverbial window.

When the cooling system showed signs of serious failure, one engineer feared a buildup of ice, which if dislodged could have destroyed sensitive parts of the system, leaving the Discovery unflyable. Following conventional shuttle wisdom, other engineers believed such an ice buildup was impossible.

In the end, Hill made the call: Assume the problem was ice and position the spacecraft to receive maximum heat from the sun. Later tests on the ground proved that he was right — and that the situation had indeed been dire.

“If we had kept going,” Hill explained, “kept flying the way we had been flying and didn’t take the actions that we had, we likely would have lost that shuttle and would have had to send a rescue shuttle up to get those astronauts down from the International Space Station.”

Another option — immediately bringing Discovery home for an emergency landing — could have spelled disaster if the cooling system failed completely.

But while there were disagreements among the mission control team, a clear decision was reached with the best information available.


“As we went through it,” Hill recalled, “you would have been shocked at the lack of adrenaline, the lack of apparent excitement in the room. Because that is exactly what we’re trained for: making those types of decisions in the worst of conditions.”

And sometimes not making the expected decision.

“In those cases,” Hill said, “it is as important not just to be willing to take the action but to be able to say out loud, ‘Here’s what I’m seeing. Here’s why I think it’s bad here. Here’s what I think we need to do. And by the way, some of our experts think that this isn’t the right thing to do. I think it is and here’s why.’ ”

It was hours later that Hill allowed himself the luxury of a few emotions.

“I walked outside and looked up at the sky and my hands were shaking a little bit,” he admitted. “I thought, ‘Oh my God, did that just happen?’ Because at some point you do look back and realize, it’s a damn good thing we made the right decisions because we weren’t even completely sure how much we had at stake.”

In Space, No One Can Hear You Scream ‘Too Much Information!’

Data is, of course, essential to any good decision, especially in a crisis. But today, information overload is a constant fear. So it’s critical for leaders to ensure that their experts have access to smart data, not just big data.

“You need some combination of the right people evaluating the data,” he said, “rather than one person becoming data overloaded. But you also need to have the right tools, the right systems. Anybody that’s dealt with big data in various industries knows that you better have the right software processing that data so you can make sense of it.”

That’s true everywhere as the Internet of Things connects more and more endpoints, though in a modern spacecraft the data deluge is particularly intense.

“There’s something crazy like 800,000 or a million different sensors all taking different measurements on the space station at any given moment,” Hill explained.

“You better have done your homework in the software that’s processing all that data and having the people sit in front of the systems evaluating that data so it’s not going to either mislead you into thinking we don’t have a problem, or fool you into taking a rash action.”

In any industry, Hill stresses that technology change often must happen hand-in-hand with culture change. And sometimes that involves generational conflict, as younger workers expect new technologies and greater responsibility.

“Where we get stuck,” Hill warned, “is we frequently will point back down from the executive level, and we’ll constrain the people below us.”

In the 1990s, for example, Hill said his calls for new technology were sometimes met with, “We didn’t need that to go to the moon. Dammit, why do you guys need that now?”

In a crisis or not, the right balance of experts — old, young, from different disciplines, etc. — is key to good decisions, Hill argues. But the leader has to make the final call.

“Have the management processes to make sure that all of those people are heard,” he clarified, “but let’s make sure they are heard by the levels of management who also have the right perspective and the right experience to make the decision.”

In NASA’s case, Hill said, the flight director simply can’t absorb enough information alone, and needs to delegate decisions to other experts. Of course, in the digital age, that’s true just about everywhere — whether facing a crisis or just staying ahead of the curve.

“As a leader you can do the same thing,” Hill concluded. “Encourage each level of management to push as much of their authority down as possible. But a necessary component of that has to be at each level, asking why. ‘Hey, why is it you think this is the right answer?’ ”
