Trust in the Face of Adversity

The afternoon of February 6th was routine – I made a fruit smoothie, chatted with my mother a bit, and was getting ready to go on a walk around the neighborhood.  The monotony was broken when my sister came to the table with her laptop in hand, a streaming website on the screen.  The stream was of SpaceX’s test launch of the Falcon Heavy rocket, carrying one of CEO Elon Musk’s luxury cars set to orbit the sun.  My mother and I had not heard of the event, and it was fascinating to see in real time.  It also made my mother very nervous.  She remembered what happened with Challenger over 30 years ago, and even though the Falcon Heavy rocket was unmanned, there was still a sense of tension that was hard to shake.  Thankfully, the launch was a success, and we left the table knowing there was now a car orbiting the sun.  After that, I knew that at some point during the semester I had to read Kumar and Chakrabarti on bounded awareness and the Challenger disaster.  This blog post will take a look at that article in addition to two others on KM in the context of disaster: Chua’s “A tale of two hurricanes: Comparing Katrina and Rita through a knowledge management perspective,” and Ibrahim and Allen’s “Information sharing and trust during major incidents: Findings from the oil industry.”

The morning of the Challenger launch was cold – far colder than any previous launch – and there was concern among the engineers that the ship’s O-rings would not function properly.  They implored management to delay the launch, but despite their concerns the launch proceeded as planned.  As the article points out, the managers were not merely businesspeople with no working knowledge of how rockets work.  On the contrary, many had likely been engineers themselves at one point, and were highly knowledgeable people.  In this case, the managers were bounded by their own tacit knowledge, gained from countless successful launches in the past.  They saw the information presented by the engineers through a different lens, and while not dismissive, they did not consider it as relevant as the engineers made it out to be.  To me this also indicated a lack of trust between the two parties.  The managers preferred to trust their own existing knowledge over that of the engineers, even though the engineers spent a much greater deal of their working time around the faulty equipment that ultimately caused the failure.

The next article I looked at examined a more recent disaster, Hurricane Katrina, and compared it to the far less devastating Hurricane Rita that occurred shortly after.  The federal response to Katrina was slow, and for a while the various support organizations involved did not appear to be coordinating very well.  State and federal agencies acted independently, with multiple people seeming to be in charge.  This stands in stark contrast to the Rita response, where cooperation between agencies was high and they trusted each other enough to do their jobs.  That trust had not been established when Katrina first hit, but going through such a disaster together no doubt provided a powerful learning experience on which to build it.  While other factors, such as the politics of the time, cannot be ignored, trust and the better organization that came with it played a significant role in the markedly quicker response to Rita.

Often operating in relative isolation with dangerous machinery and chemicals, oil rigs must be well prepared to deal with a crisis.  In an environment where a mistake or misdirection could be life-threatening, trust between individuals is more important than ever.  Such was the subject of Ibrahim and Allen’s study.  One case study in particular was highlighted: the Mann Gulch disaster, in which 13 firefighters died.  Communication broke down between the firefighters and the foreman, and orders were not followed.  The firefighters did not know the foreman, and what little communication occurred between them was not clear and evidently did not inspire confidence.  It’s not as if the firefighters were inadequately trained, but there was no time to build trust and assure one another that they knew what they were doing.  Hindsight is 20/20, and we can talk about how the instructions could have been communicated differently, but it is at least clear that, once again, trust is an essential component of successful information transfer.

Stephen summed up the Ibrahim article with:

“In disasters, knowledge can mean the difference between life and death, and so can the effective transfer of that knowledge.”

Especially after reading the three articles highlighted here, I’m inclined to agree.  Not only does knowledge have to be communicated effectively, but also very quickly.  Without trust between the acting individuals and agencies, there is less incentive to follow on-the-fly instructions, or there will be hesitation where hesitation cannot be afforded.  There is already so much uncertainty in an active disaster situation, and the consequences of communication failure can be fatal.

Chua, A. Y. K. (2007). A tale of two hurricanes: Comparing Katrina and Rita through a
knowledge management perspective. Journal of the American Society for Information
Science and Technology, 58(10), 1518-1528.

Ibrahim, N. H., & Allen, D. (2012). Information sharing and trust during major incidents:
Findings from the oil industry. Journal of the American Society for Information Science
and Technology, 63(10), 1916-1928.

Kumar J, A., & Chakrabarti, A. (2012). Bounded awareness and tacit knowledge:
Revisiting Challenger disaster. Journal of Knowledge Management, 16(6), 934-949.

10 thoughts on “Trust in the Face of Adversity”

  1. Hi Alex, I’ve read two of the three articles (Chua and Kumar) and the thing that struck me about them was that there were lessons to be learned (Hurricane Katrina informed many Hurricane Rita actions), but not enough to completely solve the problem. Like peeling an onion, the first set of issues led to new problems. The patterns I saw in that article somewhat map to the management actions in the Challenger incident – management does not see the need for alarm at the same level of relevance – so you end up with (mis)informed actions. That trust is part of the equation is a new wrinkle – I’ll suggest you take a look at the Kothari article, wherein he mentions that a team has to trust that someone else is competent enough to augment the team and help inform public health policy because of their knowledge (social capital, perhaps? I hadn’t thought about this aspect until now).

  2. Btw, as a huge space fan (I used to work on the Space Shuttle program), I thought the single coolest thing of the SpaceX launch day was the simultaneous landing of the side boosters. It was like something out of science fiction. Too bad the center booster had a bad hair day – it would have been a huge success for SpaceX overall – a real “hat trick”.

  3. Your thoughts on these articles underline what I’m thinking about these days…how do we “know who” in the organization has the expertise that should be valued the most? I have been doing some reading about “expert directories” in big companies, a fascinating idea to me. Currently, I don’t work in an organization big enough to have invested in something like this, but I have been toying with the idea that it would be a good idea to execute some kind of “knowledge audit” that would be accessible, not only in a crisis situation, but also to help take advantage of opportunities that arise when you don’t expect them.

  4. I used to work at Booz-Allen, which is a global consultancy firm. They had a system called Knowledge OnLine, which we all had to feed our project experiences into in the form of a resume. My resume was 10 pages by the time I left.

    I hadn’t thought about it in decades and just came across mention of it while working on the paper, in an article by Lynne Markus.

    1. Hi Jenn, sorry to be slow. It was sort of both a chore and cool. It was a chore because annually we had to update the big resume with new project information. It was cool because you could count on new projects coming your way. If you were techy, you could backtrack the people and find who had really cool projects and offer to work with them again.

  5. I haven’t read the Kumar et al. article on the Challenger disaster, but it is next on my list. I often encounter that sort of bias, where people trust their own tacit knowledge bounded by past experience. In some situations, the incidence of this phenomenon has been so bad that it honestly seems like the other party and I are speaking two different languages. Very. Frustrating. There must be balance – our own experiences obviously inform what we know and understand, but the key is being open to hearing other people’s thoughts and ideas and being willing to incorporate them into our personal body of knowledge.

    1. It’s hard to break down those walls of experience. The trust issue is important because it takes much, much more than good information or sound reasoning to modify a person’s world view, our own included.

      1. I lack a good word for those moments. I call them ‘epiphanies’ only because they are a bit like a religious experience.
