Post 9r: Collision at Sea: What to Do? Pt 1
Introduction
This post returns to the NTSB and Navy investigation reports of the collision between the US Navy destroyer JOHN S MCCAIN (JSM) and the tanker ALNIC MC (ALNIC) and asks “What can be learned?” In my experience, “not much” if you depend on the findings of official reports. You can do better than this. I’ll show you how.
The JSM was overtaking ALNIC in the Singapore Traffic Separation Scheme (TSS) when the Helmsman incorrectly reported that he had lost control of steering. While the Bridge watch team attempted to understand what was happening and avoid ALNIC, personnel in After Steering took control of steering with a locked-in rudder order of 30 degrees to port. This turned the ship into the path of ALNIC and the two ships collided. Ten JSM sailors died when their berthing compartment was crushed by the impact of ALNIC’s bow, forty-eight were injured, and the ship sustained over $100 million in damage.
The two most dangerous casualties for a ship at sea in peacetime are fires and a collision with another vessel. For avoiding collisions, the two most important ship control variables are speed and steering. In restricted waters like those of the Singapore TSS, the risk from losing either control is magnified by the presence of many other vessels and navigational hazards such as shoal (shallow) water.
Learning from investigation reports is fraught. Based on my experience of reading over a hundred of them, most do not help you learn what to do differently. I noted this in my Preoccupation with Failure post. For the report writers, learning does not seem as important as documenting the calamity and assigning responsibility. This is just the way things are. The responsibility for extracting meaningful lessons from an investigation report and deciding what to do differently falls upon its readers.
This post will not critique the conclusions from either report, nor will it describe what the personnel on JSM should have done. Interested readers can study the reports and decide for themselves. The conclusions of the NTSB Marine Accident Report can be found on pages 37-39 of NTSB/MAR-19/01 PB2019-100970, https://www.ntsb.gov/investigations/accidentreports/reports/mar1901.pdf. The conclusions of the Navy Report are on pages 59-61, https://www.doncio.navy.mil/FileHandler.ashx?id=12011.
This post begins with what I consider the main issues associated with the collision. It continues with recommendations for what any watchstander can do to influence some of the main issues. In subsequent posts, there will be recommendations appropriate for junior officers, department heads, and Commanding Officers and other senior leaders.
By “main issues,” I don’t mean errors. I mean what was going on behind the errors, like the blind spots the crew had about deficiencies in the steering control system manuals, the appropriate operating mode of the system, and the state of their training. Blind spots are critical deficiencies in processes or equipment that are not visible. They are things you aren’t aware that you don’t know. You can’t fix a blind spot like “watchstanders lacked a basic level of knowledge on the steering control system” (Navy Report, p.59) if you don’t know it exists. Even if you made yourself an expert on your ship’s steering control system, that’s no protection against your other blind spots.
My recommendations form a pyramid of High Reliability Organizing: junior officers should consider the actions appropriate to their role and responsibility and all the actions I recommend for any watchstander. While it is not possible for department heads and the Commanding Officer to take all the actions recommended for everyone below them, they must be constantly evaluating how well their subordinates are doing them. The best way to do this is through audits and surveillances.
Main Issues Associated with the Collision
In summary, the collision sequence was set in motion by one order: transferring control of thrust while JSM was in a congested waterway and overtaking ALNIC, only 600 yards away. This order and the way it was carried out set the stage for the impact of everything else that went wrong. Latent conditions (post 9L) (Reason, 1997), like the lack of formality in changing steering control stations, complicated the watchstanders’ efforts to understand the problem and make evasive maneuvers to avoid ALNIC. The operational status of the steering system (backup manual mode, used because the CO erroneously believed it was more reliable) and the training state of the watchstanders involved in ship control were additional latent conditions.
* Reason, J. (1997). Managing the risks of organizational accidents. Routledge.
The main issues, in chronological order, associated with the collision were:
Lack of proficiency with steering casualty response (watch team on duty, possibly others)
Routinely operating the steering system in a lower reliability mode (CO, unidentified others)
Poor understanding of steering system operating modes and indications (many personnel)
No procedure for transferring steering control between stations
Not using available indications to verify conning orders (Bridge team)
Entering the TSS with the most inexperienced OOD (CO, XO, Senior Watch Officer, the OOD’s Department Head)
Stationing the Sea and Anchor Detail *after* entering the Traffic Separation Scheme (CO)
The simplest lesson from the collision is: don’t enter a traffic-congested restricted maneuvering situation at high speed at night with a Bridge team that doesn’t understand the helm control system and isn’t proficient in a loss of steering casualty under the direction of the organization’s least experienced operator without setting the Sea and Anchor Detail. This is true, but not helpful. The investigation reports are no better, contaminated as they are by hindsight bias, platitudes (“loss of situational awareness”), and vagueness. Attributing the cause of the collision to “a lack of effective operational oversight of the destroyer by the US Navy” (NTSB Report, p. viii) or “poor judgment and decision making of the Commanding Officer” (Navy Report, p.59) provides no insight about what to do differently.
Organizational accidents are like Tolstoy’s unhappy families: “All happy families are alike; each unhappy family is unhappy in its own way” (Tolstoy, 1995, p.1). In the same way, each organizational accident is a unique collection of things gone wrong. What follows is a set of recommendations that anyone at any level of the organization can do to make things safer.
* Tolstoy, L. (1995). Anna Karenina (A. Maude & L. Maude, Trans.). Wordsworth Editions.
What Anyone Can Do
First, read incident reports differently. When reading a report for the first time, don’t read from beginning to end, because you won’t be able to make sense of the sequence of events, which is often a jumble of useful and useless detail. The most useful information is almost always at the end of the report, which usually includes a determination of causality and sometimes recommendations.
What is important and what is useless detail in an investigation report? Most incident reports omit essential context and include unimportant facts. For example, the Navy report noted that “the Officer of the Deck, in charge of the safety of the ship, and the Conning Officer on watch at the time of the collision did not attend the Navigation Brief the afternoon prior” (p.60). Of course attendance provides greater awareness of the risks, but I seriously doubt that having “maximum awareness,” as the report stated, would have made any difference.
After you understand the causes and recommendations, go back to the beginning of the report and read the sequence of events. Instead of reading what people did and thinking, “Those idiots!”, think about how their actions made sense to them at the time. Think about what is left out of the report that might be relevant, like the difficulty of entering a TSS at night! For example, the CO rejected the recommendation of three officers who wanted to schedule the Sea and Anchor Detail earlier. If you believe that the safety of the ship or your career is at stake and the CO rejects your recommendation, what will you do? The report of the collision gives you the opportunity to carefully consider this ethical dilemma before you face it. Don’t waste this opportunity by glossing over it. After you’ve thought about it, discuss it with your peers to get their opinions.
Second, the reports noted that “multiple bridge watchstanders lacked a basic level of knowledge on the steering control system” (Navy Report, p.59). You do not have to accept the training you receive and the qualification process of your organization as the final word on *your* level of knowledge. Qualifications reflect the *minimum* standard. Don’t be satisfied with that. Instead of thinking, “I’m qualified so I’m done learning,” think of your qualification as a license to learn more, to become an expert. Develop a deep understanding of all the equipment associated with your watch or area of responsibility.
Technical and operations manuals exist for nearly all equipment on Navy ships. Some are really good and some are horrible. Their quality matters less than making the effort to understand their contents and make corrections, when necessary.
From my earliest days in the Navy, I studied all the technical documentation that pertained to the equipment for which I was responsible, both on and off watch. More than once, my careful study found errors or inadequacies. When I thought the technical manuals or system diagrams were bad, I checked my conclusions with people I trusted. I traced systems. I consulted experienced operators and I submitted deficiency reports. More than once, I found contradictions in manuals or information in a technical manual that was omitted from procedures.
The NTSB found that the “steering and thrust control written operating procedures … were inadequate” (NTSB Report, p.38, Finding #24). Don’t wait until after a collision for outside investigators to identify problems with your technical documentation. If you and your chain of command are convinced that a manual or procedure is inadequate, you should create supplemental guidance approved by the Commanding Officer and notify the technical authority by message. I did this once and the technical authority modified its original guidance quickly. You can request help through your chain of command, who can elicit assistance from the staff at your Immediate Superior in Command (the Commanding Officer’s boss). This has the added benefit of alerting other ships to the problem.
Third, know the casualty procedures for your watch and review them constantly. The JSM OOD did not follow the loss of steering casualty procedure, reducing speed to 10 kts instead of “bare steerage way” (NTSB Report, p.14). Frequent review of the casualty procedures and standing orders applicable to your watch should be part of your personal continuing training. Since the qualification requirements reflect the minimum standard for knowledge and capability for a watchstation, you should be able to pass a test of your level of knowledge at any time.
Fourth, know how all equipment at your watchstation responds to control signals. The “John S McCain bridge team was not monitoring the Lee Helmsman’s response to orders and therefore did not recognize that the throttles were mismatched” (NTSB Report, p.27). Check system response for every order you give and every action you take. Know what the response should be before you check for it.
Fifth, use and believe your indications. Neither “the CO, XO, OOD, JOOD, JOOW, conning officer, BMOW, helmsman, nor the lee helmsman” used available bridge displays to confirm the reported steering casualty (NTSB Report, p.29). Know the indications for all casualties. Compare indications across redundant or backup displays.
Sixth, in training mode (before qualification), ask “why?” a lot. You have a license to do so. People expect it. Don’t be afraid to admit ignorance even after you’re qualified. In High Reliability Organizing, the stakes are too high to remain ignorant.
Finally, speak up (Weick, 1995)! Recognize that senior leaders make a lot of decisions. Some of these decisions will be bad for reasons too numerous to list here. Take responsibility for asking questions when you can do so without endangering yourself and others. Rickover advised nuclear operators to act as if the fate of the world depended on them. Don’t assume that your boss knows everything. In some circumstances, you’ll have more information than he or she does. When in doubt, speak up to compare your understanding with others. You might be wrong or you might be right. Either way you’ll learn something from the interaction.
* Weick, K. E. (1995). Sensemaking in organizations (Vol. 3). Sage.
Conclusion
I used my list of the main issues associated with the collision to formulate recommendations applicable to a person at any level of the chain of command. These issues can exist on any ship at any time. In subsequent posts, I will provide recommendations that are appropriate for junior officers, department heads, and Commanding Officers and other senior leaders.