Just Culture Stories: The Good, the Bad, and the Ugly

April 20, 2018

The author tells three stories about voluntary reporting and just culture. Fitting for this month’s international theme, the stories reflect cultures from North America, Europe, and Asia.

A colleague from a U.S. university called and asked me for example stories related to the implementation of just culture. We spoke for a while as my memory and inclination for storytelling churned up a few examples. At the end of the conversation, the colleague said something like, “Bill, those stories are great; you should write them down.” Read below and decide for yourself whether they have value to you. The stories are as true as my memory permits.

Generally Recognized Attributes for a Just Culture

The term “just culture” is a household word in aviation safety. The concept advocates responsibility and accountability for each worker. It extends that accountability to the entire organization. Sometimes error is a function of human frailty, or even misfortune. Sometimes the root cause of an error goes beyond human performance and rests with the work environment, the expected activity, and the resources necessary to complete work safely, effectively, and efficiently.

Elements of just culture usually include clear communication and trust between labor and management, a shared value of safety, a shared desire to know about errors and to prevent recurrences, a system to report and investigate events, and a cooperative National Aviation Authority. The just culture policy is usually documented and well understood. Everyone must be “on board” to achieve a just culture.

Just culture programs can be complemented with investigative tools like Boeing’s MEDA (Maintenance Error Decision Aid) and analytic tools like the Outcome Engenuity Just Culture Algorithm or the Baines Simmons FAiR System. Regulatory programs like the Aviation Safety Reporting System (ASRS), the Aviation Safety Action Program (ASAP), and other voluntary disclosure programs support just culture.

With that said, this article is about stories. It is not a just culture lecture.

The Good Story: Just Culture Before It Was Cool

A large German engineering company had expanded its MRO business around Europe (West and East), into the Americas, and then to Asia, both through organic growth and by acquiring existing MRO facilities. Typically, the German company would rotate executives from the parent company into local management roles, each held for a few years. This story is about the first rotation of a German executive into the Asian work environment.

During one of the German executive’s first days on the job at a newly acquired MRO facility, a significant maintenance error occurred. An engineering crew damaged a large engine cowling while removing it with the hangar lift. The damage to the cowling was extensive. All employees fully expected termination of the lift operator, who appeared to be the most responsible party. It was likely that other licensed engineers would also lose their long-time jobs.

The new German expatriate executive took the lead on the investigation. There was no explicit just culture policy; the event preceded popular adoption of the concept. Immediately, the executive looked at the work environment, how the workers were trained for the engine cowl removal task, the clarity of the procedures, the adequacy of support equipment, and more. He and his team concluded that aspects of the work environment – procedures, training, the human factors approach, and the like – had not positioned the workers for success and that the maintenance error had been an honest mistake. In a quest for justice, the executive did not fire anyone. The team addressed all the contributing factors and installed a replacement cowl.

Later on, the German executive asked the same engineer to operate the lift for the new cowling installation. That was just! The entire workforce learned of the “damaged engine cowl event” and the fair treatment of the worker. That show of just culture influenced the new German-Asian cooperation in a manner that has had an extraordinary long-term impact on safety and efficiency.

The moral of this story is that you do not need a lot of processes and procedures to achieve justice. While written policy is very important, a just attitude matters most.

The Bad Story: A Small Error During Training Can Be Costly

This story goes back nearly eight years, to when Airlines for America (A4A) cooperated with the FAA to design, develop, and implement a ground/maintenance version of the Flight Ops Line Operations Safety Audit (LOSA). The maintenance LOSA development process and all related products are available at www.humanfactorsinfo.com.

LOSA is a peer-to-peer assessment that takes place during normal operations; it does not have to be triggered by an event. LOSA permits observers to identify strengths and weaknesses in the organization. Observations are absolutely nonpunitive because no names or identifying characteristics are recorded. Using a threat and error management framework, the observer records safety threats and notes whether or not the workers are managing them.

Training is critical for LOSA programs to succeed. All employees must understand the LOSA concept. The general employee population must know that LOSA observations are nonthreatening. LOSA observers require about eight hours of training to ensure that protocols are followed and that data are reasonably consistent among observers.

The negative story occurred during initial testing of the LOSA observer training. The LOSA team and trainers were preparing to launch LOSA at a package carrier for ground observations. Starting the LOSA test required extensive deliberations between labor and management, further complicated by the FAA’s involvement in LOSA. It took nearly nine months for all parties to agree to the LOSA test.

Because the program was still in testing, the critical LOSA training had not been delivered to all employees. The workforce merely saw people with clipboards walking around ground operations. That is seldom a welcome sight. The labor leaders told employees not to worry because it was a test and, in any case, no one would record names.

As fate would have it, one of the LOSA observer trainees noticed that a worker was not wearing protective shoes. Of course, that is a threat to worker safety. It was a valid LOSA observation, and the observer noted it. Coincidentally, the observer trainee was a friend and next-door neighbor of the manager of that area. During a coffee break, the trainee saw the manager in the hallway and casually mentioned the improper-footwear observation. The manager proceeded to send the worker home for the day without pay! That small incident negated nine months of planning and set the LOSA implementation back at least an additional year.

The lesson learned is that you cannot implement a critical program halfway. The observer was not ready, the manager was not ready, and what little the workforce knew was wrong on the very first day. That was bad!

The Ugly Story: Be Sure that the Policy is Clear to Everyone

Just culture implementation is not without growing pains. As early as the mid-1990s, some airlines were listening to James Reason. Early adopters saw the safety, efficiency, and fairness merits of a voluntary reporting system based on just investigations. One such carrier decided to test the voluntary reporting concept. It was a large company with a powerful labor union. The top labor leaders and senior managers saw the potential benefits. When an event occurred, everyone wanted to determine the root cause and find corrective actions.

The company went to great lengths to establish reporting procedures and just culture policies. The union and management jointly delivered training to everyone. Since it was a radically new program, not all managers were convinced of its value, and some were concerned that it might lessen accountability. Many workers were fearful that a reported error would trigger disciplinary action.

Most just culture champions were at corporate headquarters where the largest repair facility was based. Leadership decided that the initial implementation would be at a satellite repair facility. The reasonable expectation was that it would be easier to ensure 100 percent training coverage for all of management and labor at the smaller facility.

Very early in the just culture implementation, there was a maintenance event that required expensive rework. The workers made a mistake. The supervisors and middle management understood the error and did not take action against the errant workers. When the top manager at the satellite facility saw the cost of the error, he took immediate disciplinary action against not only the workers but also the managers who had followed the just culture policy.

The union at all company facilities justifiably pulled out of all just culture participation. It was years before confidence in just culture was restored. That was ugly!

Summary

When it comes to voluntary reporting, there are many good, bad, and ugly stories. As you read this article, I am sure you thought of examples from your own experience. As I wrote this article and this summary, I thought of many more. Please let me end on a positive note. I went to an ASAP Event Review Committee meeting. It was like a courtroom hearing. A representative of the errant mechanic admitted that the mechanic did not follow a procedure; the mechanic had reported the error himself. In this case, the company representative felt that there should be a stiff penalty. The labor representative felt that a mild letter of reprimand would be acceptable. The FAA member held the final vote needed for the unanimous decision required to take action against the employee. He firmly stated: “I worked at an airline just like this one, with the same aircraft, for 20 years. Nearly everyone ignored that procedure. Let’s stop blaming the worker and fix the procedure.” The end!

About the Author

Dr. Bill Johnson | Chief Scientific and Technical Advisor for Human Factors in Aviation Maintenance, FAA

“Dr. Bill” Johnson is a familiar name and face to many industry and government aviation audiences. Johnson has been an aviator for over 50 years. During his career he has been a pilot, mechanic, scientist/engineer, college professor, and senior executive. That includes 16+ years as the FAA Chief Scientific and Technical Advisor for Human Factors.

Dr. Bill has delivered more than 400 Human Factors speeches and classes in over 50 countries. He has 500+ publications, videos, and other media that serve as the basis for human factors training throughout the world.

Recent significant awards include the FAA “Charles E. Taylor Master Mechanic” award (2020), the Flight Safety Foundation - Airbus “Human Factors in Aviation Safety Award” (2018), and the International Federation of Airworthiness “Sir Francis Whittle Award” (2017).

In 2021, Johnson formed Drbillj.com LLC. In this new venture, he continues to bring decades of human factors experience to aviators worldwide.