
Non-Punitive Reporting Policy


N8


Should a Non-Punitive Reporting Policy have a time limit? I see the thought behind this, but I'm not sure where I stand on it. I'd like to hear some thoughts (Don, I'm looking forward to your 2 cents).

------------------------------------

Levels of Reporting Protection

There are multiple levels of protection for Safety Report submissions. Safety Reports that are received within 24 hours, and are in accordance with the intent of the policy, will be reviewed with the highest level of protection for the employee. Safety Reports that are received after the 24-hour period will still be reviewed by the Safety Office; however, the level of protection may be diminished based on the nature of the occurrence. This policy is vital to mitigate a “wait and see” attitude towards incidents that may be punitive in nature. Reports that are received after a third-party report indicating that an occurrence has happened will be dealt with at the discretion of the appropriate manager.

....

It is important to note that if a written report is not received within 24 hours of a verbal report, full immunity cannot be guaranteed. The intent of this policy is to prevent a “wait and see” attitude towards punitive acts.
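For what it's worth, the tiered logic in that excerpt boils down to something like the sketch below (Python, purely illustrative; the function and level names are just my reading of the text above, not anything from the actual policy):

```python
# A minimal sketch of the tiered-protection logic in the policy excerpt, as I read it.
# Function and level names are mine, not taken from any actual policy document.

def protection_level(hours_since_occurrence: float,
                     third_party_report_first: bool) -> str:
    """Return the (hypothetical) level of review a report would receive."""
    if third_party_report_first:
        # Occurrence already surfaced via a third party: manager's discretion.
        return "manager discretion"
    if hours_since_occurrence <= 24:
        # Within the 24-hour window: highest level of protection.
        return "full protection"
    # Late report: still reviewed by the Safety Office, but protection may be reduced.
    return "diminished protection"


print(protection_level(6, False))   # -> full protection
print(protection_level(40, False))  # -> diminished protection
print(protection_level(10, True))   # -> manager discretion
```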


Well, it seems to me that if you know of an issue and wait more than 24 hours to report it, then you are bordering on negligence and are subject to punitive measures. One would like to think that as soon as you see an issue you will report it immediately.

It would be a real shame if you waited and something happened because no mitigating measures were taken.


N8:

I don't think it's possible (let alone wise) to assess or judge a reporting system in the abstract. I don't think an answer to the question would be meaningful or important without reference to an existing, working system. I say this because the history of the organization, the leadership, the relationship between management and employees, the regulatory oversight, and the culture of, mainly, the flight safety, flight operations and maintenance groups all matter.

A system is always "in process". A system just starting out needs to behave differently than a mature system which is known by all to be effective and important to the safety of the organization. In short, there isn't one standard for such systems, nor should there be.

An "immature" system which is in the process of implementing a safety reporting system must build trust over time. To do so it must implement a safety reporting policy which makes it safe at all times under all circumstances for all employees to report mistakes, an untoward event, an observed risk or hazard or other unsafe occurrence. Flight safety reporting systems are known to work best when such system is unconditional and known to be so by most employees, who would normally rarely interact with such a system.

It takes a long time, without management or regulator mistakes (such as using safety information to punish an employee or prosecute a case, either internally or legally), to build such trust and confidence in a system: that no matter what, one can report or self-report and know that the information will be strictly de-identified as far as possible, beyond the reach of the regulator, beyond the reach of flight operations and of any others whose positions carry the authority to affect an employee's livelihood and well-being within the organization.

That said, most of us live in what is essentially an English common law system in which society exacts punishment for wrongdoing.

The push-and-pull between these two societal facts and social systems is at the heart of the question you've asked. The question is most often asked after an accident where, sometimes even before the investigation is finished, a crew is "blamed" for the accident and lawsuits are launched, without "finding out". In fact we expect that kind of litigious behaviour in societies, both western and eastern.

Each one of us can probably cite a case, not necessarily an aviation one, where such "prosecution-of-the-perp" is justified in our own minds. It's a very old tradition, while self-reporting is, by comparison, brand new and rarely if ever experienced in any other sector of society.

So we are asking a great deal of a very old system of legal notions versus new notions of risk and accidents. In a reporting system of this kind, the notion of "forgiveness" is invoked in order to support a greater good... learning from mistakes and preventing loss of or damage to life, property and reputation.

We have a litigious society, and many have argued that that is the very reason aviation improves at all. There may be some validity to that claim, but I haven't seen it effectively argued yet. I have seen the results of a reporting system, however, and know through experience that it is a factor in preventing incidents, and even an accident or two...

These are powerful factors nevertheless. It is a very tough call for those accustomed to immediate responses to an event. "Blame" often comes from childhood and we keep it, sometimes forever. In the industry, the twin notions have an ebb and flow depending upon whether the society is a "law-and-order" society or a slightly more liberal one, (as most are today, except maybe for Arizona...).

My own thought is that in a mature safety reporting system, where long-established cultural behaviour and standards (QANTAS, for example) mean that safety reporting is routine and expected, an inordinately "delayed" report which comes to light only because third-party circumstances make it so would be seen and handled differently than it would in a less mature system still trying to build trust.

Where "one is expected to....", (name the behaviour), non-compliance is viewed differntly than in a less well-developed system. The reasons I think are obvious.

To keep this short: in a new system (ten years old or so... just an educated guess), the system should be entirely open to the occasional missed report, with absolutely no comment, repercussions or record of tardiness. An enlightened, "gentle" reminder should be sufficient if someone has missed reporting. The person who has been "reminded" and who then submits the report should be thanked, and that would end the matter, period. Avoiding a report signals a deeper problem, and if the system is otherwise working well, then the response and process would be individually oriented. That doesn't mean punishment, but the notion of punishment will be interpreted differently where one may be defensive, as opposed to having other reasons. Clearly, however, this is not at the systemic level, and enlightened carriers have processes and agreements to which both the association and the airline can turn if indicated.

But discipline, enforcement, "time off", etc. as a response? Never. Moon the Loon is absolutely correct. If the airline desires to punish, then it is signalling that it doesn't want to know something and would rather focus on the individual for not reporting than on what might be under the surface. Such responses usually involve internal politics or cost, often both.

In a mature system where reporting behaviours are the norm, the response can be more flexible, perhaps a bit firmer even, but it must be accompanied by an open mind as to why the report was missing in the first place. It could be as innocent as a two-week cycle where the employee simply had no earlier opportunity. But it could be the first signs of a changing employee safety culture.

Either way, in either system, one does not want to leap to a conclusion too quickly, as one is always going to be "shaded" by one's own views and never by the views of the other.

The notion of a reporting system being a "get out of jail card" is kept alive by those who are stuck in cultures which would just as soon punish the individual as fix their system. In fact I would submit that they probably think their system is pretty darn good and doesn't need fixing..."it's just so-and-so who usually screws up" and if we could only get the goods on him/her...

I strongly suspect that if we were somehow able to count the actual occasions when a pilot used a report to cover up an intentional act done without regret, we might honestly see two or three out of the tens of thousands of aviation safety reports received within airlines, especially given that many know that FOQA data is being (or at least should be) examined by association pilots on a daily basis.

Sometimes the plain shock of embarrassment and chagrin that one could do something "so stupid" takes a bit of time to get over, but in a mature reporting system, I expect that would not happen very often. One kicks oneself, and writes.

I hope this is useful, N8 - Don

John S., regarding your question, which is a good one (what is meant by "safety"?)... I think we can't just invoke the notion (or the word) "safety" and be automatically legitimate in our observations or demands - we can't play "the safety card" and expect that things will fall to hand. There's a business to run as well. The key is keeping the balance between the resources needed to continue the operation and the resources needed to ensure that the operation is protected, primarily through knowledge and data, and that comes through flight safety programs such as the one we're discussing here.

For some, safety is the absence of events which impede production. For others it is the absence of injury and damage, and for others still, it is more narrowly conceived as "no fatal accidents".

To me, safety is watching trends in the data for precursors to an event or possibly an accident while examining outliers and communicating both to operations and flight safety departments. Sometimes an operation comes apart badly and a discussion with the crew (with an association pilot, not airline management) helps advance safety more than a "month of reports"!

I hope that's also useful.

best,

Don


Thanks for your thoughts folks.

This process will require much growth by all users.

Nate

Nate...it'll never be "there", or "finished"...it's always "arriving". That means it requires constant support and pushing towards non-punitive, unconditional reporting.

That doesn't mean mistakes and intentional avoidance will not occur. But the occasional mis-use of this kind of system will never come close to outweighing the benefits of knowing. There are no supportable reasons in aviation safety work to put a clock on reporting with a scaled punishment after nn hours or days. Politics and the bureaucracy don't count in flight safety work but these areas are where the resistance will come from for the reasons mentioned above.

There are systems in place now that are working in the way I describe, with the growth you mention already well underway. I know smaller operators which aggressively support such reporting systems. Today there is superb, hosted software which provides a detailed, robust system with protections and communications built in to achieve the goals of rapid dissemination and confidentiality, or levels of knowing depending upon position. These same systems provide "closure" processes in which the issues raised are reported as dealt with and closed. These systems can also run meetings (taking minutes, recording attendees, etc.) where safety problems are raised, discussions held and decisions made. The process is moving towards the realization that politics, power, ego and organizational structure can impede flight safety even as these very human factors remain unavoidable realities.
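To make the "closure" idea concrete, here is a rough sketch of the kind of record such software keeps (Python, illustrative only; the field names are my own, not any particular vendor's schema):

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

# Illustrative sketch only: field names are invented, not any vendor's schema.

@dataclass
class SafetyReport:
    report_id: str
    submitted_at: datetime
    summary: str
    deidentified: bool = True             # reporter identity stripped before dissemination
    visibility: str = "safety_office"     # level of knowing, depending upon position
    actions: List[str] = field(default_factory=list)  # follow-up actions recorded against the report
    closed_at: Optional[datetime] = None  # set once the issue is reported as dealt with and closed

    def close(self, when: datetime) -> None:
        """Mark the report as dealt with and closed."""
        self.closed_at = when


# Example: a report is raised, an action is recorded, and the item is closed.
report = SafetyReport("R-0001", datetime(2011, 5, 2), "Unstable approach, go-around flown")
report.actions.append("Reviewed at monthly safety meeting; briefing note issued")
report.close(datetime(2011, 5, 20))
```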

Still, we see, mostly on another aviation rumour forum, comments (it's a rumour forum...) that some operators prefer to punish rather than know, perhaps under the illusion that not knowing preserves plausible deniability. It doesn't and can't, of course, and that gets proven time and again after an accident, but the point is simple: knowing, in aviation, is safer than not knowing, even if someone temporarily looks bad.

That's why I think the owners of smaller operations, and the CEOs and senior directors of operationally related departments in larger organizations, should know as much about the principles of aviation safety as the organization's flight safety people doing the work. By that I mean understanding why safety reporting systems and data programs work as well as they do despite their costs, their need for resources and staffing, and their levels of confidentiality. That's the owner's or CEO's business, and they need to lead from a position of understanding. They are the ones whose actions and words let the troops know what their organization's priorities are.

Don

