
Can someone help me get it?


Mitch Cronin

Recommended Posts

Mitch;

From The Fallible Engineer, November, 1991 (at http://www.uow.edu.au/arts/sts/sbeder/fallible.html )

"Engineer Barry McMahon has found his clients believe that a factor of safety implies certainty plus a bit more and they are far more concerned with the risk of conservative design than they are with other sources of risk. Conservative design tends to be more expensive and so there is always pressure to reduce factors of safety.

"The factor of safety is itself a heuristic which changes with time and circumstance. For a factor of safety to be effective the means of failure must be known and the cause of the failure determinable by experiment. All engineering structures incorporate factors of safety and yet some still fail. When this happens the factor of safety might be increased. However when a particular type of structure has been used often and without failure there is a tendency for engineers to suspect that these structures are overdesigned and that the factor of safety can be reduced. Petroski comments, "The dynamics of raising the factor of safety in the wake of accidents and lowering it in the absence of accidents can clearly lead to cyclic occurrences of structural failures." He points out that this cyclic behaviour occurred with suspension bridges following the failure of the Tacoma Narrows Bridge.

"This process of fine-tuning or cutting safety margins to reduce costs in the face of success is not confined to structural engineering and is present across all engineering disciplines. William Starbuck and Frances Milliken, researchers at New York University, have studied the Challenger Space Shuttle disaster and concluded that the same phenomenon was present there."

The Starbuck paper on Challenger is worth reading. It can be found at:

http://pages.stern.nyu.edu/~wstarbuc/mob/challenge.html

You've already been exposed to Vaughan's book. The above was written long before, but adds to the notions expressed about risk management and safety in technologically based enterprises. Airline work is certainly nothing close to Shuttle work, but that doesn't mean the principles aren't the same in terms of mindful behaviours. NASA, too, was totally driven by economic concerns until 1986. They're currently having to "re-visit" some experiences and re-learn some lessons.

I would like to introduce the concept of "stochastic" as well. I found this concept in some reading in Gregory Bateson, (Mind and Nature) in reference to biological models. But I thought the notion was useful in other areas as well.

It comes from the Greek root "stochos," a target or mark; the related verb means "to aim," usually with an arrow. It refers specifically to how the arrow finds its target and what the results are.

The concept is neither obscure nor complex (I've seen it used in economic and political arenas... not sure why). A quick example can show this:

An outcome may be termed "stochastic" when it tends towards or favours a certain outcome, but there is a large element of unpredictability involved.

So. A kid is running across a freeway. He succeeds on the first try and many tries after. Sooner or later however, we know what's going to occur.

The outcome is quite certain, but we don't know how it will get there or when, except we know some factors are in place which almost assure the outcome. Pursuing the example, the patterns are obscure in terms of "which car", "which lane" etc until they begin to form fairly rapidly. All this is obvious (and unpleasant) but these processes are at work every day in risky technologies.
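The freeway example can be made concrete with a toy calculation. The sketch below assumes a made-up per-crossing risk figure purely for illustration; the point is only that a small chance of disaster, repeated, makes the aggregate outcome almost assured while saying nothing about which try, which lane, or which car.

```python
# Toy model of the freeway example: each crossing almost always succeeds,
# but repeated exposure drives the chance of "nothing happened so far"
# toward zero. The per-crossing risk p is an invented number, not data.

p = 0.001                 # assumed chance of disaster on any one crossing
survive_one = 1 - p       # chance a single crossing goes fine

for n in (10, 100, 1000, 5000):
    survive_all = survive_one ** n   # independent crossings
    print(f"after {n:>4} crossings: P(still fine) = {survive_all:.3f}")

# The model is silent on *when* or *how* the outcome arrives -- only that
# the factors in place almost assure it. That is the stochastic pattern
# at work every day in risky technologies.
```

After a few thousand crossings the survival probability has collapsed, even though any single crossing still looks overwhelmingly safe — which is exactly why "nothing happened" is such a misleading signal.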

There are variations on that theme and I'm sure others here can add to our knowledge.

You ask then, "How can someone be that thick?"

Easy. They don't have to pay attention to the stochastic process because most of the time, the result is "nothing". Besides, they're probably too busy putting out the hundreds of office brushfires which occur on a daily basis to notice.

What's more, because we live in an exclusively quantitative (business-run) world, quantifying stochastic processes is next to impossible, both by definition and by fact. Thus it is exceedingly difficult to demonstrate that a series of actions/causes/events is leading somewhere. So for someone, spending a lot of money for "nothing" is like spending a lot of money on insurance. One questions what one can get away with (or cut out) after a while.

When "success" is the result (See? Nothing happened.), a new level of "normalcy" is established. When others come along who don't know the history of why something is the way it is, they go about making people more senior to them happy by finding ways to cut budgets, and when the result is that nothing happens, their work is justified.

By the time something does happen, the "patterns which connect" have gone.

That's why story-telling and looking at the patterns is so important, but is entirely undervalued in business and indeed in daily life. Data by itself can't predict anything. But "story" (the interpretive gesture) can.

This is simplified of course and bits may perhaps be open to mild challenge, but the essential processes are there and they're true. The NASA report on Columbia is already confirming it as did the Challenger accident.

That's how people making such decisions "get thick".

And in that apparent "thickness", seemingly very intelligent and dedicated people who are doing their level best for the corporation, do these things. Nobody "cuts" corners. They just find less expensive and therefore quantifiably justified ways to do things.

It's akin to, though not the same as, James Reason's Swiss cheese model.

It's all in the literature of risk management in high technologies, and how to counter this very phenomenon. It is not easy by any stretch, especially these days when organizations are claiming they can hardly make ends meet.

mmmmmm.... It seems, at times, that those who claim nothing will ever change until... ... may be correct. Quantifying the un-happened is a pretty un-doable task.

A lady and her daughter were killed near here yesterday at a railway crossing... Likely, that railway crossing will now get the barrier bars installed. But for a long time it wasn't a problem... right up until it was.

If I understand correctly, that was a stochastic result.

So risk management has the perceived risks weighed against the costs of eliminating those risks, and somewhere, a man pays the price of error by losing his wife and daughter. Yet we'd surely all agree that some risks are so minimal that they don't warrant huge expense to counter... (I would argue all railway crossings ought to have barriers.) Where do we draw those lines? Pretty tough decisions.
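The line-drawing question above is usually formalised as an expected-cost comparison. A minimal sketch, with every number invented for illustration (real figures would come from accident statistics and whatever "statistical value" an agency assigns to a life):

```python
# Expected-cost weighing for a hypothetical crossing upgrade.
# All figures below are assumptions for illustration only.

barrier_cost = 250_000           # assumed one-time cost of installing barriers
annual_accident_prob = 0.002     # assumed yearly chance of a fatal collision
accident_cost = 10_000_000       # assumed cost assigned per fatal accident
horizon_years = 30               # assumed service life of the installation

expected_loss = annual_accident_prob * accident_cost * horizon_years
print(f"expected loss without barriers over {horizon_years} y: ${expected_loss:,.0f}")
print(f"barrier cost: ${barrier_cost:,.0f}")
print("upgrade justified on paper" if expected_loss > barrier_cost
      else "upgrade rejected on paper")

# The arithmetic is trivial. The hard part -- and the point of this thread --
# is that the probability input is a guess built on "nothing has happened yet",
# and the human cost resists being quantified at all.
```

Shrink the assumed probability a little and the same arithmetic rejects the barriers, which is how "fine-tuning the odds" happens without anyone ever deciding to cut a corner.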

In aircraft maintenance, as in flying, the risks are huge, but so are the costs of "unnecessary" insurance. The after the fact changes - like installing the barriers - in this industry are often more expensive though, and have come at a price none would have been willing to pay, had they been able to see. The risks too often seem to be underestimated.

Thanks for the links, Don... I'll put that book on my wanted list... I've read "Fine Tuning the Odds" (thanks to you) before... but maybe I'll read it again anyway....

Cheers,

Mitch


Thanks for that Don... I tried a response of sorts, but it's apparently been swallowed by the great gods of internet ether... I'm off to work, but I'll read this again tomorrow...

I'll add "The Fallible Engineer" to my books wanted list.

Cheers,


Ah CARAC. Opportunity lost, and lost again.

This is a process controlled by insiders and those whose corporate memory exceeds that of the current Transport Canada staffers. It's like watching a game of Euchre some days.

To say this process is corrupt would be wrong. Corrupt is predictable. This process is naïve. These are people who genuinely think they are right. As a result they err with pride and authority. Quel mess.

That is the reason why folks like Msr. Jenner can say what he does without being run out of the room. BTW he took the same approach to flight crew duty times. I believe his words then had something to do with "were there enough accidents", or something to that effect.

Lobbyists do what they are paid to do. They don't complain about fatigue or they are gone. They don't understand, or particularly like, those whose lives they romanticise as spoiled, pampered and idle. They fence with civil servants whose idea of job security is tied to how long they can stand the environment, not whether or not the environment itself will disappear.

ALPA© and ACPA, like CALPA before them, had a chance to put the brakes on Flight and Duty Time. What happened? Two things, both of which you will recognise.

First, the group fissured along contract and safety lines. Basically, insisting on full rest as per NASA would have equated to more days and would have destroyed the commuters overnight. While the safety folks knew what was best, the industrial issues clouded the message. A clouded message in Tower C is no message at all. Next memo.

Second, and related to the first, ATAC, AQTA and NATA saw their opportunity. They intervened above the Working Group, above the Technical Committee, over the Chief's head and put a knife in the ribs of the entire process. The Director of the day issued an edict and a moratorium on change at the same time, et voilà.

Civil servants have a very low pain threshold when it comes to conflict. The employee groups (CUPE, ACPA, ALPA©) only engage when their members lose tolerance, so are already excited when the discussions start. This has to be done calmly and with focus.

Is it safety you want? Fair enough, go for safety. Set limits that are not to be negotiated away for more money, or lifestyle improvements for those senior enough to avoid reserve. Put an irrefutable safety case together and you will win, but it will cost your members money. In the end, will they accept that outcome?

I doubt we will see that day.

Only my opinion

Vs

Archived

This topic is now archived and is closed to further replies.


