Bill, thank you for sharing your experience. It highlights two well-known hazards that lead to injuries and fatalities: driving and confined space entry. It also triggered an opportunity for me to share some best practices from my experience working in the oil and gas industry.
The oil and gas industry, through its industry association the IOGP, has developed a set of Life-Saving Rules to protect workers from injury and save lives. More details can be found by clicking on this link. While these rules were developed for the oil and gas industry, in my view they relate equally to hazards found on civil engineering job sites. They are quite simple to follow and can be impactful in protecting lives and ensuring everyone goes home to their loved ones at the end of their shift.
The rules include: 1) Bypassing safety controls, 2) Confined space, 3) Driving, 4) Energy isolation, 5) Hot work, 6) Line of fire, 7) Safe mechanical lifting, 8) Work authorization, and 9) Working at height.
"The culture has always been the major factor in determining how close we get to losing someone on the job. That lands squarely in the lap of management."
"How do we change, create, manipulate the ingrained culture (mindset) before major accidents?"
Q. What do you believe is the unique source for this "ingrained culture"?
Culture is simply "the way we do things around here," i.e., not what we preach at people, write in manuals, and post in construction trailers:
"Safety is Job #1!"
People work within the system their executive management uniquely controls. As Dr. Deming would say, "Your management system reliably causes the behaviors you see." Until executive management accepts the need to change what they do and how they do it, which is what leads almost all employees to see safety policies and procedures as optional, so-called "Acts of God" will remain their most frequent explanation.
A relatively common top-management ploy is tying financial rewards to work crews' progress in "beating the schedule." Just check the literature to validate this comment.
This is a cautionary tale about avoiding the terrible consequences of making assumptions about the unknown and not confirming them as things develop.
In the 1970s, vast numbers of tubewells were dug in Bangladesh as part of international humanitarian and development projects. The tubewells were to provide safe potable water and reduce the terrible (~1M/yr) death toll from contaminated drinking water. They were also to secure a reliable source of irrigation water that would allow additional, dry-season crops to be grown in the impoverished and famine-ridden country. The relatively shallow groundwater pumped during the dry season would be replenished during the rainy season. As there were few indications of problems with groundwater quality, this seemed to be a wonderful idea.
There is arsenic in the aquifer, and the groundwater poisoned large numbers of people and contaminated vast acreages of farmland in an already desperately poor country. The assumption of good groundwater quality did not hold up, and this was not discovered until the damage was done. The chemistry of how the arsenic moved from the soil to the groundwater is still not well understood, and it may or may not have been triggered, in whole or in part, by the pumping. The lack of monitoring and data gathering after the project was implemented likely delayed discovery of the scale of the problem.
In the mid-1980s, just prior to the broader public realization of the problem, I was in Bangladesh and learned of the tubewell scheme. Water quantity planning was, and is, my area of expertise. From that standpoint, the idea seemed quite good to me. It did not occur to me that others might not have checked out the quality issues. No red flags were raised. Fortunately for me, I was not involved in the tubewell projects.
When the news of the contamination came out, I realized that, had I been engaged in the tubewell project, I might have (or might not have, since the responsibility would have been clear) made the same mistake. It was a “There but for the grace of God go I” moment.
I have always taken pride in thinking about the larger context of the problems I am trying to solve. I tell my employees, students, and mentees to define a problem two sizes larger than their intuition says is necessary and then to discard whatever is extraneous. Doing so is second nature to me. Except when it isn’t.
The horrible consequences of the tubewells continue to remind me that I have to be on guard for the "when it isn't" times, and to reconsider any solution I might propose in that two-sizes-larger context before I commit to it. I was fortunate to learn that lesson from something other than my own professional mistakes.
"First, giving instructions in such a manner that there is no possibility of misunderstanding what is wanted, and second, checking up those orders to see that they are exactly followed."
It seems along the way we have promoted far too many folks into "Executive Row" who see themselves being served vs. serving.
Stay healthy!
Cheers,
Bill
- "Personal Leadership in Industry," by David R. Craig and W. W. Charters, 1925. McGraw-Hill Book Company, Inc., New York and London. Chapter 5, "The Protection of Quality Standards," pages 75-76, "Select the Right Man for the Job."
"While an SMS provides the mechanisms for an organization to perform its operational functions in a framework of safety risk-based decision making, a QMS ensures that this framework is operating in a structured, repeatable fashion and can meet its intended objectives. When it can't, it provides the means to take action to improve. They both must be planned and managed, depend on measurement and monitoring, involve a multifunctional approach, and strive for continuous improvement. Thus, QMS and SMS processes can be highly complementary and will support the achievement of the overall organizational goals without compromising safety."