SafeTalk with SafeStart

S12Ep2 Success Factors White Paper: Systems and Data

December 18, 2023 SafeStart

Have you ever had a near miss at work and felt a surge of relief, only to be plagued by a fear of blame or embarrassment? This is episode 4 of 6 in our series on the Safety Climate Success Factors, and it demystifies the importance of reporting and learning from close calls.


Host: Tim Page-Bottorff
Guest: Chris Ross

https://safestart.com/file/idclip/

Tim Page-Bottorff:

Hey, welcome back to SafeTalk with SafeStart. I'm Tim Page-Bottorff, and today we're going to continue to look at SafeStart's six success factors. You'll find the white paper from Dr. Pandora Brice and our fellow senior consultant, Peter Batrowny, in the show notes. But specifically, today we're going to take a look into Systems and Data. Joining me to help unpack all of this is fellow senior consultant Mr. Chris Ross. Chris, welcome back to the podcast, buddy.

Chris Ross:

Hey, thanks for having me, Tim. Great to catch up.

Tim Page-Bottorff:

Again, I placed the white paper in the show notes, but the reason I asked Chris here today is to take a deeper dive into the Systems and Data portion of that paper and some of the other key elements related to this climate success factor. Chris, this portion really focuses on how to reinforce both the individual learning loop and the organizational learning loop: by learning from reporting, usually through discussion, team input, and analysis, and by integrating human factors concepts and measurement into the systems. You want to expand on that?

Chris Ross:

Yeah, I'm excited to, Tim. There's a lot of support in the literature for the importance of systems in preventing the unintentional errors that can lead to injuries and/or subpar business performance, because we know errors and mistakes are a big deal, and just a natural part of things. A human factors approach acknowledges that not all mistakes are intentional deviations from standards. Some are just based on how our brain works.

Chris Ross:

The natural neurological functions in humans can sometimes reduce our awareness of risk under pressure, like rushing and frustration. Our brains genuinely don't remember rules or processes in a moment of high risk, so there are additional internal factors in play as well. Neuroscience research has confirmed, unfortunately, that human memory, and especially explicit memory, which relies on storage and retrieval mechanisms in the medial temporal lobe of the brain (I love the geeky part of this), is complex and frequently unreliable. The nature of our brain underlies the ongoing importance of not only having systems, but also preparing workers with tools for managing high-risk situations and especially upset conditions. We also need to provide opportunities for workers to evaluate their own mistakes, which is so critical, especially those caused by their own human factors.

Tim Page-Bottorff:

Yeah, that's right. I was just having a conversation with Larry, that's Larry Wilson, for our listeners. I don't know about you, but I've got Larry speaking into my brain quite often; I can actually hear him. Based on what you just said, the human brain is not a "fail-safe device." I'll end that with air quotes, if you guys can see me, but you can't. To counter that, though, and I've heard Larry talk about this quite frequently too, the aviation industry is a perfect example of using operational checklists, or systems, if you will, and, even more importantly, of realizing that there needs to be a culture in which crew members actually use the checklists. It's a norm, something they do in everyday situations, not just when there's an emergency. They've also developed crew resource management, which is a very interesting extension of what you just said.

Chris Ross:

Yeah, it's interesting you bring up aviation, Tim. That industry has done a lot to limit the effects of human error on performance: checklists, creating a no-blame environment, developing better crew resource management, all designed to address human factors and safety. And it's not just aviation; NASA has done a lot as well. It's not the checklist alone that produces performance reliability. Rather, it's the recognition amongst pilots that they're fallible, that they can make mistakes, and we know we've made millions of mistakes in our lifetime. It's the commitment not to operate from memory, because human memory, as we've just talked about, is not reliable. Most importantly, it's having a system of two people cooperating in working through and cross-checking each critical task, so the checklist is used with thought and not just pencil-whipped. And man, have we ever seen pencil-whipped checklists before, right? So another really successful strategy is no fault, or no blame.

Chris Ross:

There's a database called the Aviation Safety Reporting System, set up by the FAA and administered by NASA. It's completely anonymous and non-punitive, allowing aviators to report unintentional safety incidents without repercussion. Report intake has grown tremendously; in 2022, there were 8,000 reports a month. Man, that's the ultimate no-blame environment we all hope to achieve. The aviation industry also gives the rest of us a great model called crew resource management, a three-step process that teaches five factors: communication, situational awareness, decision-making, teamwork, and the barriers that compromise safety, especially as they relate to human factors.

Tim Page-Bottorff:

You said something that made me laugh about pencil-whipped checklists and inspections, and there's one common human factor that drives most pencil-whipped inspections: that complacency component. So I'm going to relate this to what you just said. I learned from a really wise man a long, long time ago, when I first got my boat. In regards to towing that boat, he actually said to me, and this is another quote, "do a complete walk-around like a pilot would before takeoff, and make sure you check everything." Now, the framework, as it was designed, has two loops: an organizational learning loop and an individual learning loop. And on the organizational side, we make a distinction between technical systems and people systems, sort of like I did with the boat. Can you explain that a little better?

Chris Ross:

Yeah, absolutely. You know, technical systems tend to be pretty well supported and documented. Everybody gets it. Typically, there's both investment and attention: technical systems for engineering, work processes, equipment maintenance, safety management systems. They're tracked, they're measured, they're well developed, they're mature, they're followed. On the other hand, while people systems are a large part of any organization, there may not be as much awareness of what these systems are, how to improve them, or of the potential unintentional human factors impacts on individuals. They can include practices for how people and teams work together, how information is communicated, systems for supervisors interacting with their people, feedback, culture and climate elements, psychological safety, and much more. Those are all really critical components of people systems, and in many cases they are nowhere near as robust or as measured as the technical systems are.

Tim Page-Bottorff:

Yeah, you're absolutely right about that. I spent some time with our research team on psychological safety, and that's a feeling. If you want to talk about feelings, Amy Edmondson described it as the feeling that you can speak up in your work environment without fear. With regard to the research team, we've also found several studies showing that safety climate and, of course, organizational culture are heavily dependent on having effective people systems. And it seems that when you're talking about people systems, there's actually a chance it could turn into what you referred to before: the name, blame, and shame game.

Chris Ross:

Yeah, that's why this no-blame mindset is so critical, and the paper addresses that along with our approach to climate success factors. A global business developed a program to reduce serious injuries and fatalities, SIFs, right? And, as we know, and it's shocking to me, SIFs are on the rise; we're on a 10-year increase in serious injuries and fatalities. A basic premise of the program was to encourage everybody to identify high-value near misses, or pSIFs, potential SIFs, investigate them, and determine root causes and precursors. Sounds great. It was a requirement of the program to inform senior management within eight hours of a high-potential near miss. When the program started, as the high-potential near misses were reported, an executive leader would call the local leadership and scold them for the incidents, even when no one was hurt. This blame climate, as you can guess, immediately reduced the reporting of high-potential near misses, thereby eliminating learning opportunities, all those free-for-the-looking opportunities. Leadership only heard when people got hurt. Then a new leader assumed responsibility and the climate changed. The high-potential near misses were celebrated, and leaders called them, quote-unquote, golden nuggets: learning opportunities. That climate led to more high-value near misses being fed into the organizational learning loop and gave the organization meaningful insights regarding those precursors, and the outcome was a dramatic reduction in serious injuries and fatalities. So that no-blame mindset was just really critical.

Chris Ross:

As humans, we all make mistakes, and I love it. In SafeStart now, we've got a little exercise where we ask people to identify how many mistakes they've made. We've made billions. If we can learn from those mistakes below the line, where it's free for the looking, we get such valuable information. One of the most important parts is that, statistically, very few near misses, or recordables for that matter, actually have SIF potential. So the whole concept of near-miss reporting, or discussions, can start to have a lot of positive impact in the individual learning loop.

Tim Page-Bottorff:

Great point. I will suggest that the more reporting you get, the more results it might produce. So don't be scared of the potential for first aids, and don't be scared of the potential for further injury reporting, because the more reporting you get, the more information you get. So that's an incredible point. This whole concept of normalizing the discussion around human factors, and even mistakes, as a way to get better is exactly what the aviation industry, as we were talking about before, has found. And I think a lot of our customers are using that as well. So, Chris, earlier you mentioned how reporting can affect the individual learning loop. Can you expand on that?

Chris Ross:

Well, sure, Tim. For years organizations have tried really hard to increase near-miss reporting. We see that in almost every place we go, with varying degrees of success. I've been advocating what I call dual-path near-miss reporting for years. For the SIF-potential incidents and the near misses that require an organizational fix, let's report and investigate; let's put those into the organizational learning loop. But for those thousands of little bobbles and errors that only take place in the self area, let's just discuss them to access those lessons for ourselves that are free for the looking. That value lives purely in the individual learning loop. We're all familiar with some of the myths we grew up with in the safety profession. One of them was some flawed conclusions by Heinrich, who proposed the relationship of 1 to 29 to 300: for every major injury, 29 minor injuries and 300 no-injury incidents. He drew the conclusion that by reducing the number of minor incidents, companies would see a correlating fall in major incidents. The whole theory was that if you reduced the number of minor injuries and near misses, you would automatically reduce the more serious major injuries and fatalities.
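
As a quick back-of-the-envelope illustration of the proportions Heinrich's ratio claims (a worked sketch of the arithmetic, not a figure from the white paper):

```latex
% Heinrich's 1:29:300 triangle, expressed as shares of all 330 incidents
\frac{1}{330}\approx 0.3\%\ \text{major},\qquad
\frac{29}{330}\approx 8.8\%\ \text{minor},\qquad
\frac{300}{330}\approx 90.9\%\ \text{no-injury near misses}
```

The disputed part, as the discussion below notes, is not the shape of this distribution but the assumed causal link between the layers.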

Tim Page-Bottorff:

The correlation conclusion, which, of course, you and I know has been widely disproven in many contemporary studies regarding SIFs and the prevention of SIFs, with data showing a lack of correlation between serious events and minor events. The most widely reported data actually indicates that around 21% of recordables have SIF potential, and presumably the share among very minor near misses would be much smaller. Unfortunately, the number of SIF events has been flatlined for the past decade. So, Chris, what do you think?

Chris Ross:

Well, yeah, so we've debunked one part of Heinrich, which is the correlation factor. What I do like about the triangle, and what it does accurately illustrate, is that there are more events at the bottom than there are at the top, and that's why I like it as an illustrative viewpoint. And all those near misses, close calls, and at-risk behaviors that are free for the looking give us opportunities to learn from our own mistakes, not necessarily learning from others, but learning from ourselves. And, interestingly, we can only access those if we talk about them out loud and share them. Now, there are a lot of barriers in the way of workers reporting near misses. It could be a cumbersome form or process. Oftentimes there's a fear of blame or, at the least, embarrassment, unwanted attention, and questions. Oftentimes there's no perceived value, or it's not worth it. Confusing guidelines, reluctance to participate, and many more. It all ties back to these human factors.

Chris Ross:

But one of the biggest barriers is that we simply forget. I don't know if you're anything like me and our listeners, but we've probably all blown through the yellow light when it was red before we made it all the way through. Fortunately, no oncoming traffic, so we go whew, and then we forget about it. Or a worker trips over some debris and almost falls. In most cases, people first check to see if anybody saw them, right? Did the cop see me? Did anybody see me? And if they didn't, it just vanishes. And that's the funny thing about this: we've had so many near misses we don't even think about them anymore. That bottom part of the pyramid represents an incredible treasure trove for us to practice getting better at our habits, but only if we talk about them.

Tim Page-Bottorff:

That's the key: to talk about them. And when you think about the number of near misses and close calls we've all had in our lifetime, I often joke at the workshops that you may not know the number. You could easily say it's 100,000 or a million, but really the answer should be "I just can't count that high." So we don't know the number; it could be thousands, could be millions. And on a work site of, say, 200 people, it's easy to imagine that there must be thousands of these little mishaps every year. And if there are that many at work, there must be equal or greater numbers at home and even on the road.

Chris Ross:

Yeah, and there's value for us to explore these and talk about them, not report them. Just discussion, only to create awareness of where we've been lucky and generate some motivation for us to improve our habits. That goes back to the CERT we're all familiar with: analyzing close calls. The value in discussing 99% of these little near misses is just for our own benefit. They are not SIF-precursor events. They don't need organizational lessons learned; there are no engineering fixes. They're just lessons for us to learn. That feeds our internal learning loop, and we get better at our own habits and make fewer mistakes. But again, only if we talk about them.
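
To make the dual-path idea concrete, here is a minimal sketch in Python of how a near-miss intake might route events into the two loops. The fields and routing labels are hypothetical illustrations, not part of SafeStart's methodology:

```python
from dataclasses import dataclass

@dataclass
class NearMiss:
    description: str
    sif_potential: bool   # could this event plausibly have caused a serious injury or fatality?
    needs_org_fix: bool   # does it point to an engineering or system change?

def route(event: NearMiss) -> str:
    """Dual-path triage: SIF-potential events and system issues feed the
    organizational learning loop; the little personal bobbles are simply
    discussed, feeding the individual learning loop."""
    if event.sif_potential or event.needs_org_fix:
        return "report and investigate"   # organizational learning loop
    return "discuss only"                 # individual learning loop

# A trip over debris with no SIF potential gets talked about, not written up
print(route(NearMiss("tripped over debris, caught myself", False, False)))
```

The point of the split is that the discuss-only path stays lightweight, so the fear-of-blame and cumbersome-form barriers Chris mentions never get a chance to suppress it.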

Tim Page-Bottorff:

Yeah, a lot of our clients are actually embracing this tactic as well. Just recently, in a public workshop, I heard many of our customers say that they're using SafeStart as their climate tool, or even as some level of cultural reinforcement. Have you seen this?

Chris Ross:

Yeah, absolutely. And, you know, in SafeStart, as I said, that was one of the biggest CERTs, right? Just analyzing close calls. At SafeStart we define culture as the way things are around here, how things happen around here. It can be characterized as the personality of an organization: a complex blend of interconnected factors, one of which is safety. And culture can be really slow to change and evolve. Climate, on the other hand, is the way we think about safety, or any other organizational element, at a certain point in time. Or, as I like to say, climate is the way things are around here today, or right now. It's a snapshot of how important employees think it is to act safely while performing their jobs, especially when nobody's looking, right? And just like the mood or the weather, safety climate can change in a snap. That's what I like so much about both SafeStart and SafeLead: they provide us with such effective climate tools for helping unlock what people do.

Tim Page-Bottorff:

Yeah, I appreciate that comment. I've actually made statements through SafeLead, very specifically on the identification of climate, that if you want the climate to change for better long-term culture, you need people to actually get up and go to the thermostat to change it. So that change agent is extremely important, and it can be better used, or focused, if you shift your focus to human factors. These tools you're talking about can help address our customers' concerns for safety, or whatever might be top of mind for them, production or quality, or even things outside of safety, which we've been known to call performance errors. We give rich detail in a very usable lexicon, a language that helps demystify human performance.

Chris Ross:

Yeah, Tim, that language, or lexicon, is so important. It gives everybody a common language, and I think, ultimately, SafeStart works for organizations because it works for workers. It works for people, be it communicating safety to one another or the portability of SafeStart, as you mentioned earlier, at home and on the road, not to mention employees taking it home. I was just at a site that did SafeStart a couple of years ago, and one of the senior leaders said, "That was the most impactful thing I ever did. I still use SafeStart with my now almost-18-year-old son almost every day." That was powerful for me.

Tim Page-Bottorff:

Well, that is powerful. I'm glad you brought that example up, because for the young folks that are out there, and I don't know if it's age, Chris, or not, but I'm starting to see more and more of them, it feels like they're much, much younger now.

Chris Ross:

They're getting younger.

Tim Page-Bottorff:

Well, I won't generalize about a generation. What I will say, though, is that we've got some folks that, like you and I, went to the school of hard knocks and learned the hard way. So, Chris, in the time we've got left, which is very short, let's talk a little bit about data.

Chris Ross:

Sure. There are a lot of aspects to data collection. Of course, the easiest data to collect is lagging data, and there's a lot of it: safety, quality, production. In fact, most organizational systems have a heavy dose of measurement, and rightly so. I guess the real trick is determining which leading indicator data is going to help us increase performance. I think it's a bit of a juggling act to balance data quantity versus quality. That's a challenge for all of us in the profession.

Tim Page-Bottorff:

Isn't it quality versus quantity? Because sometimes we just want more. I was going to say, if we knew exactly which leading indicators would predict, or even prevent, injuries, we'd all be rich, don't you think?

Chris Ross:

Heck, yes. While we know there's not a correlation between near misses and serious events, back to Heinrich, there is a great deal of correlation between mistakes and human factors. Best-in-class organizations are doing a better job of understanding the true SIF precursors to focus on, enhancing the organizational learning loop while allowing the non-SIF little mistakes to fuel the individual learning loop. And while we don't want to report every one of those little near misses, we do want to talk about them, because we want to start making it normal to talk about our mistakes, and normal to talk about the impact of our own human factors on our personal performance.

Tim Page-Bottorff:

I agree, and that gets right back to climate, so I'm picking up on what you said earlier. I encourage everyone to look for improvements in all traditional KPIs, that's key performance indicators, and if a process involves humans, you should see improvements, whether it's increases in production or quality, or even decreases in scrap rates, just to name a few.

Chris Ross:

Oh, gosh, yeah. And, as you know, some of our most successful SafeStart clients started tracking the correlation of human factors with scrap and rework, and it was a startlingly high correlation. In almost every organization I've been in, they'll agree that there's something like a 90 to 95% correlation between quality issues and performance errors. So once they start using the critical error reduction techniques, or CERTs, to address these errors, they get incredible results in reducing scrap and rework. We know our tools work really well.
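
For anyone who wants to try that kind of tracking, here is a minimal sketch of the calculation in Python. The weekly figures are invented for illustration and the variable names are hypothetical; this is not a SafeStart tool:

```python
from statistics import correlation  # available in Python 3.10+

# Hypothetical weekly counts of errors tagged with a human factor
# (rushing, frustration, fatigue, complacency) alongside scrap rate (%)
human_factor_errors = [12, 19, 7, 15, 22, 9, 17, 11]
scrap_rate_pct = [2.1, 3.4, 1.2, 2.6, 3.9, 1.5, 3.0, 1.9]

# Pearson correlation between the two weekly series
r = correlation(human_factor_errors, scrap_rate_pct)
print(f"Pearson r = {r:.2f}")  # values near 1.0 suggest a strong linear link
```

Correlation on its own doesn't prove the human factors caused the scrap, but a consistently high r across sites is exactly the kind of leading-indicator signal being described here.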

Tim Page-Bottorff:

Yeah, yes, they do. Any other leading indicator data that you want to talk about that might be helpful to track?

Chris Ross:

Well, one of the interesting things that's come out in SIF prevention is called verification-of-control audits: taking a deep dive into the control methods that are in place for tasks.

Chris Ross:

As I learned about this a couple of years ago and started tracking it in my own SIF investigations, a surprising number of SIF events occur when only the lowest levels on the hierarchy of controls are in place, oftentimes just administrative controls or PPE. That means we're relying on individuals not to have a bad day, not to have sleep interruptions or a fight with their kids or whatever it may be. Other worthwhile elements to track would be safety conversations, Gemba-walk recommendations, efforts to drive employee engagement, and measuring the climate of open communication, all really worthwhile, although that can be very difficult data to track. I think it's important for organizations to evaluate their data collection on a regular basis and ask: is this giving us the information we need to get better? Sometimes it's hard to give up those legacy programs that no longer provide value, right? Those golden elephants.
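
As a rough sketch of how a verification-of-control tally might be kept, here is a short Python example that counts SIF events by the highest level of control that was in place. The hierarchy categories are the standard hierarchy of controls; the event data is invented for illustration:

```python
from collections import Counter

# Standard hierarchy of controls, strongest to weakest
HIERARCHY = ["elimination", "substitution", "engineering", "administrative", "ppe"]

# Hypothetical SIF investigations: the highest control level in place per event
sif_events = ["ppe", "administrative", "ppe", "engineering", "ppe", "administrative"]

counts = Counter(sif_events)
for level in HIERARCHY:
    print(f"{level:>14}: {counts[level]}")  # Counter returns 0 for absent levels
```

A pile-up at the administrative and PPE end of the tally is the warning sign described above: the system is leaning on people having a good day rather than on hard controls.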

Tim Page-Bottorff:

Yeah, which a lot of people have a tendency to hold on to. And that's a great way to end on data. All of those things, Gemba walks, safety conversations, are very valuable, and that's another thing about just getting out of the office: you've got to get out on the floor, you've got to get around people, and you've got to develop those soft skills to have those conversations. So very, very good. Thank you so much, Chris. It was a pleasure to have your insights on the podcast today.

Chris Ross:

Hey, thank you for having me, Tim. I appreciate it, you bet.

Tim Page-Bottorff:

So, on behalf of Chris and the entire team at SafeTalk with SafeStart, we say thank you for sharing some of your time with us. Remember, please check out the entire white paper in the show notes. Until our paths cross again, I'm Tim Page-Bottorff for SafeTalk with SafeStart. We'll see you down the road.
