Write Medicine

We’re Going to Make Mistakes. What’s Your Contingency Plan?

April 27, 2021 Alexandra Howson PhD Season 1 Episode 7

Summary
Steve Powell DHA is the CEO and Founder of Synensys Global and is a recognized leader in performance improvement. He has led programs in the US Navy, commercial airline industry, and the healthcare industry for more than 30 years and is passionate about patient safety, quality control, and patient-centered care improvements.

Steve shares what he learned about safety when he was a Navy pilot and how those experiences cross over into the medical industry. He also shares his thoughts on what makes a team successful when it comes to patient handoffs, and the five key principles of a high-reliability organization.

Resources
Institute of Medicine
TeamSTEPPS
Kohn LT, Corrigan JM, Donaldson MS, eds. To Err Is Human: Building a Safer Health System. Washington, DC: Committee on Quality of Health Care in America, Institute of Medicine; National Academy Press; 1999.
Nash D, Beliveau ME. Two lessons hospitals can learn from their COVID response. MedPage Today. Dec 7, 2020.

Connect with Steve: Synensysglobal.com + LinkedIn
Connect with Alex: Thistleeditorial.com + LinkedIn

Podcast Team
Host: Alexandra Howson PhD
Sound Engineer: Suzen Marie
Shownotes: Anna Codina

 

Steve Powell DHA

Tue, 4/27 1:52PM • 50:20

SUMMARY KEYWORDS

patient, team, safety, patient safety, healthcare, teamwork, communication, people, nurse, organization, education, high reliability organizations, pilots, aviation, work, skills, practice, aircraft, called

 

00:04

Alex
Welcome to Write Medicine, a podcast that explores the minds, motivations, and practices of people who create content that connects with and educates healthcare professionals. I'm your host, Alex Howson, a former nurse, a medical sociologist, and an education writer and researcher in healthcare. Join me to learn from education professionals about resources and tools of the trade, and listen to stories about what drives them in the medical education field. If your work involves planning, designing, or delivering education to healthcare professionals, this podcast is for you.

 

Alex
00:50

Teamwork, patient safety, preventable harm, systems approaches to health care delivery. These are terms that are familiar to healthcare professionals, risk management and patient safety experts, and educators in the continuing education field. Today, my guest is Steve Powell DHA, CEO and founder of Synensys Global, and a recognized leader in patient safety management and performance improvement in health care. Steve has led programs in the US Military Health System, the Veterans Administration, and the Centers for Medicare and Medicaid Services for more than 30 years. He's passionate about patient safety, quality control, and patient-centered care improvements. Today, Steve shares what he learned about safety when he was a US Navy pilot and how he's drawn on those experiences as a leader in the healthcare industry. He shares his expertise on what makes a team successful when it comes to patient handoffs and talks about key principles associated with a high reliability organization. I'm your host, Alex Howson. Welcome to Write Medicine.

 

Alex
02:06

Hello, and welcome to the Write Medicine podcast. This is your host, Alex Howson, and I'm here today with Steve Powell, the chief executive officer of Synensys, a safety management and performance improvement organization that uses a systems approach to address safety and quality in health care. Welcome, Steve.

 

Steve
02:29

Hi, Alex. Glad to be here.

 

Alex
02:31

It's great to talk to you today. You and I have had many conversations, and I always really enjoy spending time with you. So I'm excited to learn a bit more about the work that you're doing today, and I think listeners of the podcast are going to be really excited about some of the things that you have to say. Let's just start with who you are. Maybe you could talk a little bit about how you found your way into safety management and performance improvement in health care.

 

Steve
02:59

Great. Well, thank you. Thanks again for having me; I'm really excited about being here. My journey in safety started, you know, over 30 years ago. I was in the Navy, my background was as a pilot flying aircraft, and through that experience I was sent to safety school for aviation safety. That was kind of the beginning of my journey. From there, I left the military, started working for a commercial airline, and transferred those safety skills into the commercial airline space. And then ultimately, I found myself being able to work in the health care patient safety space, bringing a lot of those same concepts that we have from aviation, and really that training and education environment, to actual healthcare organizations, especially hospitals. And that's probably not unusual, in the sense that cross-industry, high-risk-industry models of safety and quality have evolved from other high-risk industries like aviation, also nuclear power, and even the special forces from the military as well.

 

Alex
04:12

So that kind of high-risk environment is certainly something we probably recognize now. But at the time you made that shift from aviation into health care, that was probably a pretty novel concept, was it not? I mean, you must have been right at the beginning.

 

Steve
04:28

Yeah, we really were. I actually had a personal experience of patient safety in the care of my father, who suffered a serious safety event in the care of his caregivers. That event was sort of an awakening for me about the need for this approach to safety and quality, involving high reliability principles that had originated in the aviation industry. And that sort of catapulted me into this space. Then I started to collaborate with, first, the Military Health System, and they were really interested in thinking about how they could take some of the practices they saw on the aviation side of the military and bring them over to the healthcare side of the military. That was kind of the first nexus of actually making that leap. And you're right, I mean, it really was, you know, the Wild West at first with that approach. And that was pretty much right after the Institute of Medicine, at that time, came out with their seminal report on patient safety known as To Err Is Human, right around 2000, 2001.

 

Alex
05:46

I know that a lot of our listeners in the continuing medical education space will be familiar with To Err Is Human. But I wonder if you could talk a little bit about the two things you mentioned. One is bringing some of the safety practices from the military into healthcare. And the second is that concept of high reliability. What is that, and why should we be thinking about it in healthcare?

 

Steve
06:15

So with that leap, what To Err Is Human talked about, really the first line they brought in, was the idea of crew resource management, or its predecessor, which was called cockpit resource management. This is just CRM as we knew it in aviation. They really believed it was important that we could better understand and translate what we know to be non-technical skills. So it's not the actual flying of the aircraft; it's skills around decision making, communication, leadership, situation monitoring, those types of skills. In aviation, we found that those skills were generally the causal factors of serious accidents. And since we're dealing with human performance, it just seemed likely that some of those same factors were involved in serious patient safety events as well. So with CRM, there was a thought process that we could translate it for healthcare and fit it to purpose, if you will: take the things that were relevant from that body of research around CRM and bring them into another high-risk industry like healthcare. And along with that came the science that was developing around high reliability organizations. Weick and Sutcliffe were at the forefront of that work, along with James Reason, who was also one of the leaders in safety management research. The Weick and Sutcliffe work originated on the deck of the aircraft carrier, and the concept was: how could we take work that's so risky and make it so safe? That was the question as they approached that high reliability work. What they learned was that there were really five key principles they were able to discover from that work in high reliability. And it's those principles that became the focus of some of the high reliability work in healthcare as well.

 

Alex
08:30

And so what are some of those principles?

 

Steve
08:34

Well, one is a preoccupation with failure, which actually seems sort of counterintuitive in the safety world. But it's this idea that we're always planning for or expecting something to go wrong. So in high reliability organizations, we're really obsessed with thinking through what the contingency might be in case Plan A goes wrong: how are you going to be ready with a Plan B or a Plan C? Another one is a commitment to resilience, this ability for teams to snap back. The idea is that with human error, we're always going to make mistakes; it's a matter of how we are able to manage those mistakes or errors so we can prevent harm from reaching a patient. So those are some of those concepts, and there are several others that are very relevant to any high-risk environment where the consequences for safety are so high.

 

Alex
09:40

I love that phrase, commitment to resilience, and I also recognize that sense of having contingency plans. You know, I was a trauma operating room nurse for several years, so I recognize some of what you're talking about, in terms of always having to have this sense of: what are your workarounds going to be if X, Y, or Z doesn't actually pan out?

 

Steve
10:07

Yeah, exactly, Alex. I mean, you know, I think initially, when we started thinking about safety management in healthcare coming from the aviation background, exactly what you just said was really important. It's not just about what we know as Safety-I, which is those procedures, protocols, checklists, those hardwired types of policies and protocols. It was also what you just said; it was the adaptability. And that's what we know as Safety-II. So there's flexibility and adaptability, the ability to manage even the things that we didn't expect as much as the things that we did anticipate. That was the other side of high reliability: these organizations were actually able to improvise, and improvise in a safe manner, when needed.

 

Alex
11:01

I guess I have a couple of questions here. In the work that you do, how much flexibility and adaptability do you see in healthcare workers?

 

Steve
11:12

So I'll answer that question first, and then I'll come back to the second question.

 

Steve
11:17

So I think there's a tension between those two, and that's why I brought them up. If you think about it, we can maybe see even the word workaround as something bad, instead of a workaround potentially being something that needs to happen. The idea is that here we're really talking about the system, if you will, and some have said that every system is perfectly designed to get the outcomes you designed it for. So oftentimes, what we know as the blunt end of system development, meaning sort of the management level or the design level of a system, doesn't equate to or reach the sharp end, where care actually is delivered. From the blunt end, it looks like the system would always work that way. But at the sharp end, at the care team side or at the bedside, that actual system may not work in all cases, in all, if you will, algorithms or patient conditions. So that ability to be flexible and adaptable is important, and it was a really important principle of being a high reliability team. And so there was this concept of: when we had a healthcare team, were they really what we knew as a high-performing team, or were they just a group of people that showed up at the same time, or asynchronously, to take care of the patient?

 

Alex
12:57

Right. That's a great distinction to make. Certainly in my experience of working in trauma teams, sometimes it definitely felt like the latter: a bunch of people showing up a little bit willy-nilly. Sometimes, you know, not everybody's on the same page all the time. And that kind of leads me to my second question, which is about the team part. Can you talk a little bit about the work that you do to create that sense of everyone on the team being on the same page?

 

Steve
13:35

Yeah. So our first programming effort, and I think one of our most successful to date, within this team safety management construct in healthcare, was working with the Military Health System and the Agency for Healthcare Research and Quality. We developed a program based on all this science of teamwork, known as TeamSTEPPS, which stands for Team Strategies and Tools to Enhance Performance and Patient Safety. What TeamSTEPPS did was start with the concept of who's on the team, which I think is first and foremost, before you talk about the skills that high-performing teams are known for. So the team structure was really important. First and foremost, at the top of the, if you will, troika of the team is the patient and their family. That, in essence, is the key part; as people will say, nothing about me without me. So if you take that construct, then there's a core team, which is the team that actually delivers direct patient care, and from there, there are other teams that play into the care delivered by that core team. Knowing where you sit on the team, and what role and responsibility you have on the team, is absolutely crucial to knowing when you need to lead and when you need to support. So that was really critical. And then from there, the training that came from that understanding of where I sit on the team and what my role is focused on the skills and tools I need to apply, based on conditions or safety situations or situations with patients, that would help to improve safety, and to improve collaboration, coordination, and most of all communication. We knew going into this that communication errors really mattered: when you had sentinel events, usually 70% of those sentinel events had a root cause of a communication failure.

So again, communication was really important to the team, as well as leadership and situation monitoring. A final learnable, teachable skill in TeamSTEPPS is known as mutual support, which is that backup behavior we need each and every day to help us when we can't see the things that others on the team can see, or, if you will, to have our back, especially in high-risk, high-consequence situations, or even high-stress situations with patients.

 

Alex
16:30

Yeah, that having-your-back piece is key. I do want to ask about your process, about how you actually work with teams. But before that, I just wanted to flag up that some of the work I do involves interviewing physicians and nurses and other health care workers, as you know. And it's often pretty sad to me to hear that a lot of people don't feel like they're actually part of a team. Is that something you see, I guess, especially when you're first working with an organization?

 

Steve
17:02

You know, the way we're able to see that, without actually even seeing the team, is by analyzing the culture of safety data. Usually there's survey data around safety culture in organizations, and that's a really big tip or cue to us as to whether teams are struggling, because there are several dimensions of patient safety culture, including not only teamwork within units but teamwork across units, which is actually where it's even harder to be a good team: when you're working with another unit. So when you have a handoff of a patient, that's a very risky sort of play if you don't have good teams playing together. It would be like fumbling the football in a sporting event, but with much worse consequences here, with a piece of patient information that's critical to their well-being moving from one unit to another. But I agree with you: a lot of times we could have a team of doctors, maybe a team of nurses, and a team of respiratory therapists. But how do we become a high reliability team? How do we become teams of teams, basically, is what this looks at. How do we all get on the same page, which we know from our research means that you have a shared mental model, and information is shared openly across the patient care team, including the patient and their family? So I think you're right. I think that comes out also in another dimension of culture, known as communication openness. How willing and open are we to share information, regardless of potential hierarchy or seniority or professional specialty? So the teamwork we're talking about is multidisciplinary, interdisciplinary teamwork.

 

Alex
19:05

So you're looking at safety culture data. What other kinds of indicators, when you're starting to work with an organization, or maybe a unit, and maybe you can say how you break that down, are you looking at to get that baseline sense of where the organization is, in terms of thinking about and being receptive to improving teamwork?

 

Steve
19:34

Right, right. So I think there are also attitudes, teamwork attitudes. For instance, how much do you value teamwork? I take this story back from aviation: when you're trained as a pilot, the sort of capstone event early in your career is called the solo. Which is actually sort of counterproductive, in some ways, in the development of pilots. Yes, in some ways you have to be able to operate independently and know your profession. But your capstone is the solo, when in fact you rarely, except in a single-seat fighter aircraft, ever fly an aircraft by yourself. There's always other support in the cockpit, or maybe it's a wingman in a single-seat fighter aircraft, like my son flies. So when I think of that, oftentimes we've sort of set ourselves up, in some ways, to not value teamwork. So understanding what beliefs you have about teamwork is another way we can understand where we sit. Do you think teamwork is important, or do you think you're able to do all this on your own? I think another area is patient experience data, data from things like HCAHPS, because patients talk about how their physicians and nurses listen to them and communicate with them. Communication is a skill, it truly is a skill, and if you're not able to communicate well with your patients and families, it's likely that you're going to have difficulty communicating with your team members. And then I think what also fits into that same mix of data is employee engagement. If you're not engaged, or you dread going to work, it's likely that you're also not really excited about the team. Maybe you're not resilient at this point; maybe there's something else going on in the organization, from the goals of the organization to management and leadership, that is preventing you from doing your best work.

That's also a kind of indicator that an organization, in essence, has low readiness to function as high-performing teams, and that they maybe really need to improve their work climate before they start to dig into some of these tougher team skills and communication skills.

 

Alex
22:21

So once you have amassed all that baseline data, how do you start the process of actually working with organizations, figuring out whether they need to work on more readiness, or dive into really bolstering a particular team or teams? Can you talk about your process?

 

Steve
22:44

Yeah, I think it's kind of top down, bottom up. If you think about the whole of this, it really is a large organizational change, and it can start in different ways. Some organizations, like the military, decided they wanted to do this at scale. So: how do we get all of our healthcare teams on the same page and working very closely together? There's a reason maybe they have to do that more than other organizations, because of the nature of how they deploy. Their teams are rarely intact; you don't all show up and always have the same team. They're usually forming teams, so they have to come together very quickly in a combat environment and perform at the highest level possible, with all sorts of disparate skill levels. Generally, military health teams are at least 10 to 15 years less experienced, on average, than, say, a civilian health care team. So it's a much more dynamic environment, but at the same time, I think it has also shown me that these teams seem to be a little bit more amenable to trying new things and realizing their own limitations. I think once we get farther along in our careers, it might be a little harder to change some of our beliefs and some of our perceptions and the way we do work. So I think it really starts with that readiness piece, not only at the team level but at the senior leadership level, because if this is a large change management program, we'll need senior leadership to support and resource it. And then oftentimes we've seen that an organization will start the process of developing and improving the performance of their teams in their highest-risk units, like labor and delivery, the operating room, the ICU, and, as you mentioned, the emergency department. Those are great places where TeamSTEPPS has really shown some of the best impacts in changing performance and improving patient safety.

From there, it's really a traditional model of training, which involves both the didactic piece, the knowledge, and also the development of those skills. Like we did in aviation, we started incorporating simulation into our practice, which I think is very key. And then, furthering that to the application environment: none of this really works unless you practice it on the job. So it's really getting someone to help you, mentor you, coach you, and, if you will, help you see how your behaviors might be changing over time. Again, a lot of it's just like we know from sports teams: it's rehearse, rehearse, practice, practice. And the more you use these tools, like briefings, huddles, and debriefings, the more the teams start to become cohesive; they coordinate better and they collaborate. And then, as you said, the communication really starts to improve, which, again, is directly related to a reduction in these preventable patient safety events.

 

Alex
26:22

Right. So practice, rehearsal, communication, support, mentorship. How long is a typical process for you, in terms of transforming a team culture?

 

Steve
26:37

Usually it really doesn't happen much quicker than a year. It really takes a season, if you will, a full season of this sort of focused practice before a team starts to hit their stride. And like in a learning curve, potentially people see those tools, briefs or checklists, as taking more time, and you get that initial dip of the learning curve. But then all it takes is one or two times that you're able to trap or mitigate or manage an error because of those tools, and people start to see the value. The stories from those good catches come out from the teams, and it sort of creates its own little momentum from there, its own rapid-cycle improvement. The teams are constantly refining their, quote unquote, playbook as to how they're going to do teamwork, and they use the language of teams differently than they had before; they have a common language that they start to use. Before they really start to hit their stride and perform, it starts at about the six-month mark, but you really start to see it go in that hockey-stick sort of curve out at about 12 months. And that's when you really start to see changes in the culture, changes in some of those perceptions or attitudes, and even those good catches, or stories that show that teamwork is working to prevent patient harm.

 

Alex
28:18

And presumably you're tracking hard data to see those kinds of shifts in the emergence of a common language, which presumably is an indicator of that shared mental model.

 

Steve
28:31

Exactly. So let's just use one of the tools; there are 15 tools and strategies in TeamSTEPPS, developed specifically for use at any given time. One of those tools is called CUS. It stands for: I'm Concerned, I'm Uncomfortable, this is a Safety issue. That language is new language. You know, when you're frustrated in an environment, you might want to cuss, but this is a different kind of CUS. This is the CUS of being able to advocate and assert a position because you have potentially seen an early warning sign with the patient. Maybe it's an oxygen level, a blood pressure, a temperature. With those kinds of things, I may have to get the attention of my attending physician by saying, Doctor, I'm concerned. And the doctor says, well, I'm not all that concerned. So maybe I need to go to the next level: I'm uncomfortable. And eventually, maybe it's, I need to bring another consultant in and say, you know, I think if we keep down this path, we're going to end up with a safety issue. So that kind of language is different, and if you see people start to use that language, even to say, I think this is a tough situation, it really helps the team immediately see that something's changed. We need to look at the plan of care and maybe do something different than what we were doing before, because this patient is desaturating, for instance.

 

Alex
30:08

So interesting. You know, one of the things that's always struck me as, I'm not quite sure what the right word is, odd, is that from my perspective within continuing medical education, I don't often see synergistic programs or projects between continuing medical education and the kind of work that you're doing in safety management and performance improvement. And yet it seems to me that it's potentially such a fertile partnership. From your perspective, do you see much of that?

 

Steve
30:45

You know, unfortunately, Alex, we struggle for every, you know, CEU, CME, CNE credit. With TeamSTEPPS, we made sure to have continuing education credits applied to the learning. Unfortunately, there's just so much training, or education, that goes on that is compliance-driven in nature. It's usually either a regulatory requirement or a technical requirement. And while I've said these are non-technical skills, they're essential to the safety of patients. Yet, you know, they sometimes don't get the attention in that crowded education field, or on that to-do list that needs to be done every year by those frontline professionals. I will tell you, again, a lot of that's just about organizational commitment as well. So in the military, this is a refresher requirement. Annually, every single health care professional, after they've had initial training in TeamSTEPPS, is required to do a refresher of these skills. Sometimes it's done through a simple case study or simulation, where teams have the ability to debrief the case study and pick apart those areas of improvement that the teams could have or should have addressed, or even best practices that the teams exhibited: good exemplars for them to potentially apply in practice with their patients and their team.

 

Alex
32:39

You've mentioned simulations a couple of times. I'm a member of our community emergency response team, so we do simulations. What kind of simulations are you doing with TeamSTEPPS?

 

Steve
32:49

Well, with simulation, again, it could be as simple as, like we said, some sort of desktop, case-based simulation, or even a game. For instance, there's a fun game called Friday Night at the ER, where basically you're running emergency care; you would really have a lot of fun with that. That whole idea of resource management comes into play: whether or not you're going to have the right people on the team, at the right time, with the right patients, is really important. We also use story-based simulation. We think stories are rich for learning, the power of story. So we use story-based simulations where you have a story that is narrated, and from that story we can learn about a patient, about a situation that's unfolding, and how we might change that story by doing something different as it relates to our teamwork, our communication, or some practice we would change, or a protocol or a system change that could be made. Those are known as lower-fidelity simulations. Then it goes all the way to role plays with standardized patients, with the focus not just on maybe my diagnosis, but on whether I'm actually at the center of care, instead of maybe being ignored when it comes to what's going on with my care, and how important it is to communicate. Up to high-fidelity simulation, where we have mannequins, and we've used that, for instance, during an obstetric emergency where we need to move from a vaginal birth to a cesarean section. There's so much coordination that goes on between multiple teams, and that can only come from practice. You don't want the first time doing that to be for real. Just like with pilots: they rarely fly an airplane before being in a simulator to practice all of those different functions and skills, before they're let loose to take that million-dollar jet up and go flying with it.

 

Alex
35:12

Yes, that makes complete sense. I remember when I was a student nurse back in the 80s in the United Kingdom. We were supposed to be supernumerary, but of course we weren't, and there was definitely that sense of jumping into a very hot vat of oil, you know, without much support, flying solo before you're really equipped to do that. When you were talking about the emergency room there: we all did a stint in the emergency room as student nurses, and in the UK it's called accident and emergency. On really, really busy nights, the medical officer of the emergency room at the time would stand at the entrance and literally ask people if they were an accident or an emergency, because inevitably you get people showing up on a Saturday afternoon who should really be going to primary care. And that was his way of quality control, making sure that the people who ended up in casualty were actually the people who needed to be there. Anyway, I digress.

 

Steve
36:17

I digress. You know, that's exactly one of those skills that we sort of think is intuitive. They can be taught; they're learnable and teachable, like situation monitoring. Again, there are great tools to actually think about that, and one such tool we have is called STEP. STEP stands for Status: thinking of the status of your patient. The T is for the Team: how many team members do you have, and where are they assigned? Are they in kind of a fast-track area, where it's almost like primary care, as opposed to emergency care? The E is the Environment: where do you put patients when you triage them? Do you have all the equipment that you need? Is there a piece of equipment that's down, or is one of your departments running slower than normal, like the blood bank, for instance? And then finally, P, the Progress towards your goal: how quickly are you seeing those patients, and which are the ones who are delayed based on their condition? That's just a great situation monitoring tool. But think about that if you were a new nurse or a junior doctor, being able to be taught something where you could actually step through, no pun intended, how it feels to understand the situation, and then know what to do about it after you have a good understanding and it's shared amongst your team members.

 

Alex
37:53

I love that. I love that you

 

37:55

brought that little story right back to a tool that people can use. And you're right, you know, he probably was engaged in that kind of process; we just all thought it was some kind of delightful idiosyncrasy. I'm mindful of our time. I do want to wrap up by asking you about some current personal work that you're doing. You're always innovating, you're always researching. Can you talk a little bit about the research that you're doing for your thesis?

 

Steve
38:27

Sure. Well, it's exciting. It's work that I started in aviation, but we really believed that, along the lines of some of these other high-risk principles, it would be potentially applicable to the patient care setting. The concept is known as LOSA, for Line Operations Safety Audit, and it started about 15 years ago in aviation. It became a very methodical way of measuring the actions and behaviors of our cockpit crews during normal operations, just everyday operations, things that went on in a routine way, by doing direct observations of those practitioners. That concept follows what's known as a threat and error management construct, in which a threat is anything external to the team that adds increased complexity. So it's very simple: thinking about that construct in a patient care setting, the patient's condition could be a threat to the team, depending on whether we know about it, understand it, and diagnose it. Next, the construct is based on errors, which we're familiar with as part of the human condition. And then finally, there are these other states known as undesired states. Undesired states are errors that are mismanaged by the team, and they are the closest we come to, say, a near miss before that event becomes patient harm. And so we categorize the observations into these three elements: threats, errors, and undesired states. That's been very successful in understanding, say, in auditing, a group of pilots. But we wondered whether we could observe a healthcare team in the same frame, and what value that would produce for us. And it's not only the sheer number of these threats, errors, and undesired states that are happening, but which of those are going unmanaged.
Because it's a measure of how well our programming is working, our teamwork or CRM or high reliability training, to know whether, in fact, teams are catching and managing and trapping and mitigating these threats, errors, and undesired states. And so we did this; the crossover was with aeromedical evacuation teams that were transporting patients in the back of an aircraft in the military. It's very exciting work. It appears that there are a lot of similarities in these patterns of mistakes that crews make, whether they be taking care of patients or flying the airplane. And in this construct, like we talked about with teams of teams, as you can imagine, there can be things that the pilots or the cockpit crew do or don't do that could create a threat to the aeromedical evacuation team, just like something a patient ward could do, such as being slow to discharge patients, that would impact an emergency care team trying to move a patient from the ER to a bed on the ward. There's a lot of coordination that occurs between the two, and one can become a threat to the other without ever knowing that it creates a safety threat for that other team. So it's not only valuable to the team of record, or the unit of measure, which is that aeromedical evacuation team; it's also important to the system, to understand how we could improve our communication policies, our protocols, our training, our culture, if you will, by actually being able to measure these things. And now we can take these audits at various points in time, and they give us a very specific roadmap for action about how we could not only improve team performance, but actually change the system to, in fact, be more hardwired for safer care. So I'm really excited about that work.

 

Alex
42:56

Yeah, that is really interesting. You know, as you were talking about that, it was hard for me not to think about everything that's happened this year, COVID especially. It was hard not to think about that as a kind of threat. Is that off base?

 

Steve
43:14

Absolutely not, you hit the nail on the head. We saw that even in the work we were doing this year; some of our observations actually took place in COVID environments. And so not only could the patient having COVID be a threat, but also whether you have the proper isolation systems on an aircraft for a patient who is COVID positive. And then what about whether or not the rest of the crew members are following prescribed protocols around wearing their face masks or social distancing? It's not so easy inside of an aircraft. So again, as you can imagine, we could see that threat play in. And then just think of the distraction if a pilot is now wearing a mask and not used to doing that; how distracting that might be could cause them to make an error they wouldn't have made, because they've changed their whole protocol for how they actually operate the aircraft, all because of a simple thing called a face mask. It wouldn't mean anything to an OR nurse, but it means everything to a pilot.

 

44:23

Oh, right. Yes. Because it's just not part of their normal practice. It's completely new, and it's irritating, and it's getting in the way.

 

44:32

And even more, if you will, because much of communication is nonverbal, being able to see people actually speak matters. That's why I think some of the clear shielding that's going on may offer some real enhancement to communication messaging in the future.

 

Alex
44:53

Oh, that makes total sense. Just one last question, Steve. You know, you're in a position to have a kind of 360 view of what's going on in healthcare at the moment, and we're talking at the end of 2020. Looking ahead to the next 12 months, what sort of things should educators be thinking about in terms of resources and tools that healthcare workers can use to, you know, just make their work a little easier?

 

Steve
45:23

Yeah, I really think that, again, the move towards virtual learning experiences is huge. We've all seen it; we're moving in that direction more, but I think it has to be good quality learning, experiences that people can get when they need them. So developing learning that can be just in time is going to be huge. Beyond that, I think the ability for teams to come together and support each other, and really support building back some of the resilience that is in jeopardy right now, is going to be a real need. That commitment to resilience is going to be important as a high reliability principle. Otherwise, I think we really do risk more harm to patients as we continue to take withdrawals from our workforce in ways that don't allow it to recover and be resilient. And then finally, over the last 12 months, I think training probably took a backseat to the operational tempo of organizations, so how are we going to restart that education element? If I'm a nurse educator in a hospital, I've probably been told, number one, we don't have time to train, which is always a problem, but now even more so with the operational tempo. And secondly, there's the pressure on the training and education budget that's going to occur because of this. I think it's going to get us to start thinking about finding ways to, unfortunately, do more with less when it comes to the training dollars as well.

 

Alex
47:12

Yeah, 100%. I actually was speaking to a nurse administrator last week, and one of the things that she talked about was, you know, I think they ran out of IV... I don't even know what you call them, I've lost my words for that. But what she was saying was that the nurses were having to use a gravity system, which is what I trained with, but they didn't know how to use the gravity system, and there was no one to train them. And so,

 

Steve
47:37

yeah,

 

47:39

you know, so I'm sure there are lots of situations at the moment where people are having to get skilled in old skills, and the collective memory of what those skills are is probably in short supply in a lot of places,

 

47:53

right, or like we've seen also with our professional skill sets, our specialization. You know, we've become so specialized that all of a sudden we're asking our hospitalists to maybe help our internists, and so forth, because we're just in a different environment right now. I might have always been an operating room nurse, but now I need to work in the ICU. Yep. So it's really an eye opener to, you know, the kinds of different competencies that we're demanding of our workforce right now, and what that will mean for the future of education.

 

Alex
48:34

I think that's probably a good place for us to end the podcast. Thank you so much for spending time with us today and talking with the Write Medicine podcast listeners. Appreciate it.

 

Steve
48:46

Thanks, Alex. Really appreciate the opportunity. It's always great to talk to you. Take care. Thank you.

 

Alex
48:56

I loved talking to Steve, not least because my dad was a payload dispatcher in the Air Force, and some of the things Steve talked about reminded me of things my dad would talk about. Steve eloquently reminded us how, this last year, we've seen health professionals step up every single day to serve. As educators, we can build and provide the supports that healthcare teams need in order to manage the day-to-day workload in ways that prevent harm, optimize patient safety, and reduce risk. But I also loved talking with Steve because he embodies service. He began his path as a change agent in healthcare because of a negative experience that impacted his family, and he really followed it through. The narrative thread that connects personal biography and history is strong for many of us who end up working in healthcare or education. I wonder what your story is. I'm your host, Alex Howson. Thanks for listening to Write Medicine.

Introduction to Steve
Personal patient safety experience
Crew resource management
Flexibility and adaptability in healthcare
TeamSTEPPS
High reliability teams
Readiness and organizational change
Patient safety tools
CME/CPD and patient safety
Story-based simulations
Situation monitoring
High reliability research
COVID as systems-safety threat
Just in time virtual learning
Content roundup