Daniel Bailey, PhD, DABR
HCA Sarah Cannon
Daniel Bailey, PhD, DABR (00:03):
So we’re going to be talking today about managing change. You’ve already heard several fantastic speakers this morning touch on a lot of the things we’re going to cover during this session, but specifically we’ll look at implementing and standardizing the way that we roll out SGRT across multiple machines, perhaps within the same clinic or across multiple clinics. So, you know, every year that you ask me to come back, I realize it’s been one year longer that I’ve worked with SGRT, and there’s this myth somehow that experience makes you wiser. So surely, since 2013, I must really be a distinguished expert by now, right? I’m just learning as the years go on. Experience makes you older. Okay? I think that if we evaluate our experience and we actually improve and change the things we do, then perhaps I could say experience makes us wiser. And hopefully that’s what we’re trying to do together and learn in talks like this.
Daniel Bailey, PhD, DABR (01:01):
So, a little bit of context and background about my experience and what we’re drawing from during the course of the talk today. Currently, in the HCA Sarah Cannon Cancer Network, of which I am the director of physics and dosimetry, we are constantly evolving in the process of standardizing technology and workflows across the enterprise. It’s never something that will reach a final state. It’s something that we literally have to keep evaluating, adapting, implementing new processes and policies and procedures, learning from things that have happened or almost happened. If you look at even just our Aria and Eclipse implementation across the entire enterprise, the initial state of these roughly 50 clinics was, I mean, wild, wild west, right? Just the way it would be in any conglomerate of a large number of clinics: there was no standardization, it wasn’t a system that was scalable, and it lacked interconnectivity for things like remote access and cross coverage. Software and tech management almost always falls on local physicists and local IT, and very often the clinical staff are already bandwidth-maxed, so trying to manage the standardization and implementation of new technology on their own is virtually impossible. And at least in the context of information technology and data security, it’s very often the case that in a diverse and scattered group of clinics there is no formal disaster recovery whatsoever. That kind of initial state described how we used AlignRT, Aria, Eclipse, you name it. Those are just the kinds of things that we’re trying to rein in and standardize, including the way that we are standardizing and implementing surface guidance across the entire network. We’re trying to get to a place where, in a nutshell, we buy it the same, we implement it the same, we train it the same, we work with it the same, so that we are able to, in a scalable and standardized way, maximize quality, maximize efficiency, reduce errors, and so on, across the landscape of our radiotherapy service line, including in our surface guidance.
Daniel Bailey, PhD, DABR (03:11):
So, going way back to my initial experience with AlignRT, which was actually within the Northside Hospital Cancer Institute in Atlanta, Georgia. We had a relatively short break-in period where a lot of questions and a lot of issues were resolved, and it very quickly proved the efficiency and the accuracy of the system. Now, that doesn’t mean that everybody embraced it immediately with open arms. You’ve already heard today that it’s almost always the case, and it’s not just one discipline either, is it, that there is some subsample of persons within every clinic, when a new technology is implemented, including something like AlignRT, who just don’t believe, who just don’t trust, right? And a healthy amount of skepticism, I think, in this arena is actually good, because after all, if we do it too hastily, without enough evaluation, if we don’t have the kinds of steps in place that we will talk about through the course of this talk, you can really screw things up, right? You can actually do things that end up harming patients. So a certain amount of skepticism, I think, is healthy. That was certainly the case when we tried to implement surface guidance from the very beginning. But very shortly thereafter, we started an every-patient, every-fraction approach, especially starting out with our stereotactic treatments. Does the intended use case match what the data actually say? Because we have just outlived the point where we can make a plan for the use of SGRT without planning ahead of time to roll it out to everybody. If you aren’t doing that, and you’re still looking at it as if it’s just some kind of special-technique band-aid, then you have missed the point of what SGRT can do for your patients, okay? We need to make our rollout plan based on these data. Then we started doing our research on going tattooless and markerless, like you’ve already heard about a lot today, and transitioned our very first clinic, a single-machine clinic, to fully markerless treatments for all of their patients within that year. And everything just exploded from there.
Daniel Bailey, PhD, DABR (04:43):
So from there to where we are today, this is of course now drawing from experience across HCA as well, which is a much larger radiation oncology network. We’re talking about a system that currently has roughly, Kenny, you might have to keep me honest, I think we’ve got 35 AlignRT systems, if I remember correctly, with four or five more that are going to be implemented either this year or early next year. So we’re really massively expanding our SGRT presence across radiation oncology within the Sarah Cannon Cancer Network, using a systematic approach to purchasing and configuration, right? You’ve heard today that some departments, for example, wonder why they don’t have postural alignment while others do, or they’ve learned that they don’t have, I don’t know, a 3D photo, pick a feature, right? And that becomes a source of frustration from clinic to clinic, because people didn’t put the foresight in to begin with to say, this is what we need in a standardized, pre-configured way. It also causes massive problems when the clinical staff try to cover for each other, right? And sometimes you don’t even realize these disparities exist until the staff try to cover for each other, as we’ll talk about a little bit later in the training section. An adaptive approach to training and ongoing education, which we will talk a lot about this morning. Building policies and procedures, but not in an arbitrary way, building policies and procedures in a deliberate and careful and data-driven way. And then setting the expectation, as I think a couple of the capable speakers before me mentioned, that we are on the road toward making surface guidance our standard of care for every patient, every fraction, across our enterprise, as we believe it should be across radiation oncology in general.
Daniel Bailey, PhD, DABR (06:40):
So here, I’m going to ask a series of kind of leading questions throughout this talk, okay? Is your implementation data driven? I feel very passionately about this. Okay? I mean, several of these things you’re going to notice, like my blood pressure goes up and I seem excited up here because I really do feel very strongly about a lot of this.
Daniel Bailey, PhD, DABR (06:59):
But here’s a potential problem. Sometimes we implement a rollout plan without considering real data-driven reasons for why we’re doing the things that we’re doing. Okay? A good example of that is only implementing SGRT for special procedures like DIBH. I think this is especially important because, and I’m going to talk a little bit more about uniformity later, about making sure the entire clinical team is aware of the rollout plan and the expectations and the timelines and that sort of thing. This is a new technology, and I’m hypothetically saying we’re implementing it, right? That’s the scenario we’re talking about: you’re trying to implement this new technology. Our radiation oncology workflows are already complex. We agree, right? And that’s before you ever consider the many variables that enter the room when the patient enters the room; just the technology and the buttons you have to press and the workflows are very complex. So basically, in an implementation scenario, we’re going to a treatment team and saying, hey, we’ve just bought this brand new thing and it’s so shiny and it’s bright and it goes beep, right? And you’re going to use it. And what we fail to do sometimes is actually ever explain to anybody why the hell we’re doing this, right? And then we have segments of the clinical team that have never even been informed, you know, what’s the benefit, or what are we trying to get out of it, or whatever. And I’ve seen that over and over and over again, right?
Daniel Bailey, PhD, DABR (08:23):
So I just want to run through some data. I’ve mentioned several of these things in previous talks, so you can find this stuff in the literature, and you can find it on the SGRT Community website in past discussions, but just to fly through some of it, okay? In our own institutional study of about 15 pelvis patients and about 19 breast patients, we’re talking about somewhere around 650 fractions. What we did at the beginning was say, hey, let’s take these patients and reduce as many variables as we possibly can, like same treatment team and that sort of thing, and we’re going to do half with skin-marker setup and half with SGRT, and just see what the differences are when we’re using the surface guidance technology. And what we found with that cohort of patients is all the same stuff that you’ve already heard relayed this morning. We had drastically reduced times, especially for difficult setups. I remember one example given of 26 versus 53 minutes over the course of treatment, right? In especially extreme cases it can be a lot more than this, but on average we saw a 15 to 20% reduction in setup time. And not only that, but if you were to graph it, which we did, and we published it in 2018, so you can go read it, the confidence intervals get much smaller too, which means we’ve got much more consistent setup time, consistently shorter setup time. So those extreme cases are reduced too, right? Smaller shift magnitude on average, approximately half the average shift magnitude when setting up with surface guidance instead of just skin-marker triangulation, and a big reduction in repeat imaging. Now, I have to say one thing about this because I don’t say it enough, and it’s really important. Do we have any of our wonderful dosimetry colleagues in the audience today? I see one hand here, I see another over to my left. All right, fantastic that you’re here. We absolutely love the magical things that you do for us and for our patients. Tell me, if you had the knowledge going into your treatment that, due to the cumulative cone beam dose that these patients were going to receive, you’re actually probably delivering something like 30, 40, 50 centigray more, at least, than what your plan calls for, would that make a difference to the way that you push your OAR doses? Because what you may not realize, if you’re not very heavily involved in the treatment planning process, is that it is often the case that we are pushing the OARs, the organs at risk, right? I don’t even know that you save any syllables with that acronym, do you? Anyway, organs at risk: we are really pushing them, I mean, to the level of centigray, right, in our plans. And yet very seldom do we, and I think the UK is actually much better at this than we are, take into account our repeated imaging and how much dose it’s actually contributing to these patients, which is pretty much spread out uniformly in normal tissue, right? So the fact that we can reduce repeat imaging by 40% or more is actually a pretty huge deal dosimetrically, and it can really make a difference depending on what we’re treating.
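As a concrete illustration of the imaging-dose point above, here is a minimal back-of-envelope sketch. The fraction count, per-CBCT dose, and baseline repeat-imaging rate below are assumed, illustrative values, not figures from the talk or the published study; only the roughly 40% reduction in repeat imaging comes from the discussion above.

```python
# Back-of-envelope estimate of cumulative CBCT imaging dose over a course of
# treatment. All inputs are illustrative assumptions except the ~40% repeat-
# imaging reduction mentioned in the talk.
FRACTIONS = 28                # hypothetical fraction count for a pelvis course
DOSE_PER_CBCT_CGY = 1.5       # assumed ~1-3 cGy per kV CBCT, protocol dependent
BASELINE_REPEAT_RATE = 0.40   # assumed share of setups needing a repeat scan
SGRT_REPEAT_REDUCTION = 0.40  # ~40% fewer repeat images with SGRT (from the talk)

def cumulative_imaging_dose_cgy(fractions, repeat_rate):
    """One scheduled CBCT per fraction plus repeats, times the per-scan dose."""
    total_scans = fractions * (1 + repeat_rate)
    return total_scans * DOSE_PER_CBCT_CGY

baseline = cumulative_imaging_dose_cgy(FRACTIONS, BASELINE_REPEAT_RATE)
with_sgrt = cumulative_imaging_dose_cgy(
    FRACTIONS, BASELINE_REPEAT_RATE * (1 - SGRT_REPEAT_REDUCTION))
print(f"Baseline cumulative imaging dose: {baseline:.0f} cGy")
print(f"With SGRT (fewer repeat scans):   {with_sgrt:.0f} cGy")
print(f"Unplanned normal-tissue dose avoided: {baseline - with_sgrt:.0f} cGy")
```

With these assumed inputs the cumulative imaging dose lands in the tens of centigray, the same order of magnitude the speaker is pointing at; the point is that it is plan-relevant dose, not that these exact numbers apply to any particular protocol.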
Daniel Bailey, PhD, DABR (11:45):
Three examples. I don’t have enough time to go through these, but you can go through them when you get the handouts. Basically the same thing: all of this reduction in setup time, reduction in shift magnitudes, reduction in repeat imaging, more geometrically accurate, way faster. It’s just proven again and again. In other words, as Dumbledore would say, it is incontrovertible, right? I learned that word from Dumbledore. Did anybody else? Well, at least the movie version; I can’t remember if he said it in the books.
Daniel Bailey, PhD, DABR (12:11):
A little quote here: SGRT opens the possibility to reduce the number of CBCTs while maintaining sufficient setup accuracy. The advantage is a reduction of imaging dose and overall treatment time. And if I can give you one more, and again, you’ll have these in your handouts, okay, so don’t worry if you don’t get a picture: the quality of patient positioning before radiotherapy treatments has been optimized by using SGRT without additional imaging dose. SGRT clearly reduced inefficiencies in the patient positioning workflow. Your clinical team needs to know this before you actually implement the technology, right? Or at least while it’s being implemented. Invariably, you are going to experience difficulties and setbacks, and that means at some point everybody’s going to be scratching their head saying, why are we doing this? Because there was never a data-driven approach to explain to people why we’re doing it and how we’re doing it. You see? And that’s one of the ways that we sometimes fail in our implementation plan.
Daniel Bailey, PhD, DABR (13:06):
I just want to say a little bit about accuracy. We’ve been talking about efficiency up until this point, but I think we already know this. However, I am surprised when I hear that there are still clinics that potentially use SGRT for setup, but not for intra-fraction monitoring. Because the fact is, and if I can pull the conclusion forward from two or three slides away: right now, you’re mistreating patients. If you’re not using some form of surface guidance for intra-fraction monitoring, you are mistreating some percentage of your fractions for some patients. And we have tons of data to show that; it’s not something that can even be argued with. Okay? So, in our own study, and we’re limiting these data to stereotactic cases for the purposes of this slide, we found that seven to 10% of stereotactic cases had to be halted at some point during the course of treatment because of an SGRT-indicated shift that was beyond our tolerance.
Daniel Bailey, PhD, DABR (14:02):
And of that seven to 10% of SBRT patients, typically most of them, after re-scanning, had to be shifted in a clinically significant way. Now, Atrium has done a study and published it, thank goodness, because I probably owe them all bottles of bourbon; I have quoted this study so many times when I’ve given these talks. It has a long title, which I can’t even say in one breath, so I’ll skip it, but they basically found the same thing: 335 stereotactic fractions were treated, during which 34 fractions, so do a little math with me, that’s about 10%, everyone agree? Yes, across 25 different patients, exhibited patient movement beyond two millimeters, which was their institutional tolerance, as observed using SGRT. Of those 34 fractions, 74%, so roughly seven and a half percent of the whole, right, resulted in shifts of two millimeters or greater based on CBCT realignment. Back to our dosimetrists: if you knew every one of your patients might shift by two millimeters, would it make a difference to your plan? Of course it does, right? And so here again, we’ve got published data to share with the whole clinical staff. Now that we know it, we can’t unknow it, right? Now that you’ve seen it, you cannot unsee it. And we know that, like the eye in the sky, right? There’s an eye in the sky looking at you. Alan Parsons Project, anyone know that one? Yes. Yeah, it’s a great one, right? I’ll sing to you another day if you guys like that kind of thing. So now we know from the SGRT eye in the sky that we actually have patients moving that we never knew, based on CCTV, were moving before, which means there’s a subsample of patients to whom we are not delivering optimum dose, at least as calculated in our treatment plan.
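Just to make the arithmetic in that study description explicit, here is a tiny sketch using the numbers quoted above (335 fractions, 34 halted, 74% confirmed on re-imaging); the code structure is mine, the figures are as relayed in the talk.

```python
# Reproducing the percentages quoted from the published stereotactic study.
total_fractions = 335
halted_fractions = 34     # SGRT flagged motion beyond the 2 mm tolerance
confirmed_share = 0.74    # of those, confirmed >= 2 mm on CBCT realignment

halted_fraction_rate = halted_fractions / total_fractions        # ~10%
confirmed_rate_overall = halted_fraction_rate * confirmed_share  # ~7.5%

print(f"Fractions halted by SGRT:              {halted_fraction_rate:.1%}")
print(f"Fractions with confirmed >=2 mm shift: {confirmed_rate_overall:.1%}")
```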
Daniel Bailey, PhD, DABR (15:53):
Okay? I think that’s all I have to say about that. So just a few conclusions. Does your rollout plan match the data? Does the plan for education and training match the data for how new technologies like SGRT are rolled out clinically? So when you’re considering your education and training plan, are you building into that plan: yes, we have a plan for how we’re going to use this across the board for all fractions, or for whatever, and we’re going to take into account professional guidelines that have to do with education, for example hands-on training and those sorts of things, to make sure that everybody gets it? Do written policies and procedures focus on and completely address the most important areas of implementation and safety? I’m going to say a little bit more about policies and procedures in a moment. It is exhausting work, isn’t it, writing policies and procedures? If you like that sort of thing, you are a weirdo, and I need to know you, and you need to be my friend, because it is exhausting, frustrating work. And sometimes you have to hedge your bets here, right? Set up guardrails so that, in a data-driven way, you’re starting your policies and procedures with the stuff that is most important, the stuff that could go wrong, rather than just trying to eat the whole elephant in one sitting, right? Okay. That’s a terrible, terrible metaphor.
Daniel Bailey, PhD, DABR (17:43):
Is your, I love elephants, is your implementation... everyone’s thinking about elephants now too, isn’t that the weirdest thing? Right? Don’t think about elephants; that had nothing to do with anything, sorry. Is your implementation plan transparent, with clear expectations? Okay, this is, again, one that I’m quite passionate about.
Daniel Bailey, PhD, DABR (18:00):
Here’s the potential problem. Sometimes we implement a rollout plan without transparency to the entire treatment team, and I mean the entire clinical team here, right? Everybody who’s going to touch this technology. Without setting clear expectations and timelines. And one of the reasons that this is so important is that, as we’ve already said a couple of times, implementation of new technology can be dangerous. If everybody doesn’t know, with full transparency, what’s going on, how it’s going to be used, whether everyone’s been trained, and so on, we can actually take steps that make our patients less safe, instead of using this, you know, beautiful technology to make their treatments of higher quality.
Daniel Bailey, PhD, DABR (18:42):
Here’s a quotation from required reading, okay? If you have not read the RO-ILS (Radiation Oncology Incident Learning System) themed report on surface guidance, then you are to do so before the happy hour party tonight, okay? Or no alcohol for you. Surface imaging is a tool that has the potential to be value additive and act as a safety barrier. However, its value is dependent on the way in which it is implemented, and care must be taken with implementing any new technology to prevent the introduction of new error pathways. Okay?
Daniel Bailey, PhD, DABR (19:22):
So this is particularly important, we have found in our vast experience, during the training and documentation-of-competency process of implementation: is your training plan consistent, complete, and documented for the entire treatment team? Have you established written policies and protocols in advance of the actual clinical implementation of SGRT? In other words, have you done it ahead of time, or is your documentation and standardization an afterthought? Right? I’ll give you one anecdote. We had a situation with a fantastic team of therapists where, in all honesty, if your training and implementation plan is not standardized and clear and, you know, laser-focused on how the steps are going to go and so on, it’s easy just not to realize that you’ve left gaps, right? That you’ve maybe missed people, or someone was sick the day you talked about it, or whatever. And you don’t even realize that until a real-life scenario. So we had a situation where a lung patient, due to discomfort, which I think all of us can sympathize with, these patients have to hold weird positions for long times, right, and time on the treatment table is not anyone’s friend, which is one of the reasons that the efficiency benefits of SGRT are really so powerful, but due to discomfort this patient moved his elbow. And I don’t even know that anybody realized it; I think the eye in the sky of SGRT is the thing that said, no, this person’s not in their position anymore. And that’s a frustrating situation. You’re ready to treat, the patient’s set up, it’s a difficult patient, the setup has been difficult, everything looks fine. And then you look to your left, I don’t know where your AlignRT monitor is, but you look at it and you’re like, son of a... right? We can’t treat now, right? You can’t even beam on if you wanted to. And one of the therapists on the team suggested, I mean, it had been a really hard setup and it had been a significant amount of time, so I can sympathize, but that person suggested, let’s just capture a new reference, right? And to those of you who’ve used the system and know what that means, you’re basically just wiping out all of your data, right? And saying, fine, we’re just going to accept that they’re in the wrong position so that we can force the machine to treat. Okay? So setting clear expectations in a training plan so that everybody knows how things are expected to go when things go right, and how things are supposed to flow when things go wrong; that’s how important that level of transparency and standardization is.
Daniel Bailey, PhD, DABR (21:56):
So here we go. Highly scientific models of effective and ineffective training. I don’t mean to skip it, but I had another group discussion planned that I don’t think we really have time for; we’ve only got about 15 minutes left in the session. So what I might do is just let you think about this. If you have something you want to say or ask about it, we can boot it to the Q&A panel, or, you know, afterwards, of course, we can talk amongst ourselves. I’d love to hear your feedback.
Daniel Bailey, PhD, DABR (22:22):
But from your own experience starting up SGRT in your clinic, what training method or technique was most conducive to learning and retention? So, already in the examples that some of you have given today, and in the previous talks, I’ve heard about things like all that wonderful online training that you do beforehand, so that people have done simulations and things like that before the technology ever hits the clinic; things like that that make it most conducive; having an expert trainer there, available to you and kind of on call in case you need ongoing training. So we’ve already heard some of these. And what training method or technique really didn’t work well? I’m going to give you some examples of that. Okay?
Daniel Bailey, PhD, DABR (22:59):
So first of all, highly scientific methods of training. You could have your school or your vendor, or however this information is disseminated by experts, go out to the entire... not one person cracked a smile when I put this slide up here. What is wrong with you people? I had someone ask me if I was going to do Star Wars this time and I said, no, I’m going to shake it up and do Ninja Turtles. And there was great rejoicing. So you could train the entire treatment team, right? Equally, from the vendor, so that everybody essentially gets brought up to speed as an expert at the same time. We realize that that’s not always possible, right? I could give you scenarios, but you all know that it’s not always possible to get everybody in the same scenario at the same time getting the same training. But that is an effective method to get, you know, the expert training from your vendor, clin app specialist, whatever.
Daniel Bailey, PhD, DABR (23:57):
You could also take the super-user approach. I mean, I at least saw one person smile this time. Holy hell, people. You could take the super-user approach, right, and train someone who’s really good and very effective at communicating and so on, someone who takes in all that expert information and then, as time allows throughout the course of your clinical processes or whatever, disseminates that information to the rest of the treatment team. So once again, the end goal is the same: everyone’s brought up to basically the same standard of training, with clear timelines and expectations, but it may not have come directly from the clin app specialists. It might have come through a super user or a team of super users. Okay? Those are fantastic. They were my idea, so I used the Ninja Turtles to show you.
Daniel Bailey, PhD, DABR (24:42):
Here are some things that don’t work. You give the training to somebody who potentially is not a very good choice anyway, because of their communication style or their personality or whatever; maybe they’re just too busy, it’s not any inherent flaw of theirs, but this method doesn’t work because all the training goes to that person and they do, I always want to use adult language with you guys, we’re all adults here, and I always shy away from doing it, but they don’t do anything with it. And then your entire clinical team is left out in the dark, thinking, well, where’s my training? Like, I didn’t hear anything. That person got to go physically to training and they didn’t bring anything back. You’ve heard all of this before, right? Okay. So that is a highly ineffective method of training a team anything. You could also do this: you could pass it on to one person who may be fine, I don’t know, maybe that one person was great, but then they pass it on to person number two, and person number two passes it on to person number three, and person number three passes it on to person number four, and you all know where this is going, right? So I go to the TrueBeam one day and something’s not working, and the fantastic, amazing, phenomenal therapist team that I would never complain about is, you know, trying to do troubleshooting on their own. They tell me the steps they’ve gone through, and they’re like, we flipped the red switch. I’m like, what? What red switch? What are you talking about, flip the red switch? And in that particular vault, there’s a switch that has nothing to do with AlignRT, okay? I mean, for all I know it’s turning some poor soul’s office lights off in another building somewhere. You know, they’re just switching it back and forth like my kids, right? Surely by the 347th toggle it will work. Okay? And they’re flipping the switch, and I’m like, what are you talking about, the red switch? And they’re like, well, we were told by so-and-so that whenever it’s not working we should flip the red switch. And so several generations later, they’re still, as far as I know, flipping the red switch as a troubleshooting step with SGRT. I said, good lord, people, right? But that’s what happens when, unfortunately, things deviate, right? Things become variable as information is passed downward in the game of telephone. And this is not an effective way of training people.
Daniel Bailey, PhD, DABR (26:59):
It is of course acknowledged that the vendors generally have well-developed training programs by experienced instructors. We would agree with that. However, these are obviously geared to the use of specific equipment which the particular vendor supplies. Perhaps what is required to complement these events is more training in specifically safety-related topics: human factors, process flow, failure modes as they apply to particular processes in a particular clinic. A multidisciplinary approach to such training might mitigate some of the communication difficulties encountered in a busy clinic environment. That’s from a 2012 paper by the late, great Peter Dunscombe. I know some of you remember him and learned a lot from him. And isn’t that absolutely true, that we put the work in ahead of time so that we all generally get an understanding of each other’s roles? We work together as a multidisciplinary team to work out, here are the functions and this is the way that they’re supposed to flow. We try to, in a prospective, not entirely reactive way, think about things that might go wrong and how we’re going to mitigate those situations when they arise. And in a transparent way, we roll that out to everybody, right? With clear expectations and clear timelines. And if we don’t, I’m not saying that’s the only way it can be done, I’m just saying that from our experience, if it’s not done that way, you’re just leading yourself down the path of creating error pathways, okay? Error pathways that you never intended to create. Which is a shame, because the technology is supposed to make people safer.
Daniel Bailey, PhD, DABR (28:39):
So, are we making sure that everyone is trained thoroughly, including on things that might go wrong, in all aspects of their immediate responsibilities, and trained generally in departmental procedural workflows? So, like, a walk-in-my-shoes type thing: why is physics doing what they’re doing? What does dosimetry even do in this, right? Why are they here at this meeting? Ask them; they look like wonderful people. And is everyone educated, at least generally, in the departmental implementation of quality management for when things go wrong? I could spend literally the rest of the day talking about number three, which would just drive you all to boredom and tears and death, but it is so important, right? You need to have a quality management culture of safety in your clinic that learns from incident reporting and from near misses, that analyzes those things, and in which every member of the clinical team, from physician to dosimetrist to physicist to therapist, is empowered to know that they have the right, when something feels or seems wrong, to stop everything until the question is answered. I could go on for a year on this, right? But all of that plays into the implementation of your surface guidance program.
Daniel Bailey, PhD, DABR (29:46):
Documentation. I think for the sake of time I might skip some of this. We’ve already talked a lot about documentation in the previous talks, so I think I’ll leave these slides to your own self-study when you have time later.
Daniel Bailey, PhD, DABR (29:59):
Okay, I’ll just bring up this one quote, because I’ll bring it up again at the end. However, it is a common observation that even when adequate documentation does exist, it is not always followed. It is unlikely that failure to follow established procedures is for some malicious reason. I’m not sure about that; I’ve known some malicious people. Unlikely, but not unheard of. It is more likely to be due to the procedure either having been forgotten, or the significance of not following it not being fully appreciated. So that ties into what we’ve been saying about ongoing education, so that people don’t forget and don’t create these, you know, red-switch-that-does-nothing workarounds, not realizing what the proper procedures are. I’m going to come back to that quote in a moment.
Daniel Bailey, PhD, DABR (30:49):
Is your implementation process standardized and formalized, even though we acknowledge that by definition it has to evolve? You are going to learn more. You’re going to learn from mistakes, from near misses, and so on. You’re going to learn just from good, solid experience. And as that happens, as a multidisciplinary team, your training, your policies, your standardization of these techniques is going to evolve. That’s just sanity. That’s just good quality. But is that process standardized and formalized?
Daniel Bailey, PhD, DABR (31:22):
Here’s a potential problem. Sometimes we expand to multiple machines and/or centers without ever standardizing the implementation plan, so inadvertently we create pathways for error. Here’s an example: different treatment teams develop their own unique workflows and you don’t even know it. Take physics, for example, which often has a very, you know, heavy role in the rollout of new technology; we’re not standing there at the treatment machine all the time, right? We trust the therapists to do that. And sometimes we don’t even learn that these aberrant methods have crept in from team to team until the teams actually cross-pollinate and someone has to cover a machine or cover a clinic, and it’s like, what’s going on? We never do that at our clinic, right? Everyone’s heard of or experienced that. Standardization: here’s another topic I really cannot say enough about, but I’m not going to read these quotes for you. Suffice to say that every significant professional guideline or publication having to do with making radiation oncology safer emphasizes the need for standardization and scalability. And one very good example of that is TG-100. I don’t know if you’ve ever read TG-100 before; it’s perfect if you’re experiencing bouts of insomnia, but it’s absolutely full of vital information, okay? AAPM Task Group 100, Google it, you can pull it up. It all starts with defining, as a multidisciplinary team, your process flow. Well, if you’re going to define a process flow, like you’re going to write it out as a flowchart or whatever on a piece of paper, then by default that means you actually know what the process flow is, that it’s actually standardized to the point that you can map it out. And actually, very often, if you will do this, maybe some of you have done this, or if you’ll take this back as a consideration for a multidisciplinary team in your clinic, just say, one day, let’s take an hour and map out the way we do our SGRT deep inspiration breath hold. You are immediately, as a group, going to realize that you all have very different ideas. That’s what immediately happens. And you will also immediately realize that there are different levels of understanding of why we do things, what we do, you know, what we do when things don’t go right, and that from machine to machine or clinic to clinic, things are done differently. That’s where the quality management, you know, basically the programmatic suggestions of Task Group 100, start.
Daniel Bailey, PhD, DABR (33:52):
Here’s a quote: one of the general results of FMEA, that’s failure modes and effects analysis, and the associated FTA, fault tree analysis, is the clear need to define site-specific treatment planning and delivery protocols that serve as the basis for simulation, planning, and treatment delivery expectations, methods, and quality management procedures. This general standardization and documentation of the methods to be used addresses many of the most common failure modes for many of the most critical steps in the planning and delivery process, and is a crucial way to avoid training and procedural errors. That’s a general statement about standardization across radiation oncology. By golly, it absolutely applies to the implementation of SGRT.
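Since TG-100’s FMEA comes up here, a minimal sketch of how its risk ranking works may help: each failure mode is scored for occurrence (O), severity (S), and lack of detectability (D), each on a 1 to 10 scale, and ranked by the risk priority number RPN = O x S x D. The specific SGRT failure modes and scores below are hypothetical placeholders, not values from TG-100 or from any particular program.

```python
# TG-100-style FMEA ranking sketch for hypothetical SGRT failure modes.
# O = occurrence, S = severity, D = lack of detectability (each 1-10).
# RPN = O * S * D; higher RPN means address it first. All entries are illustrative.
failure_modes = [
    ("Reference surface captured with patient in the wrong position", 3, 8, 6),
    ("Wrong reference/isocenter exported from planning to the SGRT system", 2, 9, 5),
    ("Intra-fraction monitoring disabled or tolerances set too loose", 4, 7, 7),
    ("Camera calibration drift not caught by daily QA", 2, 6, 4),
]

ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)
for name, o, s, d in ranked:
    print(f"RPN {o * s * d:>3}  {name}")
```

The point of the exercise is not the exact scores; it is that a multidisciplinary team scoring its own mapped process will quickly see which SGRT steps deserve the first policies, checks, and training emphasis.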
Daniel Bailey, PhD, DABR (34:40):
Standardization across multiple machines or clinics: what does that look like? As we’ve mentioned about a hundred times to this point, centralized and ongoing education, meaning we have dedicated ongoing training and education, preferably a team. One of the speakers who preceded me said they had, I think a couple of you actually said that you’ve got, some kind of super-user team or implementation team. You’ve got your own Ninja Turtles, right? They are responsible for making sure that education is given out, and that periodically you check back, because what do we have? Turnover, right? We’ve got new people coming in as we grow. It’s not something that you can just do at the beginning and forget about; it’s something that has to be ongoing. So, having a dedicated, ongoing training and education team that is also involved in the policies and procedures process. That was a lot of alliteration.
Daniel Bailey, PhD, DABR (35:33):
However, it is a common observation that even when adequate documentation does exist, it is not always followed. It is unlikely that failure to follow established procedures is for some malicious reason. It is more likely to be due to the procedure either having been forgotten, or the significance of not following it not being fully appreciated. And this is exactly what we’ve experienced in our clinics, right? I mean, all of us together, collectively: as new people come in, as time goes on, and it doesn’t even have to be that dramatic. How many of you have accidentally forgotten to do a step on any one particular day, just because there’s a ton of things going on, or maybe you’ve got a lot on your mind that day? We’re humans, right? We have real lives. Well, I’m a physicist, I don’t know how many of us have real lives, but most of you, you’ve got real lives, you know, and you carry all of these things around with you while you work. And it can be those things too that get in the way, which is one of the reasons that checklists and questionnaires are so absolutely vital in our process flows. But as new people come in, as there’s turnover, as we’re dealing with difficult situations, as stress levels increase, whatever, that’s exactly why we have to have this ongoing, you know, education and competency that’s built on good policies and procedures, that’s standardized and rolled out to the whole clinical team.
Daniel Bailey, PhD, DABR (36:50):
Standardized physics, dosimetry, and therapist workflows: avoiding knowledge gaps and workarounds. I only want to say one thing here. It is often the case, because we experience time pressure, we’ve got patients who are on the table, we need to get through our schedules, get them treated, and get them out, that, whether you realize it or not, treatment teams come up with what they call workarounds. Yes, yes. Okay? Workarounds are the normalization of deviant behavior, okay? I’m a big fan of deviant behavior, just not in the context of radiation oncology, okay? The standardization of things that were never meant to be done that way to begin with. And one of the problems with workarounds is that you cannot always by yourself, sometimes not even as a team, anticipate all the things that can go wrong from the timeline changing, as Dr. Strange showed us in Infinity War, right? Okay. So understanding these processes, whether it’s the way that dosimetry handles their export process to the AlignRT system, or whether it’s physics in their QA understanding the process flow and the steps that they have to take; you all understand, of course, that there are calibration steps that, if you do them out of order, you’ve screwed up your whole calibration, right?
Daniel Bailey, PhD, DABR (38:09):
Things like that have to be understood by the whole treatment team. And of course, this requires documented competency for everyone before they ever touch a patient, so that we don’t throw people who really don’t know what they’re doing into clinical treatment and then ask them to sink or swim. Your patients are too important to approach new technology that way; too important to approach SGRT that way. Having standardized action plans for when things do not go as everyone expected them to go: one of the previous speakers was talking about something that went wrong, I can’t remember which one of you it was, the talks were all so good. But, you know, again, that’s the most frustrating situation, when you’re trying to troubleshoot on the fly, right? And the therapists look at me and say, Dan, you’ve told us to restart the computer three times now; I’m not sure that restarting the computer is our first line of defense, right? So working together with your clin app specialists, with people who are already experienced with the technology, working together as a multidisciplinary team, write out those processes, right? The order in which we do those troubleshooting steps should theoretically be virtually the same every time there’s an issue, and yet we constantly fail to ever even write it down. I mean, have a, no, not a sticky note, I totally agree, you take your sticky notes and burn them outside, right? But, you know, have some kind of quick reference, maybe just pasted on the cabinets above the machine, that shows, here are the three steps that inevitably end up with “call physics,” right? We know that’s at the bottom. But have your steps written out so that everybody’s doing it systematically the same way, right? For when things go wrong. So that those levels of frustration don’t build up, so that those levels of frustration don’t become impediments to actually using the technology the way it’s supposed to be used, and using it in a quality way.
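As a sketch of the kind of posted quick reference described above, here is one hedged example; the specific steps are hypothetical placeholders, not a vendor-prescribed procedure, and would need to be written by your own multidisciplinary team with your clin app specialist.

```python
# Hypothetical SGRT troubleshooting quick-reference card; steps are placeholders.
TROUBLESHOOTING_STEPS = [
    "1. Confirm the correct patient, treatment site, and reference surface are loaded.",
    "2. Check the ROI and camera views for obstructions (gown, blanket, arms).",
    "3. If deltas still disagree with imaging, STOP. Do not capture a new reference to force agreement.",
    "4. Call physics.",
]

def print_card():
    """Print the escalation steps in the order they should be tried."""
    print("SGRT troubleshooting quick reference:")
    for step in TROUBLESHOOTING_STEPS:
        print("  " + step)

if __name__ == "__main__":
    print_card()
```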
Daniel Bailey, PhD, DABR (40:02):
I don’t have any time for this, unfortunately. I think we’re out of time. But these are five proposed implementation strategies that are directly stolen from the RO-ILS themed report. So again, remember I said this is required reading, right? If you have not read the RO-ILS themed report and you are practicing SGRT, then doom on you. Alright? You need to read the report. It’s very, very accessible, and it’s pretty short too, unlike the task groups that, like I said, tend to make you so tired that you can’t ever get through them. But, you know, look at those steps, reinforce them in your clinic, take them back from this meeting, take that knowledge back to make your processes better and safer. Alright? I think that’s it. So from my distinguished experience, there are my recommendations for standardization and implementation of SGRT across clinics and machines. Thank you.