FINE PRINT: By no means is this summary intended as a substitute for the actual report.
The focus of the task group was to improve physics plan and chart review as a means to increase patient safety. To this end, the task group members used a risk-informed approach to assess potential error modes in the radiation oncology workflow using failure mode and effects analysis (FMEA) as described by TG-100.
A survey of active clinical physicists was conducted as part of the task group efforts to assess current physics check elements and to gather demographic information on respondents and their departments. Over 1,500 AAPM members contributed to the survey, which provided a diverse and representative sample to inform the task group’s recommendations. As a bonus, initial survey takers can make use of the final survey results as part of a self-directed educational project to satisfy ABR requirements for maintenance of certification.
To be broadly applicable, the risk analysis performed for TG-275 was based on a collection of failure modes generated by task group members, their professional colleagues, and close review of incident learning systems such as RO-ILS. The failure modes were separated into groups of initial, weekly, and end-of-treatment checks, depending on which check would be ideal for identifying and catching the errors.
For each failure mode identified, a Risk Priority Number (RPN) was assigned as the product of three scores: the likelihood of occurrence (O), the potential severity of the failure mode (S), and the detectability (D), where a higher detectability score means the failure is harder to catch. A high RPN indicates a high-risk failure mode, and vice versa. This ranking allowed the task group to more easily recommend the physics check elements that are most effective.
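The RPN ranking described above can be sketched in a few lines of code. The failure modes and O/S/D scores below are illustrative placeholders, not values from the TG-275 report:

```python
# Minimal sketch of FMEA-style risk prioritization (RPN = O x S x D).
# Failure modes and scores are hypothetical, for illustration only.

def rpn(occurrence: int, severity: int, detectability: int) -> int:
    """Risk Priority Number: product of the three 1-10 FMEA scores.
    A higher detectability score means the failure is HARDER to detect."""
    return occurrence * severity * detectability

# Hypothetical failure modes with (O, S, D) scores
failure_modes = [
    ("Wrong isocenter shift transcribed", 4, 8, 6),
    ("Missed weekly dose-tracking update", 5, 3, 4),
    ("Incorrect prescription entered in OIS", 2, 9, 7),
]

# Rank from highest to lowest risk so checks target the top of the list
ranked = sorted(failure_modes, key=lambda fm: rpn(*fm[1:]), reverse=True)

for name, o, s, d in ranked:
    print(f"RPN {rpn(o, s, d):3d}  {name}")
```

Sorting by RPN makes the prioritization explicit: check elements are chosen to catch the failure modes at the top of the ranked list first.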
In addition, the task group compiled survey responses on how frequently individual elements of the initial, weekly, and end-of-treatment checks are used. The report also indicates which check items can be automated, either partially or in full.
Ultimately, the task group recommends that each clinic develop chart checks based on a risk analysis of its particular clinical needs, and that the review process be standardized and focused on the failure modes with the highest risk and severity. To help clinics develop chart check procedures, example checklists have been provided:
Example of weekly chart review checklist (abbreviated)
Other salient recommendations include employing automation in the clinic to assist in chart review and involving physics input earlier in the patient care workflow. The report also calls on vendors to develop automation and workflow tools that assist with these checks and present information in a user-friendly manner.
Q&A with Task Group Chair Eric Ford
Tyler: The report suggests that medical physicists might consider playing a greater role before treatment planning in order to mitigate those errors. What does that mean for physicists?
Eric: I think being more involved in simulation is probably a pretty good place to focus. Maybe being assigned to a case from start to finish might be helpful, so that you’re more involved with the whole process around that case. You can communicate better from the beginning. Some of this stuff happens naturally at smaller centers, but it is a little harder to pull off once you get more than two physicists because, I think, it doesn’t scale very well. I think being more involved in dosimetry and the planning process is helpful. You could look at the counter-example, which would be a physicist who really doesn’t see anything about the patient or the chart until the plan comes to them to check. So I think just kind of anything we can do to get away from that sort of a situation is better.
Tyler: Is there anything from the report that surprised you?
Eric: Going into this, one of the things I thought, probably naively, is that environments with different equipment, in particular the different oncology information systems (OIS) or treatment planning systems (TPS), would have different sorts of error patterns and different things, and it would just look very different. It turns out they really don’t. With access to the RO-ILS database, we sifted through a very large number of reports to find error patterns and they were pretty similar. We looked at how long it takes people to do chart checks in a Mosaiq environment versus an Aria environment and the times are about the same, error pathways are about the same, so that was a bit of a surprise.
Tyler: Is there something in particular that you really want to stress about the report, some detail that you think needs highlighting?
Eric: Well I think it’s really important for people to understand that this report was based on a very thorough risk analysis, and you will see that if you read the report, even just for everybody to know. This is one of the first such AAPM reports where there’s a thorough risk analysis behind it, so this is very different than getting a relatively small number of “experts” in a room and asking them their opinions and then writing it down and then taking that as the recommendation, which is the typical pathway for AAPM reports. For all the ink that’s been spilled about TG-100, we’ve not seen it be translated to an AAPM report yet and this is the first such thing that I’m aware of. The error pathways we found are probably representative of most clinics in North America.
Tyler: What other work surrounding safety and quality improvement needs to be done? What’s left to do besides continuing what we’ve already started?
Eric: The realm of automation will be important going forward. Anything that gets us away from a reliance on manual inspection is going to be positive. Incident learning is one tool that is great for promoting the culture of safety, but it has to be done with some thought behind it in order to be meaningful. Peer review could benefit from more thought and research; that includes chart rounds but also things like physics audits and colleague peer review. IROC phantoms are great for this, but they fail far too often and in ways that we don’t completely understand, though we are getting some indication now from the group as to why that might be. We spend all this time chipping away at getting TG-51 within 1%, yet 15% of our phantoms are failing an audit test. I don’t know why we’re not rioting about this.
Tyler: I found from personal experience that shorter checklists tend to be more effective. Can you comment on the example checklists offered in TG-275?
Eric: We were sensitive to the fact that longer checklists can sometimes be counterproductive. I think so many things can be automated now, so a lot of the recommended items can be run through a script to check whether it’s done or not. I think another way to do it is to chunk it up, like we talked about in the report. Also, you don’t need to have one person responsible for everything on that checklist. It doesn’t even have to be a physicist, as long as the right processes are in place. We decided to include everything that we thought should be there, but sort of left it up to whoever is using it how they are going to accomplish it.
At this point, we’re all aware of safety culture and how it plays a major role in our departments. We know that it’s an iterative, ongoing process. The new task group report and complementary insights from the task group chair provide us an opportunity to take a closer look at the processes and checks we have in place for physics plan review so we can make our efforts more effective. And more effective physics checks mean safer departments.
The task group aims to improve physics plan and chart review as a means to increase patient safety, with recommendations to employ automation in the clinic. ClearCheck, a plan evaluation tool, does that and more by prioritizing patient safety and plan accuracy while automating the evaluation process to increase efficiency.
See how Hackensack Meridian Health is using ClearCheck to automate their workflow and improve the quality and consistency of their plans.
written by Tyler Blackwell