Originally posted on April 14, 2017, to AuntMinnie.com by Eric Barnes, AuntMinnie.com staff writer
Substantial reductions in CT radiation dose can be achieved by monitoring and comparing dose across institutions — even in a large academic health network such as the University of California (UC). Getting buy-in from all the players involved is key, according to a study published online April 10 in JAMA Internal Medicine.
The authors described how several UC hospitals introduced a feedback system for radiologists to share dose information and best practices from their own institutions, and then held a series of meetings to set common standards across the network. The project led to dose reductions of 19% to 25% across UC institutions (JAMA Intern Med, April 10, 2017).
“The main lesson is that when you get to share your experience across institutions, there is really a lot to be learned,” corresponding author Dr. Rebecca Smith-Bindman told AuntMinnie.com. “We all want to use the lowest dose that’s reasonably achievable (ALARA) … but if you’re not comparing it with someone else’s, how do you know what’s reasonable?”
Smith-Bindman is a professor of radiology, epidemiology and biostatistics, obstetrics, gynecology, and reproductive medicine at the University of California, San Francisco (UCSF). The study team included principal investigator Joshua Demb, Philip Chu, Thomas Nelson, PhD, and David Hall, PhD.
Individual decisions, wide variability
CT use has grown steadily over the past 20 years, but there are few concrete standards for how much radiation dose patients should receive, the authors wrote. There is substantial available information from the American College of Radiology (ACR) and other organizations that promote dose optimization, but there is still wide variability in doses, both within and across institutions.
Inevitably, dose optimization involves trade-offs between keeping the radiation dose as low as possible and maintaining diagnostic accuracy — decisions that tend to be made independently by individual physicians and institutions, which contributes to variability in dose. Collaboration may be key to achieving more consistent dose profiles across and within institutions.
Thus, the three-year observational study aimed to standardize and optimize CT radiation dose across the five University of California medical centers in Davis, Irvine, Los Angeles, San Diego, and San Francisco by collectively defining metrics, assessing radiation doses, and moving toward dose standardization.
First, the researchers prospectively collected dose metrics on diagnostic CT exams performed in 2013 and 2014 at 12 imaging facilities associated with the five medical centers. CT images were uploaded to a single server at UCSF using eXposure (Radimetrics, Bayer HealthCare), a software tool for tracking patient radiation exposure; dose levels were recorded for CT scans of the chest, abdomen (with and without pelvis), and head, which together account for more than 80% of CT scans performed at the UC centers, the study team wrote.
The dose audit used data from each institution over a 12-week period and served as the baseline for audit reports detailing the distribution of radiation dose metrics for chest, abdomen, and head CT scans. Doses were compared across the medical centers using the National Quality Forum (NQF) frameworks for reporting dose. The audits summarized several common NQF-suggested dose metrics, including effective dose, dose-length product, CT dose index volume, and size-specific dose estimates.
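As a rough illustration of how two of these metrics relate, effective dose is often approximated from the dose-length product (DLP) by multiplying by a body-region conversion factor. The sketch below is a minimal, hypothetical example using commonly cited adult k-factors; the study itself relied on vendor dose-tracking software, not this calculation.

```python
# Hypothetical sketch: approximating effective dose from dose-length product (DLP).
# k-factors (mSv per mGy*cm) are illustrative adult values commonly cited in the
# literature; they are assumptions for this example, not figures from the study.
K_FACTORS = {
    "head": 0.0021,
    "chest": 0.014,
    "abdomen_pelvis": 0.015,
}

def effective_dose_msv(dlp_mgy_cm: float, region: str) -> float:
    """Approximate effective dose (mSv) as k * DLP for a given body region."""
    return K_FACTORS[region] * dlp_mgy_cm

# Example: a chest CT with a DLP of 400 mGy*cm
print(round(effective_dose_msv(400, "chest"), 1))  # 5.6 mSv
```

Because the k-factor varies by body region, the same DLP corresponds to very different effective doses for head versus chest or abdominal scans, which is one reason audits report region-specific benchmarks.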
Once baseline data were acquired, the study authors convened a 1.5-day in-person meeting of UC staff to discuss how to create and interpret dose audit reports. The meetings included medical physicists, radiologists, and technologists, as well as section heads from the five centers.
The groups focused on specific indications for imaging and the protocols used for those indications, and discussed ideas for dose reduction. They were tasked with creating a list of specific protocol changes they would make after the meeting and detailing the logistics of sharing the data with staff.
Best practices discussed during the meeting included the following:
- Streamlining and controlling the process for creating and updating CT protocols
- Using helical and wide-aperture scanning, and iterative reconstruction whenever possible
- Optimizing kV selection
- Designating a single person (such as a lead technologist) to oversee assignment of patients to intuitively named protocols
- Reducing multiphase imaging when possible; for example, skip the noncontrast phase for suspected pulmonary embolism, and use only a portal-venous phase for abdominal pain indications
- Reducing dose for noncontrast and excretory-phase acquisitions for triple-phase urograms for hematuria; reducing dose during the noncontrast portion of four-phase liver scans; and excluding the pelvis when assessing a renal mass or performing triple-phase urograms
- Noting that renal stone studies require very low doses, as do lung cancer screening and hydrocephalus and pectus excavatum studies
To assess the effectiveness of the audit and the in-person meeting, the researchers then collected dose data at the participating institutions for 12-week periods after the meeting and compared the results with the baseline periods.
Reductions for most scan types
In all, 29,594 CT scans were acquired in the three months before the meeting and 32,839 scans were performed 12 to 24 weeks after. Mean effective doses for chest CT fell by 18.9% during the study, while abdominal CT scan doses fell by a mean 25%. The number of scans with dose measurements exceeding benchmarks fell by 48% for chest CT and by 54% for abdominal scans. Head scan doses did not fall, but variability was reduced.
“We found a considerable decrease in average radiation doses and the proportion of chest and abdomen CT examinations that exceeded benchmarks after providing radiologists and medical physicists with quantitative institutional feedback summarizing CT radiation doses and an in-person meeting to share best practices,” the authors wrote.
The intervention did not reduce head doses; one possible explanation is that head CT doses were already optimized, leaving less opportunity for reduction.
It was the in-person meeting that made the difference in producing substantial overall dose reductions at the end of the study, replacing individual ideas and decision-making with a shared format that permitted the adoption of best practices.
“I think physicians in general really care about this topic, but it’s not so easy for them to know which way to go,” Smith-Bindman said. “When we think about dose reduction, there are so many protocols, so many indications, so many machines that it’s really overwhelming, so it’s important to take a step back and look at studies with high doses, and optimize those studies with extremely concrete feedback.”
Strengths of the study included its large size and broad inclusion of most CT scan types. Also, the use of a single software tool was helpful for providing a common platform for dose calculation and comparison, the group wrote.
As for limitations, the group didn’t review diagnostic accuracy or conduct a protocol-by-protocol analysis due to the time and expense involved. Also, because the design was observational, dose changes over time could have occurred even without the intervention.
Will it work everywhere?
An accompanying commentary by Dr. Ralph Gonzales and colleagues from UCSF lauded the results but cautioned that the approach might not work in all institutions.
“Well-designed consensus-building meetings with compelling action plans can still be derailed if the institution’s organizational readiness for change is not aligned with the intervention program’s implementation plan,” they wrote. “Factors in the outer environment of an institution include prevailing policies, incentives, and institutional peer pressure, which can serve to enhance or diminish improvement efforts.”