MedPhys 3.0

Making Quality Control in Radiography Comprehensive

Clinical medical physicists are responsible for determining whether imaging systems are operating properly, and the method they use to do this is transitioning from Medical Physics 1.0, which provides “siloed” glimpses of system performance, to a more comprehensive approach known as Medical Physics 3.0.

Medical Physics 3.0 relies on modern informatics resources to evaluate results and discover trends on an ongoing basis—to detect and fix system performance problems before clinical operations are affected.

If something goes awry with an imaging system, Medical Physics 1.0 methods can completely fail to detect anomalous system performance.

[Figure: Silo] Med Phys 1.0 tests are isolated glimpses of system performance. There is no mandate to assess longitudinal performance or to compare results from one system to another.

A recent event at MD Anderson Cancer Center illustrates the importance of the Medical Physics 3.0 approach. “A radiologist reported prominent grid lines on a chest radiograph and, when responding to the complaint, we noticed conspicuous rebound artifacts that suggested a detector recalibration was necessary,” says Dr. Diana E. Carver, Imaging Physics resident.

The calibration was completed, and the artifact disappeared. The corrective action was a success, but why weren’t they alerted to the problem sooner? How long had mediocre images been generated?

To find out, they examined weekly quality control measurements for clues. “On a weekly basis, we perform the GE Quality Assurance Procedures (QAP) test on our GE digital radiography (DR) units,” explains Dr. Charles E. Willis, associate professor. “We found that the QAP test on the DR unit had passed immediately before the radiologist’s complaint. So why did the problem go undetected?”

Digging deeper, they discovered that contrast-to-noise ratio (CNR) values had dropped sharply yet still passed the default limits. “Naturally, we wanted to know why our quality control approach failed,” Willis says. “Not only had the CNR dropped drastically, but it appeared to be related to a detector replacement. To make matters even more curious, the recalibration didn’t completely restore CNR to previous levels.”
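
In hindsight, the failure mode is simple: the weekly CNR stayed inside a fixed pass/fail band even though it had fallen far below its own history. Below is a minimal sketch of a baseline-relative check that would flag such a drop, assuming illustrative limits, window size, and tolerance; none of the values or names come from GE’s QAP.

```python
# Hypothetical sketch: flag a sharp CNR drop even when the value stays
# inside a fixed pass/fail band, by comparing each new measurement to a
# rolling baseline of recent results. All thresholds are illustrative.
from statistics import median

FIXED_LIMITS = (1.0, 5.0)   # assumed vendor default pass/fail band
BASELINE_WINDOW = 8         # number of prior weekly results to pool
MAX_RELATIVE_DROP = 0.15    # flag a >15% drop from baseline (assumed)

def check_cnr(history, new_cnr):
    """Return warnings for the latest weekly CNR measurement."""
    warnings = []
    lo, hi = FIXED_LIMITS
    if not lo <= new_cnr <= hi:
        warnings.append("CNR outside fixed limits")
    baseline = median(history[-BASELINE_WINDOW:]) if history else new_cnr
    if baseline and (baseline - new_cnr) / baseline > MAX_RELATIVE_DROP:
        drop = 100 * (baseline - new_cnr) / baseline
        warnings.append(f"CNR dropped {drop:.0f}% from baseline")
    return warnings

# Example: 2.3 passes the fixed limits, but the relative drop is flagged.
weekly = [3.1, 3.0, 3.2, 3.1, 3.0, 3.1, 3.2, 3.1]
print(check_cnr(weekly, 2.3))  # ['CNR dropped 26% from baseline']
```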

To investigate, they collected approximately 100 chest radiographs from the same unit. These images were grouped by service events that corresponded to different periods of CNR change: before the detector replacement, after the detector replacement, after recalibration, and during the transition period when the CNR began to drop. The Duke University Clinical Imaging Physics Group analyzed the images using software that measures ten different metrics of quality for chest radiographs. Results revealed that the lung noise metric was the strongest indicator of changes in the detector’s performance. “If we’d monitored lung noise on every image, we would have been warned far in advance of the image quality complaint,” notes Carver.
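
A per-image check of this kind is straightforward to prototype. The sketch below applies Shewhart-style control limits (mean plus or minus three standard deviations of a baseline period) to a stream of lung-noise values; the numbers and function names are illustrative assumptions, not the Duke software’s actual metrics or API.

```python
# Minimal sketch of per-image monitoring with a Shewhart-style control
# chart: derive limits from a baseline set of lung-noise values, then
# flag any image whose value falls outside mean +/- 3 sigma.
from statistics import mean, stdev

def control_limits(baseline, k=3.0):
    """Return (lower, upper) control limits from baseline values."""
    mu, sigma = mean(baseline), stdev(baseline)
    return mu - k * sigma, mu + k * sigma

baseline_lung_noise = [4.8, 5.1, 4.9, 5.0, 5.2, 4.7, 5.0, 4.9]
lo, hi = control_limits(baseline_lung_noise)

# The last two values simulate the kind of drift seen before the complaint.
for i, value in enumerate([5.0, 5.1, 6.9, 7.2]):
    if not lo <= value <= hi:
        print(f"image {i}: lung noise {value} outside [{lo:.2f}, {hi:.2f}]")
```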

They still, however, hadn’t identified the root cause of the problem. Why hadn’t annual testing detected it? They looked at another indicator of detector performance—exposure-dependent signal-to-noise ratio (SNR). “In retrospect, the data from the most recent annual test was completely outside of limits, and all of the data from every annual test dating back to 2009 was as well,” Willis says. “We knew that a proper calibration was performed in 2006, so what happened after that?”
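
Exposure-dependent SNR is a useful indicator because, for a detector limited by quantum noise, SNR is expected to grow roughly as the square root of exposure, so a log-log fit of SNR against exposure should have a slope near 0.5. The sketch below checks that slope against an assumed tolerance; the data points and threshold are illustrative, not values from the annual tests described here.

```python
# Hedged sketch: test whether measured SNR scales as sqrt(exposure) by
# fitting a least-squares slope in log-log space and comparing it to the
# quantum-limited expectation of 0.5. The tolerance is an assumption.
import math

def loglog_slope(exposures, snrs):
    """Least-squares slope of log(SNR) versus log(exposure)."""
    xs = [math.log(e) for e in exposures]
    ys = [math.log(s) for s in snrs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

exposures = [1.0, 2.0, 4.0, 8.0]      # relative detector exposure
snrs = [10.0, 14.1, 20.0, 28.3]       # roughly 10 * sqrt(exposure)

slope = loglog_slope(exposures, snrs)
if abs(slope - 0.5) > 0.1:            # assumed tolerance
    print(f"slope {slope:.2f} deviates from the expected 0.5")
else:
    print(f"slope {slope:.2f} is consistent with quantum-limited behavior")
```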

The root cause was finally discovered through their institution’s equipment service records. “The service history revealed a catastrophic failure in 2007 of the image detection controller (IDC) that stores the calibration files,” explains Carver. “The computer was replaced, but an old backup file from before 2006 was loaded. All future efforts to calibrate the detector also failed to save the new files. Ultimately, the detector was replaced and the backup of the incorrect calibration was loaded for the new detector.”

Medical Physics 3.0 methods “can reveal performance issues such as the one we encountered,” she adds. “In this case, we would have been aware of the problem much sooner if we’d been using them from the beginning.”

[Figure] Med Phys 3.0 exploits a mosaic of multiple disparate information sources.