SPECT software comparison uncovers inconsistencies

May 1, 2008

Most interpreters of cardiac SPECT use any of three software packages provided by vendors of gamma cameras and PACS: the Quantitative Gated SPECT algorithm from the Cedars-Sinai Medical Center, Emory Cardiac Tool Box from Emory University Hospital, or 4D-MSPECT from the University of Michigan Medical Center.

The packages are all licensed from academic institutions. Each does an excellent job. But they don't all work the same way, and they don't deliver the same quantitative results, which can cause problems if a patient is evaluated at different times with different packages.

A collaborative study among the University of Oregon, Sacred Heart Medical Center in Eugene, and Cedars-Sinai Medical Center in Los Angeles documented a lack of consistency among these three packages, leading investigators to suggest that the same software be used to compare and analyze SPECT data acquired on individual patients at different times and in different laboratories.

There should be little trouble in doing so, according to Dr. Mathews Fish, medical director of nuclear cardiology at the Oregon Heart and Vascular Institute at Sacred Heart and a lead investigator in the research.

All three types of software are widely available, Fish said. Vendors often have multiple licensing agreements for the different software packages, allowing an institution to choose whichever one it wants. Sometimes, several are made available to a customer, ensuring that a patient's images analyzed with one package can be analyzed later with the same software. Also, certain software packages tend to dominate in different regions of the country. Providers in the western U.S., for example, gravitate toward the algorithm developed by Cedars-Sinai.

Although the research documented inconsistencies among these major software packages, there is relatively little danger of misdiagnosis. Several sources of data typically factor into a diagnosis, including physical assessment of the patient during stress testing, such as the appearance of pain or blood pressure abnormalities; ECG recordings; and, in the case of SPECT, a visual analysis of the images.

"Inconsistencies due to the software could lead to a misdiagnosis, but that would occur probably in a limited fraction of patients," Fish said.

Fish applauds the presentation of images by these packages, noting that at least 75% of the decision making based on SPECT is related to visual impression. Important management and treatment decisions, however, are based on quantitative analyses provided by the software, particularly when characterizing the degree of abnormality, he said. "One program might indicate a mild abnormality, and another might say it is more than that," he said.

Concerns about possible misdiagnosis are mitigated by consistently using a single software package. Familiarity with its quantitation in the context of other diagnostic data, along with use of the same software on a patient from one scan to the next, makes assessments easier and more accurate.

Future developments may also improve the underlying quantitation. As part of research sponsored by the National Institutes of Health, Fish and colleagues are developing an automated, high-performance system for analyzing data from cardiac SPECT exams. Their efforts, currently in the first year of a five-year grant, have focused on different approaches to quantifying the data.

"The goal is to make (the analysis) more objective, more quantitative, and more accurate," he said. "We're looking at a whole variety of things, so we might combine them so as to make a better determination of what is normal and what is abnormal."

-By Greg Freiherr