CO (Carbon monOxide) is a significant cause of poisoning in the US, causing hundreds of fatalities each year. The RAD-57 non-invasive CO monitor is a device that is supposed to make identification of these patients quick and accurate in the out-of-hospital setting.
There has been one study of the RAD-57 on actual patients being evaluated for CO toxicity. In that study, the sensitivity was horrible: only 48%. I could do as well flipping a coin. So could you.
The low sensitivity has been the focus of the criticism. On the other hand, the 99% specificity has been seen as a confirmation of what was already known.
Is the high specificity real?
There is a study coming out that suggests that rather than 15%, we should use 6.6% as the cutoff to provide good sensitivity. What happens to this study's calculation of 99% specificity (only about one false positive for every 100 patients screened) when the cutoff is dropped to 7% (the RAD-57 displays only whole numbers, so 6.6% becomes 7%)?
Using the 15% cutoff, the 99% specificity means that when the laboratory carboxyhemoglobin is not over 15%, the RAD-57 correctly reads not over 15% 99% of the time.
Only one false positive out of 120 patients.
What happens when we change the cutoff to 7%?
Not so good on the specificity. There appear to be 14 false positives out of 120 screened patients.
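The arithmetic behind those two specificity figures can be sketched out. This is a rough back-of-the-envelope calculation, not the study's actual analysis: it treats nearly all 120 screened patients as true negatives (a simplification) and plugs in the false-positive counts discussed above (1 at the 15% cutoff, an apparent 14 at a 7% cutoff).

```python
# Back-of-the-envelope specificity arithmetic (a simplification:
# it assumes almost all 120 screened patients are true negatives).

def specificity(false_positives: int, true_negatives: int) -> float:
    """Specificity = TN / (TN + FP)."""
    return true_negatives / (true_negatives + false_positives)

n = 120  # patients screened in the study

# 15% cutoff: 1 false positive out of 120 screened
print(round(specificity(1, n - 1), 3))    # about 99%

# 7% cutoff: an apparent 14 false positives out of 120 screened
print(round(specificity(14, n - 14), 3))  # about 88%
```

Dropping the cutoff from 15% to 7% takes the specificity from roughly 99% down to roughly 88%, which is the trade-off for the improved sensitivity.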
What will happen in the real world with these results?
With time, we will probably start to ignore the results that do not tell us what we want to see.
We will have spent $4,000 per machine to have a piece of equipment that we ignore when we do not like the results.
How does that provide any benefit for anyone with CO toxicity?
Touger M, Birnbaum A, Wang J, Chou K, Pearson D, Bijur P. Performance of the RAD-57 pulse CO-oximeter compared with standard laboratory carboxyhemoglobin measurement. Ann Emerg Med. 2010 Oct;56(4):382-8. Epub 2010 Jun 3. PMID: 20605259.