DURHAM, N.C. -- Agencies that rank doctors and hospitals need to make sure they are comparing apples to apples, or rankings can become skewed and unfairly penalize high-quality medical professionals, according to cardiologists James Jollis of Duke University Medical Center and Patrick Romano of the University of California, Davis.
In a critical analysis published in the April 2 New England Journal of Medicine, the physicians warn that the methods of data collection are not accurate enough to make results trustworthy. But, they add, if simple corrections are made, such scorecards can be a good way to ensure quality care.
"We are questioning whether these types of scorecards are really accurate enough to make available to consumers," Jollis said in an interview. "Hospital "scorecards" are here to stay, so as physicians we have the responsibility to be sure the methods used to generate rankings are as accurate as possible."
The researchers analyzed the Pennsylvania Health Care Cost Containment Council's 1996 report on how well doctors and hospitals in the state fared in caring for heart attack patients. They found that, while the method used was generally sound, it had several flaws that could have skewed the results. They chose Pennsylvania because it was the first state to implement a government-sponsored statewide ranking.
"The Pennsylvania report is a good intermediate step, but we need better systems for reporting outcomes before these type of rankings will be realistic," Jollis said.
The researchers say that before such rating systems are adopted and their results made public, they should be subjected to peer review to ensure the most accurate information reaches the public.
In their analysis, Jollis and Romano argue that using information from hospital bills and insurance claims is not an accurate way to gather information about patient outcomes. They say such data, while more standardized ...
'"/>
Contact: Karyn Hede George
Georg016@mc.duke.edu
919-684-4148
Duke University Medical Center
1-Apr-1998