How accurate is the data you send to JCAHO?
Coding entry errors can affect quality outcomes
"Coding systems are so complicated [that] data errors are rampant in hospitals across the country," says Justin Doheny, executive vice president and chief operating officer at St. Peter’s University Hospital in New Brunswick, NJ. "Then those errors get multiplied throughout the system as people put them to use."
So when the Joint Commission on Accreditation of Healthcare Organizations in Oakbrook Terrace, IL, crunches the numbers it gets from accredited systems across the country, the accuracy of the outcomes it detects turns on whether data entry personnel are entering numbers accurately. Also dependent on this wobbly system are Medicare’s MedPAR file, state health departments’ analyses of data, and all the other outcome reports originating from that coding. The data are also used by the Chicago-based American Hospital Association in trending diseases and as the basis for significant political decisions on local and national levels.
Doheny says St. Peter’s selected only so-called "bright-line" indicators for its ORYX requirement — ones that are not subject to misinterpretation. "For example," he says, "we chose readmissions within 28 days of delivery, AMI after surgery — significant events where the likelihood of coding error is minimal. But soon we’re going to run out of ORYX bright-line indicators."
However, even bright-line indicators like readmission rates aren’t foolproof, cautions Nancy Y. Carter, RN, MBA, director of clinical resource management at Emory Hospitals in Atlanta. "There is a good bit of error potential in readmissions based on how a vendor chooses to calculate," she says. "[Some institutions] identify readmissions as patients with the same birth date and zip code, but many patients can have the same zip codes and birth dates, particularly twins." Carter notes that the readmissions indicator may not adequately reflect the fact that patients don’t always return to the same hospital. "While an institution may look very good on an indicator of this type, it may actually be losing patients to other hospitals, where the actual return or readmission occurs," she says.
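A short sketch makes Carter’s point concrete. The records, field names, and matching rule below are hypothetical, but they show how a key built only on birth date and zip code collides on twins:

```python
# A minimal sketch (hypothetical records and field names) of the weak
# matching rule Carter describes: treating two encounters as the same
# patient whenever birth date and zip code match.

encounters = [
    {"name": "Baby A Smith", "dob": "1998-03-14", "zip": "30322"},  # twin 1
    {"name": "Baby B Smith", "dob": "1998-03-14", "zip": "30322"},  # twin 2
]

def naive_same_patient(a, b):
    # Matches on (dob, zip) only, so twins at the same address collide.
    return (a["dob"], a["zip"]) == (b["dob"], b["zip"])

# The twins are flagged as one patient, so the second twin's delivery
# would be counted as a "readmission" of the first.
print(naive_same_patient(encounters[0], encounters[1]))  # True (false positive)
```

Any vendor logic this coarse will inflate readmission counts wherever unrelated patients happen to share those two fields.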
Even simple clerical errors can skew readmissions data. "A patient’s name can be spelled or entered in different ways on multiple visits so that the patient isn’t recognized as a readmission," Carter says. "There are billing systems that allow free text to be entered so that even states are misspelled. Some billing systems are so cumbersome that multiple bills are generated to correct an original bill — which may dramatically skew the number of encounters showing up on the indicators."
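The spelling problem is the mirror image: instead of false matches, free-text entry produces missed ones. A minimal sketch, again with invented names, shows why fields have to be normalized before encounters are compared:

```python
import re

# A minimal sketch (hypothetical names) of the free-text pitfall Carter
# describes: the same patient entered two ways is not recognized as a
# readmission when raw strings are compared.

def normalize(name: str) -> str:
    # Lowercase and strip punctuation and extra spaces; a real record-linkage
    # pipeline would add phonetic codes, edit distance, and MRN checks.
    return " ".join(re.sub(r"[^a-z ]", "", name.lower()).split())

visit_1 = "O'Brien, Mary"
visit_2 = "OBrien Mary "

print(visit_1 == visit_2)                        # False: counted as two patients
print(normalize(visit_1) == normalize(visit_2))  # True: recognized as one
```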
And what’s going on in individual organizations is just the tip of the iceberg, Doheny says. The problem pervades the whole health care system, including Medicare. A number of years ago, when Medicare was publishing hospital-by-hospital mortality results, there were nationwide misinterpretations of how particular events were to be coded. One problem involved patients admitted from nursing homes through hospitals’ emergency departments (EDs). No code existed then for a nursing home patient admitted to a hospital through the ED, so coding personnel had to choose one source or the other. The system needed a third option, but no one realized that until after the fact; when the data were originally coded, no one foresaw their ultimate purpose.
Adequate attention was not paid to those discrepancies at the time of the coding, but they resulted in different weights being applied, depending on which way the admission was coded. The forecast mortality was different from that actually experienced by the hospital.
"The problem was, Medicare didn’t look at the data until years after it was coded," says Doheny, "and it made a difference in the way Medicare ran its projected methodology." The population in one statistical model received a much higher probability of death than in the other.
HPR asked Doheny whether electronic entry by physicians, case by case, might be the solution. "The computerized medical record would not fix the kind of things we’re talking about," he says. The problem is too pervasive, and the cost of computerizing all the records in all the hospitals across the country is not realistic. Most hospitals are moving in that direction, but it will take years before all of them are computerized.
"I’m not sure what the solution is for this data confusion," he admits. "People have to recognize that just because numbers are spit out of a computer doesn’t mean they are right." The old phrase "garbage in-garbage out" still applies here, he says. "You have a very thick medical record — typically two to six inches of paper — and the people collecting data from it typically have no more than a high school degree plus some subsequent training." Accredited record technicians or registered record administrators are supervising the coding process, but the actual data going into the system are recorded by much lower-level personnel. Some are on the medical records staff, while some are temps from outside firms.
"People sometimes make themselves feel better about this situation by saying that inaccuracy exists across the data and therefore it all evens out in the end," says Doheny. "That assumption is not valid, because when you get down to looking at individual hospitals or individual states, you may find local practices that run contrary to people’s interpretation of what the data is supposed to mean."