What’s needed for best practices? Clean data collection, benchmarking
Experts offer advice on how to make it work
Initiating a benchmarking process could be the HIM department equivalent of having one’s teeth drilled. It’s not likely to be something coders and HIM managers eagerly anticipate.
However, given an organized and methodical approach to the process, it can be done well, and it can result in process improvements that are lasting and cost-effective.
"We have developed what we believe are the best ways to run an HIM department, given a set of circumstances, and we did that based on our cumulative years of experience," says Cindy Doyon, RHIA, product manager for data management solutions at QuadraMed Corp. in Covina, CA. QuadraMed is a software technology and HIM consulting company.
Doyon, along with Debi Primeau, RHIT, western region vice president of HIM Services Division of QuadraMed Corp. in Lomita, CA, spoke about developing benchmarking and best practices at the 73rd National Convention and Exhibit of the Chicago-based American Health Information Management Association (AHIMA), held Oct. 13-18, 2001, in Miami Beach, FL.
Doyon and Primeau suggest these guidelines for HIM departments to conduct benchmarking and develop best practices:
1. Define benchmarking and best practices.
People often confuse the two terms, but the simplest definition is that a best practice is an optimal or ideal way to perform a business process or procedure, and benchmarking is the search for industry best practices that lead to superior performance, Doyon says.
Developing best practices requires similar strategies to continuous quality improvement activities, Doyon says. (See chart, "Standard Types of Benchmarking," below.)
"You analyze your data and decide on changes," she adds. "But it’s never done and goes on continuously."
2. Collect information about benchmarking.
HIM department managers can learn more about benchmarking guidelines from organizations that have successfully incorporated these strategies into their business. One such web site is www.benchmarking.org. Also, the U.S. Department of Energy has a web site that includes information on benchmarking: www.em.doe.gov.
AHIMA is another source for information, as the organization has published a variety of benchmarking information and conducts a best-practices award program.
3. Learn the difference between qualitative and quantitative processes.
Qualitative information is the descriptive part of a process, Doyon says.
"Discover why numbers are different and analyze the steps that go into what you’re evaluating," Doyon says. "You have to do a qualitative analysis in order to do a best practice because that’s the only way to identify items, numbers, and steps you have to change in order to achieve your benchmark."
Quantitative, or metric, analysis, on the other hand, uses specific numbers or targets in the comparison.
"Once you’ve done the number part, then you can identify where you’ll have something different or some goal to strive for," Doyon explains. "Quantitative is the numbers with a comparative analysis."
Here’s an example of a situation that requires both quantitative and qualitative analyses: Department A has fewer complaints than departments B and C, Department B does more procedures than A and C, and Department C has fewer full-time equivalents (FTEs) than either A or B. The quantitative analysis determines which department is best on each of these three measures, while the qualitative analysis will need to answer these sorts of questions (a brief illustration follows the questions):
• Which department is best?
• Which results are the most important to what the organization wishes to achieve?
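As a rough illustration, here is a minimal sketch in Python of the quantitative side of that comparison. The figures are entirely hypothetical; the point is that each measure crowns a different department, which is exactly why the qualitative questions above still have to be answered.

    # Minimal sketch of the Department A/B/C example; all figures are hypothetical.
    departments = {
        "A": {"complaints": 4, "procedures": 900, "ftes": 12},
        "B": {"complaints": 9, "procedures": 1400, "ftes": 12},
        "C": {"complaints": 7, "procedures": 1100, "ftes": 9},
    }

    # Quantitative analysis: find the leader on each individual measure.
    fewest_complaints = min(departments, key=lambda d: departments[d]["complaints"])  # A
    most_procedures = max(departments, key=lambda d: departments[d]["procedures"])    # B
    fewest_ftes = min(departments, key=lambda d: departments[d]["ftes"])              # C

    print(fewest_complaints, most_procedures, fewest_ftes)
    # Each measure points to a different department, so the numbers alone cannot say
    # which department is "best" or which result matters most to the organization.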
4. Determine what will be benchmarked.
Managers should first define the process that will be benchmarked, selecting something that is simple to study, Doyon suggests. (See chart, "Benchmarking Components," below.)
"And it should probably have some high-level exposure within your organization so that you can get buy-off from the administration," she adds. "Then after doing that, plan and clearly define what you want to benchmark."
Primeau advises that these questions be asked when benchmarking:
- What functions or processes are being benchmarked?
- To whom or what will we compare?
- How will the data be collected?
5. Collect data.
There are several ways to collect data, and each has its pros and cons, Primeau says.
First, a manager may collect data through on-site observation, Primeau says.
When observing a department, it’s best to use a standardized tool that is applied uniformly across all organizations and sites being reviewed. In other words, the site observer asks the same questions of the same staff at the same time and then observes the processes without passing judgment, Primeau says.
"The observer needs to be trained, and we would suggest that there be some advance warning as a courtesy to the department and director," Primeau adds.
The second type of collection strategy is to use a survey tool and survey the department by telephone, Primeau says.
"On-site observation is the best way to go, but in terms of feasibility and time, the telephone interview is the most popular," Primeau notes.
A third strategy is to use a survey tool with a mail survey, which is slower and typically has a lower rate of return than telephone surveys.
Whether the survey is conducted by telephone or mail, generic tools are available for such use. However, it’s best for a benchmarking program to take a generic tool and tailor it to the specific reviews the organization requires, Primeau cautions.
"I would use some of the predefined tools as a guide — some of the ones out there on web sites and in AHIMA journals on benchmarking would work — but they should be customized to the individual survey," Primeau says.
6. Start analysis.
Once the data are collected, the hard work begins: analysis. (See chart, "Benchmarking Steps and Processes," below.)
First, list the elements that were included in the data and identify the ones that an organization wishes to strive to improve, Doyon says.
"Do a quantitative data analysis first," she says. "You may have data from 10 to 15 places, and these can be laid out on a grid with numbers."
After seeing the results on a chart, it’s time to select the best three or four items for a detailed qualitative analysis, Doyon says. Rank them from the top down in order of importance.
"You will start to identify which steps or processes are similar or different and which you may want to incorporate into your own organization," Doyon says.
Another strategy for selecting processes to change is to pay attention to the complaints a department has received from other departments or from within its own staff, Primeau suggests. "Also look at where you might have some backlog or process improvement opportunities."
When selecting the most important two to four processes to change, managers should choose processes that are amenable to change and that will have an impact on the organization.
"Maybe you can’t make one change because of X condition, but maybe you can do Y or Z and result in the same impact on the organization," Doyon says.
For example, suppose a department chooses to work on reducing accounts receivable days, which are the days outstanding for Medicare records not yet coded. "Then the data-gathering of what you’re going to measure is the average processing time for Medicare inpatient records, and we define the average processing time as the time elapsed from the patient discharge until the bill is dropped," Doyon says. "And so in the HIM department we would look at collecting the record, assembling the record, analyzing the record, coding the record, and the impact of the record if it’s incomplete and needs physicians to complete its deficiencies."
Some of the data collected in this scenario would be the number of beds in the hospital, the percentage of Medicare discharges, and the average charges per day per FTE, with FTEs defined as those most directly affecting the processing time of the record, Doyon says.
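To show how such a grid might be laid out, here is a minimal sketch in Python using entirely hypothetical records and site names. It applies the definition quoted above, average processing time as the days elapsed from patient discharge until the bill is dropped, and ranks the surveyed sites from fastest to slowest.

    # Minimal sketch of the quantitative grid; all records and site names are hypothetical.
    from datetime import date

    # (discharge date, bill-drop date) pairs for Medicare inpatient records at each site.
    sites = {
        "Hospital 1": [(date(2001, 9, 1), date(2001, 9, 12)), (date(2001, 9, 3), date(2001, 9, 18))],
        "Hospital 2": [(date(2001, 9, 2), date(2001, 9, 8)), (date(2001, 9, 5), date(2001, 9, 13))],
        "Hospital 3": [(date(2001, 9, 1), date(2001, 9, 22)), (date(2001, 9, 4), date(2001, 9, 19))],
    }

    def average_processing_days(records):
        # Average days from discharge until the bill is dropped, per the definition above.
        return sum((billed - discharged).days for discharged, billed in records) / len(records)

    # Lay the results out as a ranked grid, fastest site first.
    for days, site in sorted((average_processing_days(r), s) for s, r in sites.items()):
        print(f"{site}: {days:.1f} days from discharge to bill drop")

    # The fastest sites become the candidates for qualitative follow-up: how they collect,
    # assemble, analyze, and code the record, and how they chase physician deficiencies.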
7. Implement best practice changes.
After analyzing the information, the department might find that the organizations with the best outcomes in the benchmarking comparison above shared a variety of best-practice processes, including:
- adjusting work schedules in the record assembly area;
- providing a noise-free environment for coders;
- hiring dedicated clerical staff for transcription;
- using a universal chart order or assembly;
- cross-training staff;
- maintaining good communication among staff and sharing the department’s accounts receivable status.
What organizations need to do to improve their own best practices is to select one or more of these processes to be incorporated into their own practice and then see if this change leads to an improved outcome in accounts receivable collections, Doyon says.
"You should set a goal, so that if you’re currently at 15 days in accounts receivable, your goal would be to reduce it to eight days," Doyon says. "And then phase in your goal, because you don’t have to achieve it all at once or implement it all at once."
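As a simple way to picture that phased approach, the short sketch below (Python, with a hypothetical pace of one day per phase) turns the 15-day starting point and 8-day goal into intermediate milestones.

    # Minimal sketch of phasing in the accounts receivable goal; the pacing is hypothetical.
    current_ar_days = 15
    target_ar_days = 8
    days_per_phase = 1  # assumed pace: shave one day off per phase

    milestones = list(range(current_ar_days - days_per_phase, target_ar_days - 1, -days_per_phase))
    print(milestones)  # [14, 13, 12, 11, 10, 9, 8]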
Also, keep in mind that best practices require an ongoing commitment from an organization, Doyon adds.
8. Avoid common mistakes.
One of the biggest mistakes a manager makes is to have preconceived ideas about what will happen, Primeau says.
"Often what happens is that the people collecting the data know what they want to get out of it before they begin the process," Primeau says. "So they’re not doing it with a completely objective eye, and this could lead to a bad analysis."
To avoid this problem, a manager should make sure those involved in the benchmarking project are familiar with the processes and are trained to be aware of this potential problem during the process of surveying and collecting data, Primeau suggests.
Another big mistake that’s often made is that the manager and department fail to clearly define the data that will be collected, Doyon says.
"If you don’t define clearly enough, you will get misinformation from other people and it will skew your results and drive you to an incorrect conclusion," Doyon explains. "If you take more time in the beginning of the process to specifically define the data that’s to be collected, then you will reap huge rewards."
Along with that issue, a department sometimes provides the surveyor with inaccurate information, Primeau says. This may be accidental or intentional.
"The institutions being surveyed want to appear better than they are, so they could be skewing their results," Doyon says.
An on-site survey has a better chance of preventing this type of problem because it’s more difficult to skew data when someone is observing what is being done, she adds.