Take your patient surveys to a higher level of accuracy
Avoid the pitfalls of too much and inappropriate data
(Editor's note: In this issue, we show you how to create and administer an effective patient satisfaction survey and conduct a focus group. Next month, we'll address how to interpret and act on survey information.)
Agree and disagree questions are ideal for patient satisfaction surveys. Yes or no? If this is the first time you have thought about how your satisfaction surveys are developed and implemented, then maybe the information they generate is not as helpful as it could be.
Many hospital administrators, in their enthusiasm to gain patient feedback, make mistakes such as collecting too much data or inappropriate data. To avoid similar problems at your hospital, survey development experts suggest following two important rules -- decide in advance what the data will be used for, and identify the population you want to survey.
Here is their step-by-step advice for a successful survey project:
1. First, decide how you will use the data.
The key to a successful patient satisfaction survey outcome is to know in advance what information you're trying to extract from the data, says Nancy Mihevc, PhD, director of measurements of system performance for the Lahey Hitchcock Clinic in Lebanon, NH, and Boston. "The problem is, people don't think about that ahead of time," she says. "They collect all their data and say, 'What do I do with it now?'" As a result, important questions are left out, and questions are asked that are not important, Mihevc says.
Trying to make something out of data gathered from ill-conceived surveys is a waste of money and time because it rarely yields useful results, says Allyson Ross Davies, PhD, MPH, a private health consultant in Newton, MA, who develops patient satisfaction surveys for health care organizations and hospitals. Work backward from purpose toward survey methodology, Davies advises.
For example, will the results be used to identify opportunities for improvements in your hospital's clinical areas? Or will the results be used to help acquire contracts with managed care organizations? "Those are quite different uses and imply at a minimum that the content of the surveys will be different," Davies says.
2. Identify the population you want to survey.
It's also important to know the population you want to study and to obtain a good sampling of that population, Mihevc says. For example, if the hospital wants to survey patients who have seen physicians in a practice over the past week, month, or three months, the practice's information services or scheduling office could generate a list of people who have seen those physicians. Every 10th name on the list could then be selected for the survey, Mihevc says.
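For practices that want to automate that kind of selection, here is a minimal sketch in Python of the systematic sampling Mihevc describes. The patient list, names, and random starting offset are illustrative placeholders, not an actual scheduling-office export.

```python
import random

def systematic_sample(patients, interval=10, seed=None):
    """Return every `interval`-th patient, starting from a random offset."""
    rng = random.Random(seed)
    start = rng.randrange(interval)   # random start so the same slot isn't always surveyed
    return patients[start::interval]

# Hypothetical list standing in for a scheduling-office export of recent visits.
patient_list = [f"patient_{i:04d}" for i in range(1, 501)]
sample = systematic_sample(patient_list, interval=10, seed=1)
print(len(sample), "patients selected for the survey mailing")
```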
Simple approach worked
A simple, straightforward approach to developing a survey has worked well for administrators at the 106-bed Herrick Memorial Hospital in Tecumseh, MI, says Cindy Burns, RN, director of quality at the hospital. Burns assembled a 10-person multidisciplinary team that included physicians, nurses, radiology and laboratory staff, ambulatory and social services, the business office, and registration. To develop its survey, the team simply conducted a literature search, chose a model survey, and picked the questions it wanted answered, she says.
"We made sure to ask a lot of emergency department questions on timeliness, service, and so forth because we perceived we weren't timely [in ED]," she says. The team kept questions very general in the clinical areas, Burns adds.
Forming an informal focus group or holding in-depth interviews with a few patients are good methods for targeting patient populations and concerns to develop survey questions, Mihevc says. (See related story on conducting a focus group, p. 107.) If you do hold a focus group, invite administrators who will ultimately use the survey data. While this step seems time-consuming, the survey results will more likely be accepted because the end users will have had input from the beginning, she says.
3. Develop your survey questions.
A survey should ask questions that are carefully crafted and specific, Mihevc says. Here are three types of questions you can use:
-- Rating questions are among the most popular question type, Mihevc says. A rating question asks the patient how satisfied he or she was with a particular service, or to rate a service on a scale from poor to excellent, she says.
-- Report questions ask the patient to report on whether or not an event occurred, Mihevc says. For example, hospital administrators may want to know if patients are waiting more than 15 minutes in the waiting room. The report question might ask whether the patient waited longer than that and, if so, whether staff explained why, she says.
-- Agree/disagree questions are popular on satisfaction surveys but can result in unreliable data. This type of question mentally sets up the patient to respond to a statement that already carries a negative or positive implication, Mihevc cautions. For example, the statement "I had to wait too long in the waiting room -- strongly agree, agree, disagree," predisposes the patient to think he or she waited too long, she says.
If you are concerned about patient access or satisfaction with a physician practice, for instance, you might ask patients:
-- how long they waited from the time they booked an appointment until they saw the physician;
-- how well the physician communicated with the patient;
-- how the patient perceived the physician's technical competence.
In an inpatient setting, ask the patient how well care was coordinated among staff and how responsive nurses were to questions and call bells.
The survey should also cover a comprehensive range of services and experiences and provide a balance of global and detailed data, Davies says. Global data cover broad issues, such as how well patients liked their hospital experience overall. Detailed data target a piece of a process found only at your particular hospital, such as your hospital's preregistration process, Davies says.
Use expert help to create a survey
It is best to have an expert craft the survey, Davies and Mihevc agree. An internal expert or an independent consultant can perform this task. You may also use computer software to create the survey, Mihevc says. If a vendor or internal source is chosen, make sure the person or company has a good background in theory and empirical work in patient satisfaction survey development, says Davies. "Get somebody who knows about measurement science and survey research design involved right at the beginning," she says. "If a hospital doesn't have a good person on staff, look to either consultants or vendors."
Any survey should measure the information that is needed, and the results should be valid and reliable, Mihevc says. Vendors should perform regular validation checks that prove the information they collect is reliable and that ensure their surveys provide the promised levels of accuracy, Davies notes.
Reliability and validity are two different concepts, she adds. Reliability refers to the consistency of the score produced by a specific item or set of items: if you gave the same person the same survey several times and nothing about that person's care changed, you would get the same answers. A good vendor will have estimated the reliability of the scores produced by each item and can tell you what the reliability will be within your particular survey population, she says.
Validity tells you whether your scores will be valid for the intended use of the data. Ask the vendor for evidence that the items on the survey reflect what you're trying to measure, Davies advises. Interpreting validity is extremely complex, she says. It's best to ask someone trained in measurement and application to help you identify whether a vendor's survey has been properly validated, she says.
Also ask the vendor for a response rate. Some vendors will claim a 30% response rate is good enough, but you should expect no less than 50%, Mihevc and Davies agree.
Once the survey is developed, a pilot test should be conducted to be sure patients understand the questions and the type of information needed is being compiled, Mihevc and Davies say. A group of 10 to 15 patients should take the pilot survey. Afterward, the surveyor should talk to the participants to see if any questions were confusing, Mihevc says. "Even after doing this for 20 years and thinking I've looked at all the possible angles, when I pilot test I always change things," Mihevc says.
At Herrick Memorial, Burns and other administrators gather information about patient concerns through annual community meetings. Hospital administrators meet with a group, either from businesses, schools or churches to discuss their needs, expectations and the hospital's strengths and weaknesses, Burns says.
4. Administer the survey.
Here are three popular methods for administering a survey, Mihevc notes:
-- hand-out/hand-back -- The patient receives the survey prior to the visit and hands it back immediately after;
-- hand-out/mail-back -- The patient mails the survey back after discharge;
-- telephone surveys.
Before sending out a second survey to non-respondents, the surveyor should wait at least two weeks for mail-backs. When mailing a survey, also send a letter signed by the medical director or hospital administrator assuring recipients of confidentiality, Davies says.
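For surveyors who track mailings electronically, here is a minimal sketch in Python of flagging non-respondents for a second mailing once the two-week wait described above has passed. The record layout, patient IDs, and dates are hypothetical.

```python
from datetime import date, timedelta

FOLLOW_UP_WAIT = timedelta(weeks=2)

# Hypothetical tracking records: when each survey was mailed and, if returned, when.
mailings = [
    {"patient_id": "P001", "mailed": date(2024, 3, 1), "returned": date(2024, 3, 8)},
    {"patient_id": "P002", "mailed": date(2024, 3, 1), "returned": None},
    {"patient_id": "P003", "mailed": date(2024, 3, 20), "returned": None},
]

def needs_second_mailing(record, today):
    """True if the survey is unreturned and at least two weeks have passed."""
    return record["returned"] is None and today - record["mailed"] >= FOLLOW_UP_WAIT

today = date(2024, 3, 22)
second_wave = [r["patient_id"] for r in mailings if needs_second_mailing(r, today)]
print("Send second survey to:", second_wave)   # -> ['P002']
```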
Telephone surveys are difficult and expensive to administer -- $10 to $20 per completed survey, Mihevc says. Yet the data are available immediately, while data from mail surveys may take up to nine weeks to gather because of the time it takes for the surveys to come back.
Never attempt an in-house phone survey, Mihevc warns. It is too difficult for people working at an institution to remain objective when faced with negative feedback, and patients aren't comfortable expressing negative opinions when they're talking to someone who works at the institution, she explains.
Software that creates and mails out patient satisfaction surveys is another option, Mihevc says. At Lahey Hitchcock, staff use Viewpoint by National Computer Systems in Edina, MN, to send out about 700 surveys each week, Mihevc says. (See source listing for Viewpoint at the end of the article.) The software lets you create your own survey, Mihevc says. "You can input patient data on whom to send the survey to," she says. "When the survey comes back, you use the scanner to scan the survey directly into the computer, so there's no data entry. Viewpoint will even perform a second mailing by printing more surveys for all people who have not returned them." Viewpoint costs about $3 per returned survey, Mihevc says.
Response rates vary, but should be at least 50%, Mihevc notes. Lahey Hitchcock's response rate is around 60%, she says. The response rate at Herrick Memorial used to be about 30% but has climbed to 41%, Burns says. "I think we're getting a reputation that we care about what our customers tell us," she says.
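To see how that benchmark works out in practice, here is a minimal sketch in Python of the response-rate check both experts describe: surveys returned divided by surveys mailed, compared against the 50% floor. The counts used below are hypothetical, not Lahey Hitchcock's or Herrick Memorial's actual figures.

```python
def response_rate(sent, returned):
    """Response rate as a percentage of surveys mailed."""
    return 100.0 * returned / sent

MINIMUM_ACCEPTABLE = 50.0   # Mihevc and Davies: expect no less than 50%

sent, returned = 700, 420   # hypothetical weekly mailing and returns
rate = response_rate(sent, returned)
print(f"Response rate: {rate:.0f}%")
if rate < MINIMUM_ACCEPTABLE:
    print("Below the 50% floor -- consider a second mailing or follow-up calls.")
```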
[Editor's note: For information on Viewpoint, call National Computer Systems at (612) 829-3000.] *