Keep patient education from being the red-headed stepchild
Do your efforts measure effective education, or are your data meaningless?
Most patient education managers are aware of the problem: Patient education is undervalued by administrators. As a result, staff in patient education departments find themselves holding their breath during budget-crunching, hoping that patient education won’t be the target of staff and budget cuts.
Administrators often want proof that patient education makes a difference. For example, did the teaching improve the bottom line by reducing length of hospital stay, readmissions, or visits to the emergency department? Did teaching create behavior changes, such as reduced sodium intake among congestive heart failure (CHF) patients, that improved their ability to manage the disease? Did teaching improve self-efficacy and provide the confidence needed to perform a task such as monitoring blood glucose levels? These data are hard to come by, but possible to find.
Measuring the effectiveness of patient education programs, teaching protocols, and materials is difficult, says Fran London, MS, RN, a health education specialist at Phoenix Children’s Hospital. It’s easiest to measure outcomes of programs when they’re applied to patients who all have the same diagnosis. The outcomes of patients in the program can be compared with those from a similar patient group not enrolled in the program.
For example, at the University of Wisconsin Hospital and Clinics in Madison, learning center staff evaluated orthopedic patients who received presurgery education at the center vs. those taught in the clinics. To gather the information, staff asked patients to complete a survey tool before teaching and again prior to discharge, and also conducted chart reviews. Several weeks after hospitalization, study participants were interviewed by telephone. The study showed higher levels of empowerment and self-efficacy among the patients taught in the learning center, says Zeena Engelke, RN, MS, senior clinical nurse specialist at the facility. (For more information on the learning center, see Patient Education Management, September 1999, pp. 105-107.)
Most teaching is done individually in the office, clinic, or hospital within the context of an interaction, not in groups or programs, says London. While it is easy to measure short-term effectiveness of one-on-one teaching by asking questions or having the patient demonstrate a skill, it’s difficult to know if the change has a lasting impact. In this situation, it is impossible to create a control group because it would be unethical to choose not to teach a patient. Therefore, it is hard to determine — let alone demonstrate to others — the long-term impact of informal teaching, says London.
Telephone follow-up surveys also help gather data. Staff at St. Joseph’s Hospital of Atlanta, for example, call patients to determine whether a teaching protocol for diabetics following open-heart surgery is effective. To help these patients learn to manage their diabetes after discharge, staff educators make a follow-up call one to two months later to assess the patient’s retention of the teaching and behavior change.
During the patient’s hospitalization, a nurse makes sure the patient has a blood glucose meter and knows how to use it. Patients also are taught how often to monitor blood glucose levels, when to call their physician, the signs and symptoms of hypoglycemia and its treatment, and that diabetes is a major cause of coronary artery disease. Post-heart surgery patients with diabetes were chosen because it is important for them to monitor their blood sugar closely and take measures to control it after surgery, which helps prevent post-op complications, such as infection, that could lead to readmission.
"This project will help us evaluate the effectiveness of our teaching and assist us in making any changes in this program," says Jodi Langford, RN, BSN, patient education coordinator at the health care facility.
The follow-up questions include:
• Do you remember being seen by an educator?
• Do you know that diabetes is a major cause of coronary heart disease?
• How often do you monitor your blood glucose level?
• Do you ever have high or low blood glucose readings?
• What do you do when the reading is high or low?
Follow-up during the April, May, and June 1999 quarter showed that 30 of 31 patients remembered being seen by the nurse educator, despite post-op fatigue or the effects of medication, which can hamper teaching. The results show great retention and application of knowledge, says Langford.
A short patient and family education survey delivered twice a year at Children’s Healthcare of Atlanta provides a peek into the effectiveness of education and the patient’s satisfaction with it, says Kathy Ordelt, RN, CRRN, CPN, patient and family education coordinator at the facility.
Patients and family members are asked to rate patient education on a scale of one to five, from very poor to very good; the survey also includes a couple of yes-or-no questions. Areas respondents were asked to rate on the May 1999 survey include:
• Information about your child’s care explained in a way you could understand.
• Degree to which your child was included in the teaching (if over three years old and able to understand).
• Ease of asking questions or expressing concerns about your child’s care.
• Staff attention to your child’s special needs.
• Your understanding of the teaching materials provided.
The patient education department aims for an overall rating of 4.5 on most questions, or a 90% positive response on the yes-or-no questions, but the targets are set individually for each survey. "I have never found a way to come up with hard-core statistics to go to our CEO and top leaders and say, ‘look what patient education does.’ I collect soft data," says Ordelt.
The satisfaction surveys do show how satisfied patients and families feel in managing their health care when they are discharged, and that is important to consumers today, she says.
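For managers who want to tally such survey results against their targets, the arithmetic is simple. Below is a minimal sketch in Python of how an average rating and a positive-response rate might be checked against the 4.5 and 90% targets described above; the responses and cutoff values in the sketch are hypothetical, not actual Children’s Healthcare of Atlanta data.

```python
# Minimal sketch: checking patient education survey results against targets.
# The responses below are hypothetical examples, not actual survey data.

ratings = [5, 4, 5, 4, 3, 5, 4, 5]           # 1-5 scale answers to one rated item
yes_no = ["yes", "yes", "no", "yes", "yes"]  # answers to one yes-or-no item

RATING_TARGET = 4.5     # target average for rated items
POSITIVE_TARGET = 0.90  # target share of "yes" answers for yes-or-no items

avg_rating = sum(ratings) / len(ratings)
positive_share = yes_no.count("yes") / len(yes_no)

print(f"Average rating: {avg_rating:.2f} (target {RATING_TARGET}) -> "
      f"{'met' if avg_rating >= RATING_TARGET else 'not met'}")
print(f"Positive responses: {positive_share:.0%} (target {POSITIVE_TARGET:.0%}) -> "
      f"{'met' if positive_share >= POSITIVE_TARGET else 'not met'}")
```

In practice, the same calculation would be repeated for each survey item and compared with the target set for that round of the survey.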
Define what effective education is
Many patient education managers have trouble proving the effectiveness of patient education because they are not clear on what would constitute effective teaching, says Kate Lorig, RN, DrPH, director of the Stanford Patient Education Research Center at Stanford University School of Medicine in Palo Alto, CA.
The first step in the evaluation process is to articulate what you want your teaching to accomplish. Define what effective education is or what you want to know, she says. For example, ask yourself the following questions: Do you want the outcome to change health behavior, health status, health care utilization, or satisfaction with the system?
To determine a good health outcome, think about why you have created the teaching protocol, program, or educational materials. Ask who cares. "If you can’t answer the ‘who cares’ question in about 15 seconds, it is not the right outcome," says Lorig.
Evaluations must be tailored to your audience. If it is administrators who care, then the outcome you are trying to achieve might be fewer visits to the emergency department for asthma patients. To show whether a program is cost-effective, measure utilization the year before the program is launched and one year after the program is up and running to determine if there is a difference, says Lorig. The utilization may not be emergency department visits, but rather frequent physician visits or hospital admissions. "In our chronic disease self-management program, we showed that people who attended the program had eight-tenths of a day less hospitalization over six months than people who did not attend the program. That is a big cost savings," she explains.
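The before-and-after comparison Lorig describes comes down to simple arithmetic: count utilization for the year before launch, count it again for the first year of the program, and attach a cost to the difference. The sketch below illustrates the idea in Python; the visit counts, participant numbers, and cost per hospital day are hypothetical placeholders, and the only figure taken from the article is the eight-tenths of a day reduction Lorig cites.

```python
# Minimal sketch of a pre/post utilization comparison for a patient education
# program. All numbers here are hypothetical placeholders.

patients_in_program = 100
ed_visits_year_before = 220   # emergency department visits, year before launch
ed_visits_year_after = 165    # emergency department visits, first year of program

visit_reduction = ed_visits_year_before - ed_visits_year_after
print(f"ED visits avoided: {visit_reduction}")

# Lorig's example: 0.8 fewer hospital days per participant over six months.
fewer_days_per_patient = 0.8
cost_per_hospital_day = 1500  # hypothetical average cost in dollars

savings = patients_in_program * fewer_days_per_patient * cost_per_hospital_day
print(f"Estimated savings over six months: ${savings:,.0f}")
```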
Once you determine what you are trying to achieve and who cares, you must decide how to measure the outcome. To find the best way to measure patient education, go to the literature and see what instruments other people have used; measurement tools are always referenced in an article, says Lorig. "I would advise patient educators never to write their own instruments, because writing an instrument is a very difficult task," she says. (For information on a resource that contains several evaluation instruments, see the source box at left.)
Perhaps the best aspect of evaluation is that it doesn’t have to be complicated. Lorig quickly summarizes the elements of evaluation:
• what you want to know;
• why you want to know it;
• straightforward and simple measures.