Do Statistics Ever Lie?
Abstract & Commentary
By Frank W. Ling, MD, Clinical Professor, Department of Obstetrics and Gynecology, Vanderbilt University School of Medicine, Nashville, Associate Editor for OB/GYN Clinical Alert.
Dr. Ling reports no financial relationships relevant to this field of study.
Synopsis: A reassessment of the results of a randomized controlled trial of laparoscopic adhesiolysis leads to the conclusion that, contrary to the original authors' recommendation, the procedure should not be abandoned.
Source: Roman H, et al. Why laparoscopic adhesiolysis should not be the victim of a single randomized clinical trial. Am J Obstet Gynecol 2009;200:136.e1-136.e4.
I hate statistics. I don't understand statistics. Statistics are so confusing. How many times have these statements (or something like them) crossed your lips? I know that I certainly feel that way sometimes. This article can potentially lead to more confusion, or it can prove to be a breakthrough for you and the application of research articles to your practice. I hope it's the latter.
The authors of this article challenge the conclusions of a 2003 article by Swank et al, which compared laparoscopic lysis of adhesions with diagnostic laparoscopy in the treatment of chronic abdominal pain. In a population of 100 subjects randomized to 1 procedure or the other, 57% of those who underwent laparoscopic adhesiolysis reported pain reduction, compared with 42% in the diagnostic laparoscopy group. The 15-percentage-point difference was not statistically significant, and on that basis the paper concluded that the 2 procedures were equivalent and that laparoscopic adhesiolysis should therefore be abandoned as a treatment for chronic pain related to adhesions. Failing to demonstrate a difference, however, is not the same as demonstrating equivalence.
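To make that statistical point concrete, consider a quick back-of-the-envelope calculation. The short Python sketch below assumes 2 equal arms of 50 patients each (the article reports only the 100 total randomized) and uses the standard normal approximation for a difference in proportions; it is purely illustrative, not a reanalysis of the trial's data.

import math

# Illustrative only: assumes ~50 patients per arm, which the
# article does not state exactly (it reports 100 total randomized).
p_adhesiolysis = 0.57   # reported pain reduction, adhesiolysis arm
p_diagnostic = 0.42     # reported pain reduction, diagnostic arm
n1 = n2 = 50

diff = p_adhesiolysis - p_diagnostic  # 0.15

# Standard error of the difference in proportions (normal approximation)
se = math.sqrt(p_adhesiolysis * (1 - p_adhesiolysis) / n1
               + p_diagnostic * (1 - p_diagnostic) / n2)

# 95% confidence interval for the difference
ci_low, ci_high = diff - 1.96 * se, diff + 1.96 * se
print(f"15-point difference, 95% CI: ({ci_low:.2f}, {ci_high:.2f})")
# Roughly (-0.04, 0.34): the data are compatible with anything from
# a trivial harm to a large benefit. "Not significant" here means
# "inconclusive," not "the procedures are equivalent."

Under these assumed numbers, the interval runs from a small possible harm to a large possible benefit, which is exactly why "not statistically significant" should be read as "inconclusive" rather than "equivalent."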
Without going into all the details, here are some of the issues these authors raise, all questions any one of us could raise if we read the literature critically with some degree of statistical comfort. Because the original analysis did not take certain assumptions into account, the original authors chose to conclude that there was no difference between the 2 procedures rather than state that they could not reach a conclusion about the relationship between them; this overstated what the data really said. The original authors also chose not to exclude patients undergoing adhesiolysis who had major complications or incomplete adhesiolysis. Because these patients make adhesiolysis appear less effective, their inclusion has a critical effect on the analysis, yet the discussion section did not fully address what the results would have been had they been excluded. Finally, the original analysis did not address the fact that the data set showed less benefit than previously published reports, which raises the obvious questions of whether this was a unique study population and why the results differed so much from earlier data.
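The "could not reach a conclusion" criticism can also be illustrated with a rough power calculation. The sketch below, again assuming equal arms and a two-sided alpha of 0.05, estimates how many patients per arm would be needed to detect a 57% vs. 42% difference with 80% power; these design parameters are my own illustrative assumptions, not figures taken from the trial's protocol.

import math

# Hypothetical sample-size calculation (not from the trial's protocol):
# patients per arm needed to detect 57% vs. 42% with 80% power,
# two-sided alpha = 0.05, using the standard normal approximation.
p1, p2 = 0.57, 0.42
z_alpha = 1.96   # two-sided 5% significance level
z_beta = 0.84    # 80% power

variance = p1 * (1 - p1) + p2 * (1 - p2)
n_per_arm = (z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2
print(f"Patients needed per arm: {math.ceil(n_per_arm)}")
# About 170 per arm -- far more than ~50, so a real 15-point
# difference could easily fail to reach significance in a
# 100-patient trial.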
The authors also pointed out that the study did not factor in the potential use of adhesion barriers, which can limit adhesion reformation. That is not necessarily a flaw in the study design, but it does limit the applicability of the results, since many cases in our daily practice involve barriers, potentially increasing the benefit of adhesiolysis.
Commentary
What does this mean to your practice? It means: Don't believe everything that you hear, or at least understand the data as well as the analysis. The authors of the original paper overstated what their data said, yet practitioners used the results to make clinical decisions. We should all be reading the literature, but we should also be good consumers of it. Think of it a bit like a television commercial: just because you see the product advertised doesn't mean you run out and buy it at the store, but the more you hear about it, the more likely you are to buy it. Then, if and when you do buy the product, you'll run your own test to decide whether it's a good product and whether you'll buy it again. So Roman and co-authors have done us all a favor by cautioning us not to abandon adhesiolysis on the strength of this one study, even though it was randomized. Evidence-based medicine is and will continue to be important to our daily practice, but this is an example of why it does not yet hold all the answers.
So in response to the question in the title, "Do statistics ever lie?" the answer would be that they don't lie by themselves. They need help.
In our never-ending effort to provide a service to our readers, allow me to provide a quick reference glossary that may help in your future reading of the literature (see sidebar).