Evidence-Based Medicine: A Necessary Change, or Does the Emperor Have No Clothes?

In a recent New York Times article, author Harriet Brown reviews a subject that was hotly debated at the London International Conference for Eating Disorders by Glenn Waller, Roz Shafran and Howard Steiger, among others: so-called “evidence-based” interventions in eating disorders.

Cognitive behavioral therapy (CBT), dialectical behavior therapy (DBT) and family-based treatment (FBT), for example, are purportedly evidence-based interventions in the field of eating disorders. Waller, Shafran, Steiger and colleagues discussed why they believe evidence-based interventions are not more widely used, what most clinicians say they are using instead (i.e., “eclectic therapies”), and what can be done about it.

In the NYT article, Ms. Brown reports, as Waller et al. also did, that evidence-based interventions are rarely used effectively by therapists, despite the fact that “treatments like cognitive-behavioral therapy, dialectical behavior therapy and family-based treatment have been shown effective for ailments ranging from anxiety and depression to post-traumatic stress disorder and eating disorders.” She cites articles showing just how low the rates of use of these modalities actually are. So far, so good.

“Many psychologists,” Dianne Chambless, a professor of psychology at the University of Pennsylvania, is quoted as saying, “believe they have skills that allow them to tailor a treatment to a client that’s better than any scientist can come up with, with all their data.”

The research suggests otherwise, Harriet Brown claims. “A study by Kristin von Ranson, a clinical psychologist at the University of Calgary, and colleagues published last year concluded that when eating-disorder clinicians did not use an evidence-based treatment or blended it with other techniques for a more eclectic approach, patients fared worse, compared with those who received a more standardized treatment.”

Aha! Finally, I thought, the evidence that patients were better off with evidence-based interventions will be presented, and I clicked on the words “patients fared worse.” But the citation did NOT lead to an article proving that patients fared worse; it led to yet another study showing that most therapists do not use these treatments. Huh? Where’s the evidence of efficacy? This is the New York Times!

In London, therapists were strongly enjoined to stop insisting that therapy is an “art” rather than a “science”, something I personally agree with (the science part). If it’s an art, said Waller, buy a box of paints and have at it.

But actually, the issue is not whether we practice art or science, or stick to the evidence, or even know what the evidence is; it’s a far larger issue, and one as crucial to the future of medicine as to that of psychology and psychiatry. The game-changing intervention would not be learning to march in lockstep with “the evidence” or anything (or anyone) else; it would be learning to think critically. And to do this we must improve science literacy for therapists (and the rest of us).

A bit of critical thinking reveals problems with any wholesale acceptance of evidence-based practices alone, not least of which is that, by definition, strict adherence to only evidence-based approaches does not allow for innovation. The real question is not whether you should substitute your “clinical judgment” for “evidence-based practice,” but whether you stay up to date on the research, think critically about it, and adopt what seems to have the strongest base of evidence relevant to what you do and see. And then – crucially – whether you make some attempt to track your outcomes.

Why does thinking critically matter so much? Why not just do what everyone else does, or what those whom you most admire do, or whatever the relevant medical or psychiatric board recommends? Because once upon a time, these august entities supported trepanation, leeches, mustard plasters, frontal lobotomy and…slavery.

After all, the ability to think critically is the real reason for an education. An education that does not teach or value it is vocational training only. And, indeed, some argue that that is exactly what medical school is.

But returning from the extreme periphery of this argument to the center: even if you accept that we need more evidence-based treatments (and I do), you still must ask of every evidence-based treatment put forward: how strong is the evidence, really? What does the evidence actually say? Did the authors of a paper go into their study to prove a point or to find out “the truth”? Do they interpret all their data in light of their own preconceived notions, or do they let the data speak for themselves? Do they, despite not having achieved statistically significant results, insist that a “bigger N” (i.e., more test subjects) would inevitably have proven them right or given the results they anticipated? That is common, but weak. Do they extrapolate from older, even weaker data? Do they make unexamined assumptions upon which their whole theoretical house of cards is built? Do they clearly report (or even perceive) what possible confounders might have led to the outcomes they claim to have demonstrated?

If this sounds pedantic to you, be afraid – medical and psychiatric decisions with potentially life-changing consequences are made every day on the basis of weak data and preconceived notions. Ask questions. Insist on understanding the answers. Think critically. And please, share your thoughts with your colleagues.