Posted by: leigh | April 30, 2013

The clinical trials of MDMA for PTSD: Garbage in, garbage out.

Some months back, there was a publication that ended up being highly hyped by the press. Controversy and science-by-press-release ensued. Public relations people had a field day hitting all the target-demographic periodical outlets. Meanwhile, there was a convenient lack of PR for a concurrently published paper whose results could not have been more negative. A great situation for people who like to get all fired up about things; useless hype in terms of evidence-based discovery efforts.

The publication in question? A follow-up study by Mithoefer et al describing the long term effects of their MDMA therapy for PTSD, based upon which the press releases make the bold* claim that MDMA augmented therapy has an 89% success rate in treating PTSD.

Herein, I describe the reasons this is an ostentatious conclusion at best.

First, we go back a couple of years

The contested claim is from a story that really gets going in the first phase of their study, which was published back in 2011. So to understand the recent publication, we must start at the beginning.

Briefly, this study used a double-blind design to investigate the effects of MDMA as a psychotherapy adjunct. They recruited 20 subjects, and complicated their study right off the bat by using an unbalanced design (unequal sample sizes): the active drug group had 12 subjects, while the inactive placebo group had 8. They used common evaluation tools such as the CAPS (Clinician-Administered PTSD Scale).

They set out to compare MDMA vs placebo. For the active drug, they administered 125 mg of MDMA at the start of the therapy session, and offered an optional supplemental dose of 62.5 mg MDMA 2-2.5 hours later. (I assume they also offered a supplemental placebo “dose” to the inactive group.) Not all subjects took them up on this, which complicates things further. Pharmacologically speaking, it gets even harder to interpret when you also consider that patients were asked to discontinue use of any psychotherapeutics for the duration of the study. (Withdrawal effects? Rebound effects? How long were they off the meds? We don’t know.) And some patients were given one or two medications to assist recovery after some MDMA+therapy sessions. (What are the effects of these sporadic and inconsistent treatments on consolidating progress made in therapy?)

When you wish to make a big claim, yes, your work is under a level of scrutiny that directly correlates with the size of the claim you intend to make. Their entire study design seems to treat subjects and groups inconsistently, which is a big problem when you want to harp on the strengths of the prospective, double-blind design of the study. Well folks, you have to treat all subjects the same way, too. To the greatest extent possible.

They found plenty of effects, which may or may not be directly related to the primary finding: that people are pretty effin’ good at determining whether or not you gave them MDMA. (As are the study observers.) So much for placebo, so much for the double-blind study.

So, in what reads like a grand gesture of shoulder-shrugging, they offered MDMA+therapy to their placebo group members in an open-label “crossover” phase. Except one placebo subject, whose CAPS score had decreased from 67 to 15, felt satisfied with his or her experience and declined further participation in the study. (The authors note that a second placebo subject experienced a decrease in CAPS score from 54 to 15 after therapy, which later increased to 64. This makes me ponder the effect of the therapy protocol as well as the drug treatment.) They present the data from the crossover phase in a table, arguing that these data further demonstrate that MDMA+therapy improves PTSD symptoms. We can’t distinguish this claimed effect from a simple effect of more time passing after placebo+therapy sessions, since there is no control group anymore! Also, the way they say these things makes me wonder how exactly they ran their statistics. Well, the way they say things and their inconsistent use of descriptive statistics. (Yes, I lean toward the stats douchery. One kinda has to, in my field.)

These results, as presented, must be considered with a deluge of caveats. Here are a few important ones:

  1. Unbalanced design compromises a lot of things, particularly your ability to evaluate the effects of the therapy protocol alone. They seemed much more interested in comparing MDMA vs placebo, not placebo+therapy vs MDMA+therapy. This is one way to look at the problem, but I argue it is the less valid way to look at it.
  2. It’s pretty hard to argue for an effect when your control group is not functioning as a control, and your blinding is nonexistent.
  3. The non-standardized dosing (the optional additional dose) presents a big confound.
  4. The variability of therapy sessions and post-therapy-session drug treatment (with sleep drugs and benzodiazepines, which I view as a considerable blunder) again reduces the validity of the results. We now have a number of confounding factors that distract from the main effect they attempt to demonstrate.
  5. This is a tiny study, including just 20 people (12 MDMA, 8 placebo followed by crossover MDMA). Their population was primarily female and primarily survivors of child abuse or sexual trauma, so it’s also a big jump to assume that all populations and all causes of PTSD are going to apply.
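On that last caveat: a quick back-of-envelope calculation shows just how little a 12-vs-8 split can detect. The sketch below uses the standard normal approximation for two-sample power; the effect sizes are illustrative, not taken from the paper's data.

```python
# Back-of-envelope power for an unbalanced two-sample comparison.
# Normal approximation; effect sizes here are illustrative only.
from math import sqrt
from statistics import NormalDist

def approx_power(d, n1, n2, alpha=0.05):
    """Approximate power of a two-sided two-sample comparison for
    standardized effect size d with group sizes n1 and n2."""
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    noncentrality = d * sqrt(n1 * n2 / (n1 + n2))
    return NormalDist().cdf(noncentrality - z_crit)

# The 12 vs 8 split used in the first Mithoefer study:
print(round(approx_power(0.5, 12, 8), 2))   # medium effect: power ~0.19
print(round(approx_power(1.0, 12, 8), 2))   # large effect:  power ~0.59
```

In other words, even if the true drug effect were large (a full standard deviation), a study this size would miss it roughly 40% of the time; for a medium effect, it would miss it about four times out of five.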

That said, with all the limitations and caveats and skepticism, at the end of my first reading of this paper I still thought the data were interesting. Overall, I had the impression that it was worth a further look if they could improve the study design to clear up as many of these initial flaws as possible.

And then the follow-up

So the follow-up is where Science By Press Release took place just recently. Recall, at the end of the study we just analyzed, all participating subjects had received MDMA+therapy. The only acknowledged difference between groups at this point is whether or not the group received placebo+therapy a few months earlier. (Of course, there are many un-acknowledged differences within groups and between groups due to the inconsistent treatment design.)

Now the authors conduct a long term follow-up on their subjects. They received responses from all but three subjects. This gives them a total of 16 subjects (20 to start – 1 declined the open label second phase in the original study – 3 did not complete follow up). They take the data from these 16 individuals and see what can be learned from a more distant timepoint.

Some subjects appear to maintain similar CAPS scores, others seem to show further improvement between the last study timepoint and the long-term follow up, and still others seem to relapse. We see that some subjects completed a third MDMA+therapy session, as well. Importantly, the study authors reveal the full dataset in table format (rather than the graphs presented in the first paper, which were really uninformative without error bars). There is a far wider range of CAPS scores at study entry in the MDMA+therapy group (compared to the group that was originally assigned placebo+therapy and then put into a crossover study).

The authors boast that 8/19 subjects are in psychotherapy at follow-up (a 50% decrease from the 16/19 at study entry). However, they downplay the fact that of the three subjects who were NOT in therapy at study entry, one IS in therapy at follow-up. (Though the comparison is unbalanced, if they wish to play this game, one could argue that going from zero to one is more than a 50% increase.)

The changes in medication prescriptions seem to be a wash. Some subjects started new prescriptions, some went off prescriptions. I don’t see anything outstanding, and neither did the authors.

Still, they become so bold as to assume that the three subjects who did not complete the CAPS assessment showed improvement without the hard data to back it up and conclude, “Therefore, it may be the case that up to 89% (17/19) of those who received MDMA had long-term improvement in their PTSD symptoms.”

Wow. They quickly get their heads on straight and go back to the data, but that’s really amazing. Even so, they really have shifted to treating these subjects as a set of case studies rather than a treatment group to analyze. They kind of had to, given the lack of control group and not much else to go on. Still, when you shift from a prospective study to a series of case reports, you’re looking at far less powerful results. The press releases certainly don’t reflect that.
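For concreteness, here is the arithmetic behind that “up to 89%” figure. The 14 improved completers below is inferred from the paper’s own 17/19 count (17 minus the 3 assumed non-completers); the only thing that changes between the headline number and the conservative one is how you count the subjects with no data.

```python
# The press-release arithmetic: counting the 3 follow-up non-completers
# as successes yields the headline figure; counting them as failures
# yields the conservative bound. (14 improved completers is inferred
# from the paper's 17/19 claim, not reported directly here.)
completed_improved = 14
non_completers = 3
total = 19

best_case = (completed_improved + non_completers) / total   # 17/19
worst_case = completed_improved / total                     # 14/19
print(f"{best_case:.0%} vs {worst_case:.0%}")               # 89% vs 74%
```

Fifteen percentage points of the headline claim rest entirely on assuming the missing subjects improved.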

Again, it’s an interesting result, but at this point the power of the data is severely reduced, and quite a few of the losses of potential information are due to blunders on the part of the authors. That said, do we really need to go around selling early (VERY early) results in a civilian cohort (primarily women, primarily survivors of sexual violence) as the answer to a warfighter/combat-trauma audience? That’s cheap and awful.

But lest we forget the conveniently ignored study…

One of the conclusions of this long-term study is clear: people know when you’ve given them MDMA. The subjective effects of the drug create a major problem in doing any kind of blinded study. So the authors of the concurrently published study (Oehen et al.) used a low dose of MDMA (25 mg + 12.5 mg) as an active placebo, and the same dose of MDMA that was used in the previous study (125 mg + 62.5 mg) as their therapeutic dose. Again there was a low number of subjects split into uneven groups (4 active placebo, 8 full-dose MDMA). They claim this was to test the safety of the full dose (didn’t the last study do this well enough?) and to enhance recruiting efforts. Huh. They also added a third therapy session to their regimen, which differed from the first study.

They effectively solved the blinding problem. Subject and investigator guesses didn’t look a lot better than a random (50/50) chance of being right. So, that’s a plus to the active placebo!
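Checking whether blinding held is itself a statistical question: you compare the guess accuracy to 50/50 chance. The sketch below does this with an exact two-sided binomial test; the guess counts are made up for illustration, since the actual counts live in the paper.

```python
# Sketch: testing guess accuracy against 50/50 chance with an exact
# two-sided binomial test. The counts below are hypothetical examples,
# not the paper's reported guess data.
from math import comb

def binom_two_sided_p(k, n, p=0.5):
    """Exact two-sided binomial p-value: sum the probabilities of all
    outcomes no more likely than the observed count k."""
    pmf = [comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1)]
    return sum(q for q in pmf if q <= pmf[k] + 1e-12)

print(binom_two_sided_p(7, 12))   # 7/12 correct: p ~0.77, consistent with chance
print(binom_two_sided_p(12, 12))  # 12/12 correct: p ~0.0005, blinding has failed
```

This is the sense in which the active placebo “worked”: guess rates indistinguishable from a coin flip are exactly what an intact blind should produce.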

However, there is a massive downside here. Magically, when your subjects can’t tell which experimental group they’re in… your clinical effect disappears. Placebo effect for the win!

Where were the press releases for this study? It was published in the very same issue of J Psychopharmacol as the Science By Press Release (long-term follow-up) study; in fact, it’s the next article in that issue. Yet it’s ignored. I only stumbled upon the paper by accident, and was stunned to find the exact opposite of what all the press releases were screaming about.

This is why Science By Press Release sucks. It’s also why double-blind studies are useful, why holding extraordinary claims to high standards is important, and why replicability (especially for small sample size studies) can make or break a finding.

It’s whether you choose to incorporate that new knowledge in your worldview that makes you a Scientist rather than a True Believer in a romanticized idea that sounds great but might just not be right.



* this is a politically correct term for “overreaching” or “not really correct”

I really don’t intend to sound as uncharitable as I come off. However, I think that if you’re going to invest the time and effort, it really should be done right, not half-assed. I see many lost opportunities to learn things here, and it’s disappointing. It’s my opinion that, at this time, there’s really not much more to go after on this therapeutic front.


Further reading:

Mithoefer MC, Wagner MT, Mithoefer AT, Jerome L, Doblin R. (2011) The safety and efficacy of {+/-}3,4-methylenedioxymethamphetamine-assisted psychotherapy in subjects with chronic, treatment-resistant posttraumatic stress disorder: the first randomized controlled pilot study. J Psychopharmacol 25(4):439-52.

Mithoefer MC, Wagner MT, Mithoefer AT, Jerome L, Martin SF, Yazar-Klosinski B, Michel Y, Brewerton TD, Doblin R. (2013) Durability of improvement in post-traumatic stress disorder symptoms and absence of harmful effects or drug dependency after 3,4-methylenedioxymethamphetamine-assisted psychotherapy: a prospective long-term follow-up study. J Psychopharmacol 27(1):28-39.

Oehen P, Traber R, Widmer V, Schnyder U. (2013) A randomized, controlled pilot study of MDMA (± 3,4-Methylenedioxymethamphetamine)-assisted psychotherapy for treatment of resistant, chronic Post-Traumatic Stress Disorder (PTSD). J Psychopharmacol 27(1):40-52.


  1. Emphasis on “safety” testing is what justifies the small-N study… it is the primary goal of Phase I. So even without showing any efficacy you can declare a victory of sorts. And if you don’t see any efficacy you have a built-in excuse of “it wasn’t designed for efficacy”.

  2. they market these studies as phase 2, and they are clearly designed to demonstrate efficacy. does not deter anyone from coming up with lots of other excuses.

  3. If it is easy to tell if you have taken MDMA and therefore very difficult to do a double blind study, then we should do other studies. You say double blind studies are important as a maxim, but that doesn’t mean they are the only way of doing things. Basically you talk a lot of shit and don’t actually offer a suggestion for how to do a better study. You also ignore the fact that there are many other studies in many countries that show MDMA helps with PTSD, despite the insane obstacles of getting approval to use it for research.

  4. sure, double-blinded trials are not the only way of doing things, G. however, i think it’s a sledgehammer kind of point that the effect is only observed in trials where there is no effective blinding.

    i don’t have a suggestion for a study design that would separate out the effects of the therapy session and time effects from the drug effects. but from those who wish to make extraordinary claims, extraordinary proof is required. i am simply stating that i don’t see it. i do lament that they genuinely missed out by skipping the all-important control group part of the study design. if, that is, there was an effect to be observed in the first place. pointing out significant flaws in interpretation of the data is not “talking shit”, it’s talking shop.

    “many other studies in many other countries” have yet to show up in the published literature, if you know what i’m saying.

  5. G-
    It doesn’t matter how many rinky-dink little studies concur if they all suffer similar design flaws. As it happens, people are not that good at distinguishing drugs of similar activity as long as you have a decent dose. That’s a plus. The comparison should be meth, MDA, MDE, mCPP, fenfluramine, maybe a bath salt or three. Then we’ll see how magical MDMA really is.

  6. I was really interested when I saw a news item about MDMA trials for use in therapy. In early 1980 when I was about 30 I used MDMA on weekends only for about 3 months. Before use I had suffered debilitating social anxiety and other related psychological problems since my early teens -the MDMA turned my life around such that all debilitating symptoms ceased and I have since lived a fully engaged and confident life.

  7. To the author of this article:
    Yes, you’re right that this study has been hyped up by the press.
    However Dr. Michael Mithoefer himself admitted right away that his study was too small (only 19 test subjects) to make any extensive conclusions. It just points in a certain direction and more studies have to be done to confirm this.
    He’s actually doing that right now. This might take a while however, as they’re short for funding at the moment.
    Therefore, I do not understand what you’re trying to prove, when the researcher of this study has already stated that the results found in this study are encouraging yet inconclusive.
    I for one think it’s very promising. Reducing PTSD symptoms by as much as 90% for years on end with such a short therapy process, I hope future studies prove successful and will allow this therapy to be used on a wider scale.


  8. the part where the results are inconclusive certainly is not reflected in the original research article.

  9. It could also be that the active placebo was of sufficient potency to achieve similar results as the therapeutic dose such that it really wasn’t a placebo effect.

    So many questions.

  10. Hey, look! Another person writing about MDMA who has never tried it before!
