
Anna Törner: To kill your darlings

Hopes were high when Anna Törner and her colleague started a study on a dietary supplement that seemed unbelievably good. “Enthusiastically, we dreamed of exciting results and perhaps a publication in a high-impact journal,” she writes in a column.

My colleague Ingrid came into the office, bursting with energy and joy. She had long struggled with fatigue and low energy levels, but the doctors couldn’t find anything wrong. By chance, she had then bought a dietary supplement at the health food store, and she told me that it REALLY worked, even at a low dose of half a tablet a week. It sounded too good to be true, but I could clearly see how well Ingrid was feeling, and her conviction quickly became mine too.

What do two statisticians do then? Naturally, we designed and conducted a clinical study. Right from the beginning, we were convinced that the results would be positive, but a long time in the industry had taught us that we also needed hard data to convince anyone else. Quickly but carefully, we planned a clinical study with n=1, since the effect might well be individual-specific: perhaps Ingrid had a nutritional deficiency that this particular supplement addressed.

Power calculations showed that we needed to run the study for 15 weeks. Every Monday, ½ tablet was given (trial drug or matching placebo). So that Ingrid would not be forced to give up this almost “life-saving treatment” during the weeks when she received a placebo, she could take rescue medication on Wednesdays (if she believed she had received a placebo on the Monday). It sounds somewhat crazy, I know, but that’s what happens when two creative statisticians are given free rein.
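The column does not give the details of the power calculation, but a minimal sketch of how such a calculation might look is shown below. It assumes a hypothetical large standardized effect size (d = 1.5) and weekly outcome scores compared between active and placebo weeks with a normal-approximation two-sample test; none of these numbers come from the actual study.

```python
import math
from statistics import NormalDist

def two_sample_power(d, n_per_arm, alpha=0.05):
    """Approximate power of a two-sided two-sample z-test for a
    standardized mean difference d with n_per_arm observations per arm."""
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    # Noncentrality: d * sqrt(n1*n2/(n1+n2)), which is d * sqrt(n/2) when n1 == n2.
    ncp = d * math.sqrt(n_per_arm / 2)
    return 1 - NormalDist().cdf(z_crit - ncp)

def weeks_needed(d, target_power=0.8):
    """Smallest total number of weekly periods (split evenly between
    active and placebo weeks) giving at least the target power."""
    n = 1
    while two_sample_power(d, n) < target_power:
        n += 1
    return 2 * n

# With the hypothetical effect size d = 1.5, roughly 14 weekly periods
# suffice for 80% power, the same ballpark as the 15 weeks in the column.
print(weeks_needed(1.5))
```

The sketch uses a normal approximation rather than the exact t-distribution, which is optimistic for this few observations; a real n-of-1 design would also need to account for carry-over between weeks and for the rescue-medication rule.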

Enthusiastically, we dreamed of exciting results and perhaps a publication in a high-impact journal. After 15 weeks, we expectantly broke the code, and our hopes were shattered in minutes. The results showed absolutely zero effect. It was not even a power problem, where the results point in the right direction but there are too few measurement points. Disappointing and inconceivable!

In retrospect, I think this was one of my most important studies. In my work, I often meet enthusiastic researchers who, with very limited patient material, have found ground-breaking medical effects. It has been difficult for me to understand how anyone can be so convinced without objective data to lean on. And when the study eventually produces negative results, their knee-jerk reaction is that something must have been wrong with the study, because they are certain the treatment actually works! Now I know how that feels, because it is exactly what Ingrid and I were both thinking: something must have gone wrong, we just don’t know what.

“Kill your darlings” is an expression from film and literature: sometimes you have to cut stanzas or scenes that feel brilliant because they are not really adding anything. The term is also used in life science to describe how we unjustifiably keep projects alive even though objective results show unsatisfactory efficacy or safety. I have often thought that this is down to our reluctance to give up, or to the time and money we have already invested, which makes us turn a blind eye to the truth.

I now realise that it goes much deeper than that. A heartfelt conviction will often trump data, and it will trump a tiny clinical study any day of the week.

The article is part of our News in English theme.
