Look before you leap: Why we shouldn’t jump to conclusions about supplement studies

Scientific research is essential for understanding the role of supplements in health, but it must be interpreted carefully

In this guest article, the Council for Responsible Nutrition’s Dr. Andrea Wong explores how mainstream media coverage of supplement research often leads to consumer confusion due to oversimplified or misinterpreted findings.

As someone deeply invested in the science behind dietary supplements, I’ve noticed a troubling pattern in how research is reported. Time and again, studies on supplements follow a familiar trajectory: an attention-grabbing conclusion, a media frenzy and a wave of consumer confusion. But when we look closer, we often find that the study’s design has significant limitations, or its findings have been misinterpreted, leading to sensationalized coverage by the media, both traditional and social.

I’m not saying we should ignore potential risks identified in research—far from it. We absolutely need to scrutinize supplements for safety and efficacy. But we also need to apply the same level of scrutiny to the research itself. Without this, consumers are left with misleading messages that could discourage them from taking products that may benefit their health.

Don’t neglect the details

A prime example of this issue is a study published in the British Medical Journal that suggested multivitamins offer no measurable health benefits. On the surface, that sounds definitive. But when you dig deeper, the study relied on self-reported data rather than clinician-assessed outcomes, making the measured outcomes less reliable.

Additionally, the study didn’t specify the composition of the multivitamins taken, and didn’t account for how long or consistently participants used them. Perhaps most importantly, it was a cross-sectional study—a design that can show associations but can’t determine causation because it only provides a snapshot in time.

Despite these limitations, headlines ran with the idea that multivitamins are useless, likely leading many people to second-guess their supplement use. But what about those who need them? Many Americans fall short on essential nutrients like vitamins A, C, D, E and K, as well as calcium, magnesium, choline and potassium. Multivitamins serve a real purpose in filling those gaps. The takeaway? We need to ask whether a study’s design supports its conclusion before we make personal health decisions based on it.

Don’t trust, verify

Last fall, a study on prenatal multivitamins made waves with headlines like “Prenatal vitamins may have dangerous levels of lead and arsenic, explosive new study finds.”

The media coverage stemmed from a press release accompanying the study stating that some of the products tested “contain harmful levels of toxic metals.” An initial read of the study seemingly confirmed the headlines, indicating that some prenatal multivitamins tested contained lead, arsenic, and cadmium at levels exceeding “USP Purity Limits.”

But as we dug deeper to verify these limits, we found that the authors cited another study on prenatal supplements published in 2018, and that study misapplied USP requirements intended for ingredient testing to finished products. Plus, they used the wrong units of measure. Importantly, proper application of USP standards (General Chapter 2232 for dietary supplements) demonstrated that all tested products were below established safety thresholds.

To restore confidence in the safety of prenatal vitamins, CRN worked with the study authors and their institution to correct the manuscript and withdraw the press release, alerted the journal to the errors (which led to the withdrawal of the original study from the journal's website) and pressed media outlets to issue corrections. Without these actions, the study's misrepresentation of heavy metal limits could have continued to cause unnecessary anxiety for pregnant women.

Don’t overinterpret

Case reports have also caused unwarranted consternation. A case report is merely a detailed description of a single patient's experience. The Federal Trade Commission doesn't allow a marketer to make health-related claims based on case reports because they are not generalizable: they record one consumer's experience with no controls, and, at best, pose a hypothesis worth further study. Case reports can serve as an educational tool among clinicians, but because they involve a single individual or a small group, they can't establish causation or be extrapolated to a broader population.

In a recent example that led to viral discourse on TikTok, a 26-year-old woman with a history of obesity presented with jaundice, and lab tests showed elevated liver transaminases and bilirubin. She reported drinking mullein leaf teas and taking Nutrafol Hair Growth Supplements.

The case report noted that she had normal liver enzyme and bilirubin levels one month prior to starting Nutrafol and that these liver function markers improved after she stopped the supplement; the authors concluded that it is “likely” that Nutrafol caused the liver injury in this individual. The authors did not provide information on dosage, nor did they investigate whether the patient had pre-existing sensitivities to any of the ingredients in Nutrafol (which is a brand, not a single product).

Despite these limitations, the case report quickly gained traction on TikTok, with influencers and self-proclaimed wellness experts distilling its findings into simple—and misleading—messages about Nutrafol. The fact that this was a single case report, as well as the nuances of the specific case, were largely ignored.

This is a textbook example of how scientific research can be distorted when filtered through social media. A case study, which is meant to serve as a preliminary signal for further investigation, is transformed into definitive proof of harm in the public’s mind. This not only spreads unnecessary fear but also makes it harder to have rational discussions about supplement safety. Of course, the tendency for social media posts to distort or misinterpret even the most rock-solid findings is a topic for another day.

The media’s role in consumer confusion

One of the biggest culprits, however, is how supplement studies are covered, even in credentialed, traditional news sources. Sensational headlines drive clicks, and it’s much easier to say “Multivitamins’ ‘benefits’ are all in your head” than to explain the nuances of a study’s design and limitations. Mainstream journalists should know that scientific findings are rarely black and white, but the temptation to capture readers often overcomes the need for accuracy and fairness.

This oversimplification leads to knee-jerk reactions from the public. Whether the cause is poorly designed research, lack of rigorous peer review to catch shoddy work, the absence of context and appreciation for a study’s limitations, or salacious headlines that exaggerate the potential harms, the effect is the same. People stop taking beneficial supplements or start fearing products based on incomplete information. What’s worse, these misleading narratives can be difficult to correct once they’ve spread.

The bottom line

Scientific research is essential for understanding the role of supplements in health, but it must be interpreted carefully. The next time you see a headline declaring that a supplement “doesn’t work” or “causes harm,” take a step back. Read between the lines. Look at the study’s design, its limitations and how its findings fit into the bigger picture.