Every day is April Fool’s in nutrition.

“People who are desperate for reliable information face a bewildering array of diet guidance—salt is bad, salt is good, protein is good, protein is bad, fat is bad, fat is good—that changes like the weather. But science will figure it out, right? Now that we’re calling obesity an epidemic, funding will flow to the best scientists and all of this noise will die down, leaving us with clear answers to the causes and treatments.

“Or maybe not.”

From a recent article at io9 by John Bohannon:

I Fooled Millions Into Thinking Chocolate Helps Weight Loss. Here’s How.


With a poorly crafted study, one that used a small sample and ignored how large the measured effects actually were, a team of journalists punked the nutrition-news circuit into trumpeting their results.

“A team of German researchers had found that people on a low-carb diet lost weight 10 percent faster if they ate a chocolate bar every day. It made the front page of Bild, Europe’s largest daily newspaper, just beneath their update about the Germanwings crash. From there, it ricocheted around the internet and beyond, making news in more than 20 countries and half a dozen languages. It was discussed on television news shows. It appeared in glossy print, most recently in the June issue of Shape magazine.”

But it was all a crock of dung. And sadly, I can guarantee that many people will continue to believe the lie, simply because it appeared in outlets as “prestigious” as Prevention, regardless of this exposé or any further evidence to the contrary. As the entire anti-vax movement demonstrates, nothing can kill a good excuse for mouth-foaming outrage, not even repeatedly confirmed facts.

Read the article. It’s worth your time if you’re interested in having accurate information on which to base your decisions.

A big part of the problem with modern “scientific” studies is the concept of the “p-value.” It’s more subtle than most people care to dig into, but William Rozeboom called the use of P values and null-hypothesis testing “surely the most bone-headedly misguided procedure ever institutionalized in the rote training of science students.”

P value calculations tell you only the probability of seeing a result at least as big as what you saw if there is no real effect. (In other words, the P value calculation assumes the null hypothesis is true.) A small P value — low probability of the data you measured — might mean the null hypothesis is wrong, or it might mean that you just saw some unusual data. You don’t know which. And if there is a real effect, your calculation of a P value is rendered meaningless, because that calculation assumed that there wasn’t a real effect.

(ScienceNews – “P value ban: small step for a journal, giant leap for science”)
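To see that caveat in action, here is a minimal simulation of my own (not from either article; the group size of 15 and the normal-approximation t-test are arbitrary choices for illustration). Two groups are drawn from the same distribution, so there is no real effect at all, yet roughly one comparison in twenty still comes back “significant” at p < 0.05.

```python
import math
import random
import statistics

def welch_t_p(a, b):
    """Approximate two-sided p-value for a difference in means (Welch's t
    statistic with a normal reference distribution; fine for illustration)."""
    se = math.sqrt(statistics.variance(a) / len(a) + statistics.variance(b) / len(b))
    t = (statistics.mean(a) - statistics.mean(b)) / se
    return math.erfc(abs(t) / math.sqrt(2))  # two-sided tail of a standard normal

random.seed(42)
trials = 2000
false_positives = 0
for _ in range(trials):
    # Both groups come from the SAME distribution: the null hypothesis is true.
    group_a = [random.gauss(0, 1) for _ in range(15)]
    group_b = [random.gauss(0, 1) for _ in range(15)]
    if welch_t_p(group_a, group_b) < 0.05:
        false_positives += 1

print(f'"Significant" at p < 0.05 with no real effect: '
      f"{false_positives}/{trials} = {false_positives / trials:.1%}")
# Expect a bit over 5% (the normal approximation is slightly generous):
# a small p-value alone cannot tell you whether you found a real effect
# or merely unusual data.
```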


And if Randall Munroe pillories something, you have a pretty good idea that there are legitimate questions about its validity.

[xkcd: “Significant”]
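What Munroe lampoons there, and what the chocolate hoax leaned on, is the multiple-comparisons trick: measure a whole battery of outcomes in a small group, test each one, and report whichever happens to cross p < 0.05. A quick back-of-the-envelope sketch (the 18 outcomes echo the number Bohannon describes measuring; everything else here is invented for illustration):

```python
import random

random.seed(1)
n_outcomes, alpha, trials = 18, 0.05, 10_000

# Analytic: chance that at least one of 18 null tests comes up "significant".
print(f"Analytic: 1 - 0.95**18 = {1 - (1 - alpha) ** n_outcomes:.2f}")

# Simulation: under the null hypothesis, each test's p-value is uniform on [0, 1].
hits = sum(
    any(random.random() < alpha for _ in range(n_outcomes))
    for _ in range(trials)
)
print(f"Simulated: {hits / trials:.2f}")
# Around 0.60 either way: measure enough things in a small study and a
# headline-ready "finding" is more likely than not, even when nothing is there.
```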

The takeaway: don’t get excited just because one study says something; I’ve written about this elsewhere. Look at the study itself, find out how large the sample was, and see if you can ferret out how big the measured differences were. There’s a lot more digging you could do, but that’s a good place to start.
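If a paper reports group means and standard deviations, you can even put a rough number on “how big the differences were” yourself with a standardized effect size such as Cohen’s d. A sketch with made-up numbers, purely to show the arithmetic:

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# Hypothetical "weight lost (kg)" in two tiny groups (numbers invented):
d = cohens_d(mean1=5.1, sd1=3.0, n1=8, mean2=4.6, sd2=3.1, n2=7)
print(f"Cohen's d = {d:.2f}")  # about 0.16: a small effect, measured in a handful of people
# A tiny sample plus a modest effect size is a cue to stay skeptical,
# no matter what the p-value says.
```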

The Old Wolf has spoken.

