Debunking 5 Holiday Health Myths

This holiday season, we’re giving you the opportunity to correct your well-meaning loved ones! In this blog post, we’ll tackle five health myths that always seem to pop up this time of year.

Myth #1: Tryptophan in turkey makes you tired

People often lament their sleepiness after eating Thanksgiving dinner, and turkey is the usual scapegoat: many blame their lethargy on the tryptophan in turkey.

It’s true that turkey contains tryptophan… but so does a lot of food. In fact, when it comes to food, turkey has an average amount of tryptophan, and chicken actually has a higher level! But let’s back up: what is tryptophan? 

L-tryptophan is an essential amino acid that helps produce niacin, a B-vitamin that helps with digestion and keeps skin and nerves healthy. Tryptophan is also involved in the process of cellular metabolism, and it’s used to make serotonin, a chemical that can make you feel calm and relaxed, and melatonin, a chemical that can make you feel tired. 

Photo by Karolina Grabowska via Pexels.

Turkey simply doesn’t have enough tryptophan in it to make you feel groggy. So what’s the culprit? Overeating! 

After a big meal, the digestive system expends a lot of energy to break down all of that grub. Blood rushes to the organs in the digestive system. Foods rich in carbohydrates (like my favorite holiday dinner staple, stuffing) can cause a spike in blood sugar levels, and when levels dip back down again, the body feels tired. 

 

Myth #2: Colds are caused by cold, wet weather

“You’ll catch your death of cold!” is what well-meaning relatives say when you try to leave the house without a hat or coat. However, this is yet another health myth. 

Since the 1950s, there have been numerous studies where researchers put study participants in a cold room or a cold bath and then exposed them to the common cold. They found that being chilly had no effect on catching a cold—and in fact, a 1999 study found that being exposed to cold might have actually helped immune function in study participants. 

But if chilly weather doesn’t cause colds, why do we tend to catch colds during the coldest season? 

Colds don’t appear out of thin air. The common cold can be caused by several hundred different virus strains, but the rhinovirus is the most common cause. It turns out that the rhinovirus thrives in the low humidity of winter air and replicates faster in colder temperatures. Plus, people tend to stay indoors when the weather is cold and wet, and close quarters help the virus spread. 

This doesn’t mean you should hang out in the cold all day. We’ve all heard of frostbite, but in cold, wet weather, not wearing warm clothes and waterproof shoes can also lead to conditions like trench foot and the development of chilblains. Chilblains are itchy, lumpy patches of skin, typically on the hands and feet, caused when cold damages the capillary beds in the skin. Chilblains typically heal within a few weeks but can return with more exposure to cold weather. Trench foot (also known as immersion foot) occurs when the feet are wet for a prolonged period of time, as made famous by the trench warfare of World War I. Through cold and wetness, the feet lose circulation and nerve function, which can lead to blisters, ulcers, and gangrene. 

Exposure to chilly, wet weather won’t give you a cold, but it’s hardly harmless.

Myth #3: Eating chocolate gives you acne 

Chocolate lovers rejoice! The myth that eating chocolate causes acne breakouts is not supported by science, though several studies have investigated it over the years. 

As Clare Collins, professor of nutrition and dietetics at the University of Newcastle points out in her article for The Conversation, the scientific studies that support this myth don’t hold up. She writes that “by today’s standards, the investigations were all of a poor scientific standard. The original study, conducted in 1965, contained just eight participants [...] We have failed to subject this chocolate myth to the rigors of a randomized control trial (RCT), despite the fact that almost all people aged 15 to 17 years experience some degree of acne. We need a decent RCT so we can know once and for all whether to unleash our teenagers, and ourselves, in the confectionery aisle at the supermarket.” 

There’s growing evidence, however, that milk consumption causes an insulin spike that might lead to breakouts. In a 2007 study, participants who followed a low-glycemic-load diet saw their acne improve. 

The sebaceous glands. Image from Human Anatomy Atlas.

What causes acne in the first place? The sebaceous glands (aka oil glands) secrete sebum, an oily substance that can create acne when combined with dead skin cells and bacteria.

There’s no evidence that one particular food—like chocolate—can cause acne by itself, but a diet with lots of sugars and fats can lead to breakouts. 

One potential origin for this myth pertains to premenstrual syndrome (PMS). When you’re about to get your period, you tend to crave chocolate and other sweets, and at the same time, the body produces hormones that cause an increase in oil production. This coincidence might lead some to believe that their chocolate habit created the acne. 

 

Myth #4: Sugar makes kids hyperactive

Here’s another thing everyone hears a lot as a kid: sugar makes kids hyper. 

Researchers have conducted several studies over the years in which they gave some children sugar and others a placebo. Overall, they found that sugar does not affect children’s behavior, though a small number of children might be slightly affected. 

This myth is pervasive, though, and it can lead to confirmation bias in parents. If a parent believes that sugar makes kids act up and they see their child consuming sugar, they might interpret their child’s behavior as hyperactive. If parents are told that their child has consumed sugar, they are more likely to report hyperactivity, even if the child has actually consumed a placebo. 

Another factor is that kids consume sugar during events that make them excited—picture a big birthday cake at a party or a pillowcase full of Halloween candy. In those situations, their heightened energy level is probably caused by excitement, not their diet. Remember, correlation does not equal causation!

This myth has been debunked since 1982, when the National Institutes of Health declared there was no link between sugar and hyperactivity, but the myth still lives on. 

 

Myth #5: You lose half your body heat through your head

Hat haters, rejoice: you don’t actually lose that much heat through your head. The claim that up to half of your body heat escapes through your head has even been repeated in the US Army Field Manual for years. 

This myth likely originated in the 1950s, when a military study put participants in warm clothes but no hats and measured their heat loss. They observed that when the participants experienced very cold temperatures, they lost the most heat through their heads. However, that’s only because the head was the only body part exposed to the cold. 

A more contemporary study found that adults lose about 10% of their body heat through the head, which is pretty proportionate given that the head makes up about 7% of the body’s surface area. 

Learn more about anatomy with Human Anatomy Atlas!

The hands, back, thighs, and lower legs actually lose heat at a higher rate than the head.

When it’s cold and snowy out, wearing a hat will certainly help keep you warm, but it’s not the end-all-be-all this myth makes it out to be. 

 

More mythbusting from Visible Body!

We’re no Adam Savage or Jamie Hyneman, but check out other blog posts that tackle myths: 

Be sure to subscribe to the Visible Body Blog for more awesomeness! 

Are you an instructor? We have award-winning 3D products and resources for your anatomy and physiology or biology course! Learn more here.
