Eat your vegetables. Sleep eight hours a day. Exercise.
There are so many truisms about staying healthy that we sometimes fail to exercise due diligence in determining whether the advice given to us is even true. Then there are the old wives' tales, passed down from one generation to the next, that often blur the line between fact and fiction.
Below, you'll find a list of 10 of the most common health clichés out there. None of them are true.
Weighing in at just over 1.4 kilograms (3 lb), the human brain is home to nearly 100 billion neurons. They transmit information to each other across gaps called synapses, of which the brain has almost one quadrillion.
The brain is sectioned into three primary parts—the cerebrum, the cerebellum, and the brain stem. The cerebrum composes roughly 85 percent of the organ and is responsible for much of the higher-level functioning we associate with being human. Seated below it, you’ll find the cerebellum, which controls basic coordination and balance. And finally, you have the brain stem. Connected to your spinal cord, the brain stem controls most of your automatic functions, such as breathing and digestion.
Wouldn't it be incredible if all this processing were using only 10 percent of the brain's bandwidth?
Alas, this "fact" is utterly wrong. We're not sure exactly where the claim that we only use 10 percent of our brains came from, but it seems to have percolated out of the late Victorian era. In the late 1890s, Harvard psychologists William James and Boris Sidis pointed to the latter's wunderkind son (whose IQ was reportedly near 300) as proof that all humans must have the capacity to be that smart. We just have to try harder.
Pretty ridiculous, right?
Further research at the start of the 20th century found that rats with cerebral damage could be retaught certain tasks. This was used to bolster the already weak case that the human brain is full of untapped potential. In reality, this factoid has no basis in modern science: brain imaging shows activity throughout virtually the entire brain over the course of a day, and just reading this paragraph engages far more than 10 percent of yours. Oh well.
After swallowing a particularly large piece of bubblegum, many of you may remember being horrified to hear that your digestive tract would spend the next seven years trying to digest it. If your seven years isn’t up yet, you may be relieved to learn that this “fact” is complete nonsense.
Although the origins of this myth are elusive, it does contain a kernel of truth about chewing gum: it's indigestible. The Food and Drug Administration defines gum as a "nonnutritive masticatory substance." (Translation: It's not food.)
While it’s not advisable to swallow your chewing gum, what happens to it isn’t all that exciting. Excess ingredients like sweeteners may be digested, but the bulk of the gum is an elastomer that gets moved through your digestive tract without being broken down. Then the gum comes out the other end via the excretory system and is usually unscathed.
Foreign, inedible objects generally have to be larger than a United States quarter to get stuck in your digestive system. Otherwise, they flow like junk down a stream, right out the other end.
As if puberty, high school, and those teenage years weren't hard enough, many of us grew up learning that our chocolate intake had a causal relationship with breakouts. Pretty awful that chocolate, the one thing that makes adolescence bearable, supposedly lights up your face with ugly zits.
Well, we’re here to let you know that this old wives’ tale is false. Eating chocolate will not cause you to break out. However, eating foods high in fat and sugar can increase your body’s natural sebum production, which makes your skin oilier. Furthermore, those unhealthy foods lead to higher levels of skin inflammation.
But will chocolate, or any single food for that matter, make your skin break out? The answer is a resounding no. Eating lots of fatty, sugary food can definitely disrupt your blood sugar, which may indirectly affect breakouts. But cutting any one food out of your diet is not a golden ticket to avoiding teenage pimples.
The myth that carrots will improve your vision is wrapped up in a twisted history of wartime propaganda. To be fair, carrots are a great source of beta-carotene, a pigment that the body converts into vitamin A during digestion. Vitamin A provides all sorts of benefits to the body, including the protection of eyesight.
But does it really improve one’s nighttime vision?
No. The British Ministry of Information ran a campaign during World War II suggesting that Royal Air Force pilots were eating large quantities of carrots, which supposedly explained their uncanny ability to shoot down German bombers under the veil of darkness. Truth is, all the carrots in the world couldn't give you the gift of nocturnal sight.
In reality, British pilots were warding off German bombers with what was then novel technology: airborne interception radar. The carrot story served as cover for that secret, though it's unlikely German intelligence bought the idea that British pilots were fueled by high-octane carrots.
Yet, in the decades since, much of the Western public has remained firmly convinced that if they eat enough of the orange stuff, their eyes will thank them. We hate to be the ones to break it to you, but you're not getting night vision anytime soon.
This one should be easy, right? Not so fast. The belief that we have five senses dates back to the time of Greek philosopher Aristotle, who was the first to discern the five discrete senses of the human body. You probably learned them in elementary school: sight, hearing, smell, touch, taste.
Yes, these are five of your senses. But they aren’t the only ones.
Let's start with the basics. What is a "sense"? Broadly, it's a faculty by which the body detects a given stimulus through dedicated receptors. Every sense is activated by a unique phenomenon.
The sense of touch, for instance, is much more complex than a single sensation. Many neurologists break "touch" down into distinct sensations, including perceptions of pressure, temperature, and pain.
Depending on whom you ask, humans have as many as 33 senses. These include abilities, like sensing blood pressure and keeping your balance, that you knew you had but never counted as "senses." So, the next time someone says they have a sixth sense, you might respond by saying you have 33. They may not know what you mean, but you will!
Many of us can remember being taught, by a biology teacher no less, that our ability to roll our tongues was simple genetic fate. The majority of people can roll their tongues, and societal wisdom held that tongue rolling was a dominant genetic trait. If either of your parents could do it, so could you. Or so we were told.
In reality, it's not that simple. Unlike many of these human body myths, we have a good idea where this one came from. In 1940, American geneticist Alfred Sturtevant published a study that concluded tongue-rolling ability was a hereditary trait governed by a dominant gene.
However, Sturtevant's exuberance over his finding was short-lived. People quickly noticed pairs of identical twins in which one twin could roll their tongue and the other couldn't. If the trait were controlled by a single dominant gene, genetically identical twins could never differ on it. Sturtevant's findings were swiftly debunked, and the man himself conceded defeat.
And yet, decades later in classrooms across the world, this falsehood is being spread anew. Now that you know the truth, you can stop the madness from spreading the next time someone unveils this quirky parlor trick.
Between the myth that we only use 10 percent of our brains and the prevailing notion that we lose the majority of our body heat through our heads, it seems our craniums can't catch a break. The prevailing hypothesis on this myth's origin points to studies from the 1950s in which subjects were exposed to low temperatures and lost a solid chunk of their heat through their noggins.
The problem with this research is that the subjects were bundled up in coats and only their heads were exposed to the elements. So yes, if every part of your body is insulated and your head isn’t, you’ll lose a disproportionate amount of body heat through your head.
However, more recent research finds that, all else being equal, an excessive amount of heat doesn’t escape from your head. You lose approximately 7 percent of your body heat through your head, which makes sense because your head is roughly 7 percent of your body’s surface area.
So, treat your head like every other part of your body. When it’s cold, bundle it up and everything will be fine.
This "fact" about the human body is sort of creepy, isn't it? The idea that strands of keratin, the protein that makes up hair and nails, keep growing at our extremities in the days and weeks after we die is freaky. Well, we're here to let you know that it's simply not true.
Our bodies dehydrate rather rapidly once we die. When this happens, our skin starts wrinkling and pulling inward. That retreat creates the illusion that our hair and nails are still growing; in reality, the rest of the body is merely shrinking. For this reason, morticians often lather corpses in moisturizer to keep them from pruning up.
Arthritis isn’t a single condition but rather a catchall term for a group of pain disorders characterized by joint aches, swelling, and inflammation. Unfortunately, it’s quite common, affecting more than 50 million adults and 300,000 children in the US. Arthritis can be mild or debilitating. It can flare up or feel like a slow and steady burn.
Obviously, if you can avoid activities linked to arthritis, you should. For many health-conscious individuals, this includes following a seemingly simple rule: don't crack your knuckles. However, we're here to tell you that knuckle cracking doesn't belong on that list.
But first, what is "cracking" the knuckles? That popping sound comes from gas bubbles collapsing in your synovial fluid (the stuff that lubricates your joints). As bad as that sounds, a review of studies by doctors at Harvard Medical School found no evidence that cracking one's knuckles causes arthritis.
That said, you still might want to give up the habit. Chronic knuckle cracking is linked to weaker grip strength. Furthermore, it’s just annoying to listen to.
Who hasn’t heard this one? You can shave your beard—or for women, the hair on your legs—but your efforts will be in vain. Not only will the hair grow back, we’re told, but it’ll grow back faster and darker than before.
This is absolutely false. In fact, we've known it isn't true for quite some time. One of the first contemporary studies on the issue took place in 1928. The participating men all shaved in the same manner with the same brand of shaving cream, and their new hairs were then analyzed for increased rates of growth. None was found.
A lot of this myth comes down to perception. As our hair grows back, we may be swayed by our preexisting biases. There's also a simple mechanical effect: shaving slices each hair off bluntly, like chopping down a tree and leaving the stump. The blunt, slightly wider tip is more visible and feels coarser, contributing to perceived gains on the part of your stubble.
Any genuine change in growth speed is more likely caused by underlying hormonal shifts. Otherwise, it's all in your head!
Evan Beck is a freelance writer living in San Francisco, California.