6 Health Myths and Urban Legends Debunked


Woman eating a healthy salad. (RossHelen/Shutterstock.com)

As we grow up, most of us learn that, contrary to what we may have been led to believe as children, adults do not actually know everything. They can be spectacularly wrong, and many of the health-related myths and urban legends we heard growing up fall into that category.

The good news is that, thanks to a combination of better science and easier access to information than ever before, it’s quite simple to debunk these myths. So, if you’ve spent the past seven years worried about exactly what’s going to happen to that piece of chewing gum you accidentally swallowed way back when, worry no more! Here’s the truth behind that and five other common-but-inaccurate health beliefs.

Do You Need Eight Glasses of Water a Day?

Woman drinking water in a brown top. (Dean Drobot/Shutterstock.com)

The supposed rule that humans should all be drinking eight eight-ounce glasses of water per day has seemingly been around forever—but it turns out that it’s not actually true. What’s more, the myth might not even be as old as many of us think it is: It’s believed to have come from the misinterpretation of a recommendation published by the U.S. Food and Nutrition Board in 1945. So much for that piece of common knowledge.

To be clear, yes, you do need water to survive; as the Mayo Clinic points out, “the cells, tissues, and organs in your body require it to function.” If you don’t get enough of the stuff, you’ll suffer from dehydration—and being dehydrated really just means your body doesn’t have enough water to carry out the functions that keep you alive.

But exactly how much water each person needs varies based on a wide variety of factors. Your level of physical activity is one: if you’re extraordinarily active—say, you’re an athlete, or you work a job that requires a lot of manual labor—you’re going to need more water than someone who is relatively sedentary, because all that activity makes you sweat more, meaning you’ll need to replace your fluids more often.

If you live in a particularly warm climate, that, too, might cause you to sweat more, which means you’ll need to take in more water to replace those lost fluids. According to the National Academies of Sciences, Engineering, and Medicine, a good total water intake to aim for is roughly 2.7 liters per day for women and 3.7 liters per day for men—but there’s no one-size-fits-all solution.

What’s more, there are more ways to get the water you need than just drinking it straight. In fact, you probably already cover part of your water intake with the other things you consume. Per WebMD, most people get around 20 percent of their daily water intake from food alone—so you don’t necessarily need to drink a set amount of plain water each day to give your body what it needs to function.
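
To put those figures together as a rough, back-of-the-envelope example (using the lower end of the NASEM guideline above, which is a general benchmark rather than a personal target):

0.20 × 2.7 L ≈ 0.5 L of water from food, leaving roughly 2.7 L − 0.5 L ≈ 2.2 L to come from water and other beverages.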

Is Spinach the Best Source of Iron?

A bowl of bright green spinach. (Sunny Forest/Shutterstock.com)

Eat your spinach, they said. It’s a good source of iron, they said. Except it turns out that spinach isn’t the iron powerhouse it’s made out to be—and the explanation behind the whole story is a lot weirder and more complicated than it probably should be.

A commonly cited origin story for the “spinach is a good source of iron” myth claims the misconception is the result of a misplaced decimal point. According to this story, a German researcher, E. von Wolff, analyzed the nutritional content of a number of foods in the 19th century and found that spinach had an impressive amount of iron in it.

Von Wolff published his results—including what he believed to be the iron content of spinach—in the 1870s. But when a different group of researchers analyzed the iron content of spinach in the 1930s, they found that it was actually one-tenth of what von Wolff’s research had determined it was—meaning that, for decades, a decimal point placed in the wrong spot in von Wolff’s research had both created and fueled the widespread belief that spinach was a good source of iron.

However, it turns out that this supposed origin story is also a myth. It seems to have arrived on the scene in the 1970s courtesy of nutritionist Arnold E. Bender and was accidentally perpetuated by immunohematologist Terence Hamblin in 1981. But, as Mike Sutton found in his extensive 2010 exploration of the myth, there’s no evidence that any such erroneous decimal point ever existed. Hamblin actually followed up with a mea culpa at his own blog after Sutton’s article came out, noting that, although he had been right about spinach’s actual iron content in his 1981 article, it was for the wrong reasons.

So what is the deal with spinach? Is it a good source of iron, or not?

While spinach isn’t the end-all, be-all of iron you might have been taught it was, it does contain more iron than many other vegetables and can be a good source in your diet. A 3.5-ounce (100-gram) serving of raw spinach has 2.7 mg of iron, which is about 15% of the recommended daily value. Not bad overall, but if you’re looking to add more iron, you can also get it from shellfish, legumes, and red meat. Your best bet, though, is to talk with your doctor about the right way to increase your iron intake.
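
In case you’re wondering where that 15% figure comes from, it’s measured against the standard 18 mg daily value for iron used on U.S. nutrition labels (your own target may differ, which is another reason to check with your doctor):

2.7 mg ÷ 18 mg = 0.15, or about 15% of the daily value per 3.5-ounce serving.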

Is Swallowing Chewing Gum Harmful?

A woman blows a bubble with chewing gum. (Prostock-studio/Shutterstock.com)

“Don’t swallow your gum,” your parents, teachers, or caregivers may have told you when you were a kid. “If you do, it’ll take your body seven years to digest it!”

Thankfully, that’s actually a myth. Generally speaking, you won’t come to any harm if you swallow your chewing gum—although it’s still not recommended that you swallow it on purpose or make a habit of doing so. (Too much of it can cause blockages in your digestive system, so, you know, chew cautiously.)

Humans have been chewing gum for millennia—first in such forms as spruce resin, birch tar, and chicle, and later, from the 19th century onward, in the form of commercial chewing gums based on these and other naturally occurring substances. These days, most chewing gums are made from synthetic rubbers. And it’s true that our bodies don’t have the means to digest these substances; as gastroenterologist Nancy McGreal, MD, noted at Duke University’s Duke Health Blog in 2013, “Our bodies do not possess digestive enzymes to specifically break down gum base.”

But that doesn’t mean our bodies can’t deal with these substances; they just do so in a… slightly different way. As Healthline puts it, “Your digestive system is built to digest what it can and pass anything that can’t be digested in your stool.” It’s not even rare for this to happen—our bodies can’t fully digest corn, either, and, well… you don’t see anyone warning people that if you eat corn, it’ll take your body seven years to digest it, do you?

The upshot is this: As long as the piece of gum you’ve swallowed isn’t absurdly large, it should pass through your system without incident. For reference, it usually takes between two and five days for food to make its way through your body from start to finish, according to the Mayo Clinic.

Does Cracking Your Knuckles Cause Arthritis?

A woman cracks her knuckles. (Andrey_Popov/Shutterstock.com)

Cracking your knuckles regularly might be annoying to those around you, but contrary to popular belief, the habit won’t give you arthritis. Numerous studies have explored a potential link between habitual knuckle cracking and the occurrence of arthritis—and the vast majority of them have found no link whatsoever.

For example, in an informal study using himself as a subject, Donald L. Unger, MD “cracked the knuckles of his left hand at least twice a day, leaving those on the right as a control” for a whopping 50 years; then, at the end of the 50 years, he compared the two hands and found neither arthritis nor any other “apparent differences” in either one. Unger described his experiment in a letter to the editor of the journal Arthritis & Rheumatism in 1998.

A study of one isn’t much, of course—but Unger’s experiment is far from the only one to have been done on the subject. Most of the others have had similar results; indeed, a 2011 study published in the Journal of the American Board of Family Medicine found that, in a pool of 215 participants split into two groups—those who cracked their knuckles and those who didn’t—the prevalence of arthritis was similar in each: arthritis occurred in 18.1 percent of the participants who cracked their knuckles and 21.5 percent of those who didn’t.

However, just because cracking your knuckles doesn’t cause arthritis doesn’t mean that it won’t have other adverse effects on your joint health. According to a study published in the journal Annals of the Rheumatic Diseases in 1990, for instance, in a group of 300 participants, people who cracked their knuckles regularly were more likely to have hand swelling, as well as weaker grip strength than their non-knuckle-cracking peers.

For the curious, the actual sound produced by cracking knuckles is “caused by bubbles bursting in the … fluid that helps lubricate joints,” the Harvard Health website notes. “The bubbles pop when you pull the bones apart, either by stretching the fingers or bending them backward, creating negative pressure.”

Should You Starve a Fever?

A woman blows her nose on the couch. (Prostock-studio/Shutterstock.com)

“Starve a fever, feed a cold,” the old saying goes—meaning that if you’ve got a fever, you should avoid eating in order to help your body fight the illness, while if you’ve got a cold, you should go take a trip to the snack cupboard. Is it correct, though? Not so much.

The belief that starving a fever is the way to go is thought to have been first recorded in 1574 by John Withals in a dictionary, where Withals noted that “fasting is a great remedy of fever.” And, as Rachel Sugar observed at Bon Appétit in 2018, “That does make some intuitive sense, sort of. Fevers produce heat, and heat takes energy. So if you don’t give your body energy to produce the fever, then the fever should go away.” Sugar went on to note that, according to a piece published on Smithsonian Magazine’s website in 2009, colds at the time were thought to have been “caused by a drop in temperature,” with food being required to “stoke the fire” and bring that temperature back up.

These days, though, improved medical knowledge has shown that the old “Starve a fever, feed a cold” adage should really just be, “Fever? Cold? Feed ’em both.” Mark Fischetti summed up exactly why at Scientific American in 2014, writing that feeding a cold makes sense, as “when your body fights an illness, it needs energy”; ergo, “eating healthy food is helpful.” Eating can also generate heat, helping you combat any illness-induced chills you might be feeling.

But Fischetti noted that fighting a fever takes fuel, too. “Fever is part of the immune system’s attempt to beat the bugs,” he wrote. “It raises body temperature, which increases metabolism and results in more calories burned; for each degree of temperature rise, the energy demand increases further.” You get that energy from food—so, as Fischetti put it, “taking in calories becomes important.”

Being sick can sometimes rob you of your appetite, but it’s important to keep eating food and drinking water when you’re not feeling well—even if it’s the last thing you want to be doing. At least it’ll help you get better, right?


If you’ve heard any of these old health myths and wondered whether they were true, you can now relax a little the next time you get the urge to crack your knuckles.




