I've spent a sizable portion of the past year trying to find ways to live healthier, and the thing I have found so discouraging is how complicated it is. That I need daily exercise is obvious, but which exercises should I do? Of course I should eat healthier, but what is actually good for me to eat? These are complicated questions with complicated answers, and living in an industrialized country seems only to make them more complex.
But the area I've had the hardest time even figuring out how to investigate is a seemingly simple question: "vitamin supplements, good or bad?"
For most of my life I've been told that our bodies often don't absorb vitamins when they're ingested in supplement form. That is to say, vitamin supplements are a scam, an easy way for companies to sucker fools out of their money for useless pills. I've even heard it claimed that taking vitamins can be detrimental to your health.
But I've also heard people praise vitamin supplements. Obviously, if you're eating a diet of processed foods, you're not getting all the nutrients you need no matter how much you eat, and if taking a vitamin could help, it's a wonder why anyone wouldn't take one. I also found a statistic claiming that "72 percent of physicians and 89 percent of nurses used dietary supplements and that 79 percent of physicians and 82 percent of nurses said that they recommend dietary supplements to their patients."
TL;DR: Whether or not I should be taking vitamin supplements is a big question, and since I'm just at the start of diving into it, I thought it might be helpful to ask others what they know. Do you take vitamin supplements? If so, what do you take? Are you adamantly against them? Do you know any good books or other sources I could investigate for answers?