Why healthy people shouldn’t be taking vitamin D pills

Should we all be popping a pill to stave off vitamin D deficiency? That appeared to be the message last month from the UK government via Public Health England, which published new guidelines on taking supplements. After several years of work and wide consultation with over 20 groups, from public health doctors and scientists to the food and vitamin industry, it produced a document saying that most of us should probably take a tablet in autumn and winter, as the British sun is too weak for our skin to make vitamin D, the main source of the vitamin. A recent report had suggested that one in five Britons had blood vitamin D levels that were dangerously low. So far this advice has been well received. But is the argument clear-cut?

The report advised that some people could still maintain their levels all year round by going in the sun for 15 minutes a day and eating foods rich in vitamin D such as oily fish like tuna and salmon. The advice on sunshine is welcome and overdue as, despite past pressure from scare-mongering dermatologists, the benefits of being outdoors far outweigh the risks.

But, given that most Britons don’t eat fish regularly and spend most of their time indoors, are we destined for a life of supplements?

I’ve been studying and treating osteoporosis as a consultant for over 30 years and was always happily prescribing calcium and vitamin D, thinking they worked safely. They were also popular as patients were happy to take vitamins rather than ‘real’ medicines with side effects.

My practice changed recently when evidence suggested that calcium supplements, as well as being ineffective against fractures, may cause heart disease.

I have also started to see increasing numbers of patients with very high vitamin D levels (over three times normal), something we never thought was a problem. Many patients take the prescribed dose on top of other sources of the vitamin, such as cod liver oil tablets or fortified milk, orange juice or bread.

Recently, however, several randomised trials have reported that patients with high blood levels or taking large doses of vitamin D (above 800 IU) had an unexpected increased risk of falls and fractures — the opposite of what was intended.

While the new recommendation for supplementation is modest in terms of dose (10µg, or 400 IU, a day), it will be overdone by some, and it reinforces the misguided view that supplements and food sources are interchangeable.

The report acknowledges that the evidence for vitamin D use in non-bone diseases is weak. Recent reviews have concluded that the vast majority of the published studies linking vitamin D to 137 different diseases and outcomes (some written by me) were spurious. But the report also assumes that the data for vitamin D use in preventing osteoporosis and fracture is rock solid. Sadly, despite strongly held views, it is not. Several papers have reviewed the evidence from 31 studies, and the independent Cochrane review team found no overall effect on fractures. (They reported a small effect on hip fracture, but you had to treat 1,000 elderly people to prevent one fracture.)

The evidence for vitamin D is slightly better for the elderly in care homes on poor diets, where some early trials were successful, although, strangely, it has no clear benefit for muscle strength or mobility.

With one in five Britons apparently low in vitamin D, should we be worried? About half the variation in blood levels between people is genetic rather than related to diet, disease or sunshine. Many GPs are now routinely screening patients and diagnosing deficiency or insufficiency. But is this really a disease?

About 25 years ago experts got together to redefine the normal ranges based on the tipping point of different blood hormones. This nearly doubled the level you needed to be ‘normal’ and effectively created millions of new patients overnight. Some of these experts now realise that this may have been an error as these theoretically abnormal levels have never been significantly related to any adverse health outcome. We have unfortunately created another pseudo-disease that is encouraged by vitamin companies, clinicians, food manufacturers and charities.

Most healthy people should get vitamin D from small doses of sunshine every day and trust that thousands of years of evolution would have dealt with the fact that in northern climes our vitamin D level would naturally drop in winter without us snapping our limbs. Our ancestors probably had more diverse diets and more foods high in vitamin D than we do: oily fish such as salmon, tuna, mackerel and herrings, but also full-fat cheeses (especially ricotta, for some reason), as well as eggs, butter, liver and mushrooms.

The vast majority of vitamins and multivitamins taken by millions of us daily have been shown to be ineffective and often harmful. While vitamin D treatment still has a role in people with proven deficiency or in the infirm elderly who are high-risk, most of us should focus on having a healthy lifestyle and a diversity of real food. Unless essential, we should avoid artificial chemicals with adverse effects — even if they come disguised under the friendly name of vitamins.

Tim Spector is professor of genetic epidemiology at King’s College London and author of The Diet Myth: The Real Science Behind What We Eat.


  • jeremy Morfey

    I learnt at school that Vitamin D prevents rickets and helps with bones.

    I would like to hear from the eminent professor though what effect Vitamin D has on metabolism and energy levels.

  • Supplements are just that…supplements. Your best option is safe, sensible sun exposure without burning. Have your vitamin D levels tested every 6 months. You can never get enough D from diet alone. See our FB site to learn more about moderate UV exposure.

    • pauliew

Getting vitamin D tested is fine, but testing is not available on the NHS: the patient has to pay. It seems wrong that we are being advised to take vitamin D supplements yet we won’t be tested to see if we actually need it.

      Generally I think the NHS is poor at testing. Some minerals are kept in balance in the blood so would show up normal. Yet cellular or other tests may show deficiency. Cost is obviously a big issue but I wish GPs would make patients aware of limitations and inaccuracies in testing.

  • Doc Mills

We should all be getting our daily vitamin D from sunshine, but for the last 30 years Public Health have waged a war on sun exposure because of concern about skin cancers, without any RCTs to prove that this is an effective way to prevent melanoma (emerging studies show that sunbathers get less – http://onlinelibrary.wiley.com/doi/10.1111/joim.12496/abstract;jsessionid=652D493B60C84DB88D741FF2C91C81FB.f04t02).

  • helga rhein

Don’t forget us, living in Scotland. Here the average vitamin D blood level is 37 nmol/l. Modern science says a healthy level is around 100 nmol/l, and people living outdoors around the equator without much clothing have an average of 120 nmol/l. When children are very deficient they get rickets, which can be cured with supplements. When adults are very deficient they get osteomalacia, which can also be cured with supplements. Then they feel much better, are less depressed, and have fewer aches and pains and fewer colds. It’s a spectrum running from the symptoms of minor deficiency to those of extreme deficiency, with a range of diseases that appear to develop depending on the length and severity of the D deficiency. You certainly cannot just do a simple RCT when it has taken maybe three decades of deficiency to produce a certain condition. In our climatically challenged Scotland all people should take a supplement; their average levels are unhealthily low.

    • Isaiah

Forget the pills; the spray is really easy to use.

  • disqus_pFomH2XM20

Vitamin D regulates gene expression not just for calcium metabolism but for many other processes. One has to consider why this is the case: why would animals have evolved to change gene expression depending on the sunlight the body receives? We should assume by default that this system functions in an optimal way.

So, if certain genes are going to be switched on or off when an animal’s vitamin D level falls, then this is presumably an optimal choice the body makes, assuming that this happens under natural circumstances. Now, when vitamin D levels are falling under natural circumstances, that means the Sun is low in the sky or the animal has been burrowing underground (e.g. to escape the consequences of a prolonged drought). Under such natural conditions, where vitamin D levels would be low, food will be scarce or will become scarce later (in autumn vitamin D levels start to go down, but there may still be plenty of food around for a while). It’s then quite logical that the body needs to be reprogrammed in how it deals with its energy metabolism. For example, maintaining muscle mass and fitness will have a lower priority, while building up and maintaining fat reserves will have a higher priority. You may also feel more tired after exercise, to discourage you from wasting energy. This way you’ll be in much better shape when winter arrives a few months later, compared to going about your business the same way as in summer.

This is why I take 8,000 IU of vitamin D a day when I don’t get sun exposure, eat as much as I can (4,500 kcal/day, only healthy foods, 600 grams of vegetables, 80% of energy from whole-grain carbs) and exercise as much as I can (75 minutes of running every day). I weigh 52 kg, so I have a huge vitamin and mineral intake per unit of body weight. This is how a healthy person is supposed to eat and exercise. But if you are vitamin D deprived, don’t get enough exercise and start to eat the “recommended” 2,000 kcal/day, you’ll starve your body of vitamins and minerals into becoming a miserable couch potato.