The heresy: the sun can be healthy. The heretic: Dr. Michael Holick of Boston University, a seemingly gentle but combative scientist-physician who studies the beneficial effects of Vitamin D, which our skin produces when exposed to the sun.
Report a story about Holick’s research and a reporter can expect to get, as I did, a rocket sent in the name of the president of the American Academy of Dermatology (AAD) alleging that the information endangers America’s health.
This battle is not about facts.
It is more akin to the arguments that erupt over advice that a drink or two a day can help the heart. If we give those prone to excess an excuse, the argument goes, we’re handing them a ticket to it.
To understand the debate, one must appreciate the AAD’s enormous success in recent years persuading Americans either to avoid the sun altogether or to slather on plenty of sunblock if they can’t. Then along comes Holick, alleging that the campaign has gone overboard, leaving millions of Americans Vitamin D deficient.
Not long ago, medical wisdom held that Vitamin D deficiency only matters if it is severe enough to produce rickets, a horrible disintegration of the bones seen in children living in severe poverty. But research by Holick and others in recent years shows that Vitamin D plays a key role in preventing osteoporosis, the bone thinning that often occurs with aging.
In addition, every cell and tissue in the body requires Vitamin D, so a lack of it can increase the risk for conditions including heart disease, breast and prostate cancer, and high blood pressure.
Many experts now say we need at least 1,000 international units a day of Vitamin D, and it is almost impossible to ingest that much from the typical American diet. Large doses of supplements or moderate sun exposure are the alternatives. One can argue the sun is the far more natural alternative.
Even Holick’s critics agree his science is sound. But that did not stop the dermatologists from pressuring him to resign from the dermatology department at Boston University. (He remains on the faculty in the endocrinology department.)
“The concern,” argues Dr. Thomas Kupper, a dermatologist at Brigham and Women’s Hospital who speaks for the AAD, is that if “people hear 10 or 15 minutes is OK, then a little more is better and then 30 to 40 minutes becomes an hour and then an hour-and-a-half.”
Holick’s response: “They’re promoting abstinence and abstinence campaigns usually don’t work.”
The argument gets even trickier when we consider how dangerous the sun really is. There is no doubt that sun exposure increases the rate of basal and squamous cell carcinomas. Both are forms of skin cancer. Having them removed, often repeatedly, can be bothersome and even disfiguring, but they almost never threaten your life.
With melanoma, the potentially deadly skin cancer, the role of the sun gets murkier. Research shows that people who build up and maintain a constant tan, such as those who work outdoors, are less at risk of melanoma than those who get sudden, rapid exposure. A history of sunburns can be especially dangerous.
A review article in this week’s New England Journal of Medicine concludes that the strongest risk factors for melanoma are a family history, multiple “nevi” (skin lesions that can become melanoma) and a previous history of the disease. Exposure to ultraviolet light, the harmful rays from the sun, is a more distant “additional risk factor.” People often get melanoma on parts of the body never exposed to the sun.
Scientists are pinning down the specific genes that make up that family history. Just last week, researchers from the National Cancer Institute and other institutions reported in the journal Science that inherited variations in a gene called MC1R can increase a person’s risk for melanoma up to 17-fold. In the not-too-distant future, we may have blood tests to reveal who is truly at risk for melanoma from sun exposure.
Meanwhile, there is no escaping the data showing that a little sun exposure is actually beneficial, no matter what the dermatologists say.