Is drinking extra water good for your skin?

By
January 21st, 2013

The idea that you’ll have a better complexion if you stay hydrated is so commonplace it’s surprising to discover the lack of evidence to back this up.

If you yearn for smooth skin that glows with youth, the chances are that at some point you will have heard the exhortation to drink lots of water in order to flush out those evil toxins and keep your skin healthy.

The exact amount people suggest varies. US-based advice tends to recommend eight glasses a day, while in hotter climates people are advised to drink more to compensate for higher rates of sweating. But regardless of the exact volume of water suggested, the principle behind the advice remains the same – taking extra water on board will keep your skin hydrated. In other words, water acts like a moisturiser, but from the inside out.

This idea is so widespread that the lack of evidence behind it comes as a surprise. You might expect countless studies in which people are separated into two groups, one assigned to sip water all day, the other to drink a normal amount, with the smoothness of their skin assessed a month or so later to establish whether sipping more led to smoother skin.

In fact such studies are rare, partly because water can’t be patented, so it is hard to find anyone to fund such research when there will be no new medication or cosmetic to sell that could repay the costs. A review by the dermatologist Ronni Wolf at the Kaplan Medical Centre in Israel found just one study looking at the effect of long-term water intake on the skin. But the results were contradictory. After four weeks, the group who drank extra mineral water showed a decrease in skin density, which some believe suggests the skin is retaining more moisture, while those who drank tap water showed an increase in skin density. But regardless of the type of water they drank, it made no difference to their wrinkles or to the smoothness of their skin.

That’s not to say that dehydration has no effect on skin. One measurable effect is skin turgor: how long the skin takes to return to normal after you pinch some of it and lift it up. If you are dehydrated, your skin takes longer to get its shape back.

But it doesn’t follow that because drinking too little water is bad for the skin, drinking above average quantities is good. It would be like saying that because a lack of food leads to malnutrition, overeating must be good for us. Or as Wolf puts it, it’s like saying a car needs petrol, therefore the more petrol the better.

Mystery advice

Another common belief is that if you drink extra water the body will somehow store it. But it depends on how fast you drink it. Drink several glasses within a fifteen-minute period and you will just pass extra urine. If you spend more than two hours sipping the same amount, more liquid is retained.

There is one study suggesting that drinking 500ml of water increases the blood flow through the capillaries in the skin. But the skin was only evaluated thirty minutes after drinking the water, and what we don’t know is whether this in turn improves skin tone.

One counterargument is that skin contains up to 30% water, and this helps it to look plump. This may be true, but the skin’s youthful appearance is affected more by factors such as genetics, exposure to the sun and damage from smoking.

So the mystery is where the eight-glasses-a-day recommendation for good skin comes from. Few of the official guidelines even refer to the skin. Water is undoubtedly the most important nutrient for the body. Without it we die in a matter of days, and there are of course other health benefits from staying hydrated. A review in 2010 found good evidence that it reduces the recurrence of kidney stones in those who have already had them, but evidence for other specific benefits is weaker.

Arguments rage over the eight-glasses-a-day rule, with disputes over how much is needed to clear the kidneys of toxins and whether or not water helps curb the appetite. The answer depends on how high the ambient temperature is and how much you are exerting yourself. It’s also a myth that other liquids don’t count: it doesn’t have to be water. Even food contains more liquid than you might expect; pizza, for instance, is 40-49% water. The percentage of water we derive from food depends on where you live. In the US it’s 22%; in Greece, where people eat more fruit and vegetables, it is much higher.

So the problem is a general lack of evidence that drinking more water makes any difference to your skin. We can’t say it definitely doesn’t work, but there’s no evidence that it does. Which leaves the question of how much water you should drink. Since it depends on the weather and what you are doing, there is a very good internal guideline we all have that can help. And that’s thirst.


You can hear more Medical Myths on Health Check on the BBC World Service.

When the holidays turn depressing

By Dr. Charles Raison
January 21st, 2013

When the holidays bring heartache instead of joy, I think they do so because they stand as an unforgiving yardstick against which we measure our losses and troubles.

If no one reminds us, we can sometimes overlook the fact that loved ones are gone, or that our lives are filled with painful conflict in exactly the intimate areas that should be sources of strength and comfort for us. But then along come the holidays, imposing upon us once again a template for what happiness and interpersonal success is expected to look like.

It can be hard to measure up. It is far easier to overlook the death of loved ones when you don’t have to stare across the holiday table at their empty places. It is far easier to pretend that family trauma or conflict doesn’t exist when you are far away and on your own.

But the holidays force us to either return to painful family interactions or to fully own our isolation and spend the season alone.

It is a terrible choice. I’ve treated many patients over the years who reliably became depressed during the holidays out of dread of having to interact with their families. On the other hand, the silence of Christmas morning on one’s own carries its own unique pain.

I never cease to be amazed at how often both emotional well-being and mental illness hinge on how we negotiate these types of impossible choices. Because the choices really are often insoluble and the losses are often so actual, we in the mental health professions frequently try to find “a third way” to help people cope. In the end, these “third way” approaches usually come down to helping people reframe their issues so that they seem less hopeless and painful. Or we provide people with medications such as antidepressants to make their brains and bodies less reactive to stress. Or we do both.

Reframing

I’ve given many interviews over the years regarding strategies for helping people cope emotionally with the holidays. For people truly overwhelmed, I often recommend exploring ways to neutralize Christmas negativity by changing how they approach the holidays. For example, if someone develops a major depression every year before or after going home to see her family, I encourage her to explore what would happen if she abandoned this painful pattern and instead proactively planned a Christmas vacation somewhere beyond the reach of her memories and holiday associations that generate symptoms of depression.

Sometimes this type of strategy works beautifully. Often other family members are equally miserable and join the exodus, providing strength in numbers. Sometimes the person’s absence leads the family to re-evaluate itself and change in positive ways. But sometimes, the attempt to flee Christmas is met with such anger and guilt production from the family that the patient actually ends up doing worse. Everyone’s holiday situation is unique.

This type of approach toward reframing Christmas follows what I sometimes call the “who says” rule. Many times we torture ourselves with ideas of how things should be, or would be if we were somehow smarter, richer, different. To which I often ask, “Who says?” “Who says things have to be the way you think they should be?” “Who says you have to suffer over a painful fantasy of what you think Christmas ought to be?”

We cling tightly to our fantasies — good and bad. But sometimes when we can loosen their grip on us, we can see new possibilities for how to be at peace with our lives and find a little joy.

Medications

This holiday season I’ve been thinking a lot about Christmas 1987, because it was four days later, on December 29, that fluoxetine, better known by its brand name Prozac, received FDA approval for use in the United States. The approval of Prozac launched one of history’s greatest runs of “third way” approaches to trauma and loss.

With Prozac came a growing belief that medicines might hold promise as the ultimate solution, not just to clinical depression, but perhaps to heartache more generally.

Having once believed this myself, I find that now, 25 years later, I am far more cautious in my appraisal of what the coming of Prozac actually meant for the world’s emotional well-being. I’ve seen repeatedly with my own eyes how modern antidepressants like Prozac can help depressed people get their lives back. And I’ve seen people who had struggled with negative thoughts and feelings for years find that they were different — and more successful — with the addition of an antidepressant in their lives.

But in the last several years it has become increasingly clear that antidepressants are not, and probably will never be, a cure-all for heartache in any of its forms, clinical or mundane. For one thing, our best current data suggest that antidepressants only work adequately for 40% to 60% of depressed people, with the percentages varying depending on what one thinks of as “adequate.”

More recent evidence suggests that antidepressants can actually worsen depressive symptoms in a sizable minority of people who take them. Perhaps this shouldn’t surprise us. Chemotherapeutic agents that increase the probability of surviving cancer also increase the risk of developing a second cancer in the future. And yet despite this fact, and despite the dread we feel at the mere mention of the word “chemotherapy,” most of us embrace chemotherapeutic treatment when diagnosed with cancer, understanding that despite the manifold limitations and horrendous side effects, it’s the best that we’ve got.

Perhaps the most concerning recent debate in the antidepressant literature revolves around the question of whether taking these medications increases the risk of having a depression relapse when the antidepressant is discontinued. This issue is complex and hotly debated. But if the weight of the evidence eventually suggests that antidepressants carry this risk, it will further complicate the clinician’s task. Still, as with chemotherapy, they’re the best that we’ve got.

Finding more solutions

This year, with Christmas upon us, I am more convinced than ever that we who work clinically or conduct research in the realm of mental health must redouble our efforts to find new and better “third ways” to help deal, not just with clinical depression, but also with the ubiquitous heartache and anxiety that are so prevalent in the modern world.

Although I personally research biological approaches to treating depression, I suspect that part of our movement forward will come from better integrating older forms of wisdom into our treatment protocols. Many wisdom traditions point toward the same thing — that full healing requires not just reframing or biology, but an inner transformation that embraces suffering itself as a means of escape from our suffering.

Thin is in, but fat might be better

By Lisa O’Neill Hill
January 18th, 2013

When Janet Servoss shops for clothes in Orange County, California, she sees plenty of selection in sizes 0, 2 and 4, but fewer in sizes 12 and 14.

“You’re bombarded by it daily,” she said of the message that thin is better. “It’s everywhere.”

But according to a report published in the Journal of the American Medical Association, being thin might not be in your best interest in the long run. The report is drawing strong reaction in the medical community, among proponents who hail its findings and among critics, one of whom dismisses it as “rubbish.”

The comprehensive study confirmed that obese people tend to die earlier than people of normal weight. But it also found that overweight people — those with a body mass index (BMI) of 25 to 30 — had a lower risk of dying than people of normal weight.
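For readers unfamiliar with the measure, BMI is simply weight in kilograms divided by height in metres squared. The sketch below is a rough illustration of the bands discussed in this article; the cut-off values are the conventional WHO categories and the sample weight and height are hypothetical.

```python
# Rough illustration of the BMI bands discussed in the article.
# BMI = weight (kg) / height (m)^2 -- the standard formula.
# Cut-offs are the conventional WHO bands; sample values are hypothetical.

def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2

def category(b: float) -> str:
    if b < 18.5:
        return "underweight"
    if b < 25:
        return "normal"
    if b < 30:
        return "overweight"        # the 25-30 band linked to lower mortality
    if b < 35:
        return "grade 1 obesity"
    return "grade 2-3 obesity"     # BMI of 35 or more, highest risk in the study

b = bmi(70, 1.68)                  # hypothetical person: 70 kg, 1.68 m tall
print(round(b, 1), category(b))    # 24.8 normal
```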

“If I were to look at this study and if it is shown to be true, I would think maybe I should be worrying less that I’m wearing a size 12 and focus on how I feel,” said Servoss, a 44-year-old nurse whose BMI fluctuates from 24 to 26.

Researchers analyzed nearly 100 studies that included more than 2.8 million people. While obese people had a higher risk of death — particularly those whose BMI was 35 or more — overweight people had a 6% lower risk of death than those of normal weight.

“Because this bias against weight has been so prevalent, it’s really been unquestioned, and I think this concept that thin is healthy and fat is not healthy is clearly not true,” said Michelle May, a physician and author of “Eat What You Love, Love What You Eat.”


Some thin people exercise excessively and don’t eat a balanced diet, and there are people in the overweight and obese categories who have good diets and are active, she said.

May said people need to focus on choices about eating and physical activity rather than be concerned about the numbers on a scale.

“I find it interesting that the reason they did this is because this is something that has shown up over and over again. It is challenging to shift a paradigm that has become so deeply entrenched, that being overweight by BMI category automatically puts you at high risk,” she said.

Americans overemphasize the importance of being thin, said professor Glenn Gaesser, author of “Big Fat Lies” and director of the Healthy Lifestyles Research Center at Arizona State University.

“We have had for decades now an obsession with thinness and an obsession with weight and how to lose it,” he said. “I think the forces in our culture — in fashion, in fitness, in health and wellness — all have been predicated on, ‘A thin body is a good body and a fat body is a bad body,’ and that’s wrong. I have always believed that a good, healthy body can come in many shapes and sizes.”

Fat, fit people tend to be better off healthwise than thin people who are unfit, Gaesser said, suggesting that being fit is far more important than being thin.

“I think in general, America is still not ready to accept this notion that fitness comes in many shapes and sizes,” he said. “It’s a good message, but I still think people would rather be thin.”


The study authors say it’s possible that overweight people live longer because they get better medical care and are tested for diabetes, heart problems and other diseases stemming from their weight. Heavier people might also be able to better survive infections or surgery.

While many say the findings make sense, some experts take issue with the way the research was conducted and express concern that it will send the wrong message.

“Of course, a lot of people would like to hear that it’s no problem that they are overweight or obese,” said Walter Willett, professor of medicine at Harvard Medical School and chair of the Harvard School of Public Health’s Department of Nutrition. “It causes a lot of confusion that’s completely unnecessary.”

He called the study “a pile of rubbish.”

Scientists often disagree, said Barry Graubard, senior investigator at the National Cancer Institute and one of the authors of the study.

“We published our findings in the peer-reviewed scientific literature to invite discussion,” he said in an e-mail. “It is by engaging with our colleagues in this manner that science advances.”

BMI is one of three numbers people should watch, according to Willett.

“It’s also useful to look at weight change since age 20,” he said. “That’s going to primarily be fat. The third is your waistline. The vast majority of people will be best off if they do not increase their weight or waistline after age 20.”

Not smoking, eating a high-quality and healthful diet, not being overweight and being physically active all contribute to a person’s health, he said.


While the majority of experts and scientists believe that excess weight is unhealthy, Dr. Kamyar Kalantar-Zadeh, professor of medicine and public health at the University of California, Irvine, takes a different stance. A black-and-white approach to obesity is inappropriate, he said.

“Most experts have problems with this sort of data,” he said. “It’s difficult to see that some of these principles are being questioned.”

Kalantar-Zadeh compares the lack of consensus about weight and fat to the evolution of thought about alcohol consumption.

Alcohol was thought to be detrimental to a person’s health, but studies started to show that alcohol consumption in moderation could have some health benefits, he said.

“The fat-is-bad principle is a very recent approach,” he said. “Body-stored fat has helped us for hundreds of thousands of years to survive hardships. That should tell us evolutionarily there was something good in that.”

A higher body mass index can be protective in certain situations, he said.

“Once you are in your 70s, 80s or 90s, or if you have chronic disease like heart failure, rheumatoid arthritis, chronic lung and kidney disease, a larger body size gives you longevity,” he said.

Not all fat is bad, but belly fat is more harmful than fat in the arms, legs or buttocks, he said.


Fitness expert John Siracuse said people shouldn’t get caught up with numbers.

“I always got people focused on their bodies rather than on a number and make them more aware of their muscle tissue, their shape,” he said. “Are they getting stronger? Faster? Can they pedal longer? If you listen to your body more, you will know the symptoms before your body starts to break down. We tend to forget about our bodies and that’s when we start getting fat.”

Servoss said she can see where a little bit of extra fat could be good for people facing a serious illness, but said the more weight you have, the harder it is on your joints. Excess weight also comes with numerous health challenges, including circulatory problems, high blood pressure and diabetes, she said.

“I think the numbers have their place,” she said. “They do give us some reference but it does ultimately come down to how you are feeling, your exercise tolerance, are you able to do the things you love to do without any difficulty?

“That’s the kind of thing I should be focusing more on, rather than that my jeans come from the back of the rack rather than the front,” she said.

Berries Linked to Lower Heart Disease Among Women

By
January 18th, 2013

The benefits for the heart of eating strawberries and blueberries can build up over a lifetime, according to the latest research.

Bright-colored berries have long been a part of any healthy diet, owing mainly to the anthocyanins that give them their vibrant color and act as antioxidants to fight off damage to cells. Now a study published in the journal Circulation confirms and quantifies that benefit: women who ate three or more servings of blueberries and strawberries per week reduced their risk of heart attack by up to one third.

In the study, researchers from the Harvard School of Public Health and the University of East Anglia in the U.K. analyzed data from 93,600 women ages 25 to 42 enrolled in the Nurses’ Health Study II. For 18 years, the women filled out surveys detailing their diets at four-year intervals.


During the study the women experienced 405 heart attacks. But women who consumed the most blueberries and strawberries had a 32% reduced risk of heart attack compared with the women who ate berries once a month or less. The women who ate more berries also tended to eat healthier overall, consuming more vegetables and fruits than those who didn’t eat as many berries; but when the scientists broke down the women’s diets, they found that the highest consumers of berries even had a lower risk of heart attack compared with women who still ate plenty of fruits and vegetables but fewer berries. The effect remained even after the researchers adjusted for other things that can influence heart-disease risk, such as obesity, high blood pressure, smoking, low levels of physical activity and a family history of heart disease.
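The 32% figure above is a relative risk reduction. As a back-of-the-envelope illustration of the arithmetic, assuming invented group sizes and event counts (only the 32% reduction comes from the study itself):

```python
# Illustrative relative-risk arithmetic. Only the 32% reduction comes
# from the study; the group sizes and event counts below are invented.

events_high_berry, n_high_berry = 17, 18_000   # hypothetical top-berry group
events_low_berry, n_low_berry = 25, 18_000     # hypothetical low-berry group

risk_high = events_high_berry / n_high_berry   # absolute risk, high-berry group
risk_low = events_low_berry / n_low_berry      # absolute risk, low-berry group

relative_risk = risk_high / risk_low           # 0.68
risk_reduction = 1 - relative_risk             # 0.32, i.e. "32% reduced risk"
print(f"{risk_reduction:.0%} lower risk")      # 32% lower risk
```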

“These foods can be readily incorporated into diets, and simple dietary changes could have an impact in reducing risk of heart disease in younger women,” says study author Aedin Cassidy from the University of East Anglia. “This supports growing lab data showing that these compounds can help keep arteries healthy and flexible.”

So what is it about berries that helps the heart? The researchers focused on blueberries and strawberries because these are the most widely consumed varieties in the U.S. Both berries contain high levels of anthocyanins, as well as other flavonoids, which fight the effects of stress and free-radical damage to cells as they age. They can also keep heart vessels more elastic and flexible, which helps combat the growth of plaques that can build up and rupture, causing heart attacks.


The results are particularly encouraging because they showed that a change in diet could affect heart-disease risk for relatively young women. That means that regular consumption of berries might be a relatively easy way to lower a woman’s risk of having a heart attack later in life, possibly even insulating her from heart problems. “Although we know about the effects of antioxidants and flavonoids, and their effects in wine and chocolate, it is interesting to look at their effects in such a large group of women over a long period of time,” says Dr. Suzanne Steinbaum, director of women and heart disease at Lenox Hill Hospital in New York City, who was not involved in the study. “The take-home lesson is that even if you are eating these early in life, you’re getting benefits that last for life. When we’re making choices in our 20s, we may think that a burger and fries is great, but the message is that there are alternatives that make a difference for the rest of your life. It is a powerful message that we can prevent cardiovascular disease by what we eat.” Something worth remembering the next time you’re in the produce aisle.

Alexandra Sifferlin @acsifferlin

Alexandra Sifferlin is a writer and producer for TIME Healthland. She is a graduate of the Northwestern University Medill School of Journalism.

Childhood Trauma Leaves Legacy of Brain Changes

By
January 18th, 2013

Painful experiences early in life can alter the brain in lasting ways.

A difficult reality for psychiatrists and counselors who work with victims of child abuse is that those young victims are at high risk of becoming offenders themselves one day, although it’s unclear why. But now a team of behavioral geneticists in Switzerland report a possible reason: early psychological trauma may actually cause lasting changes in the brain that promote aggressive behavior in adulthood.

Writing this week in Translational Psychiatry, the researchers describe a series of experiments conducted in rats that led them to that conclusion. Animals placed in traumatic, fear-inducing situations around the time of puberty show high and sustained levels of aggression later in life. And while rats cannot substitute for humans, the scared rats also showed changes in hormone levels, brain activity, and genetic expression that appear very similar to traits observed among troubled and unusually violent people.

The main implication of the research, says study co-author Carmen Sandi, is that it links two previously observed phenomena: the higher rate of aggression among those experiencing early-life stress, and the blunted activation of a brain region known as the orbitofrontal cortex among people with pathological aggression. Social learning, it seems, may not be the only thing that makes abused kids more likely to grow up aggressive.

“This is a key finding which highlights the importance of not only developing social programs and politics, but also of reinforcing research that could offer valid [medical] treatments for individuals that have been victimized early in life,” says Sandi, the director of the Brain Mind Institute at Ecole Polytechnique Fédérale de Lausanne, in an email discussing the study. “We need to understand the neurobiological mechanisms to offer better solutions to break ‘the cycle of violence.’”

In the study, Sandi and colleagues tested the rats for changes in specific regions of the brain following long periods of fear, and then tested a potential treatment to determine if it was possible to undo those brain changes.

They began by exposing about 40 pubescent male rats to severe stress, which for the rats meant either the scent of a fox or being stranded on a brightly lit platform, for a few minutes at a time over several days. Those rats immediately showed higher levels of stress hormones, and later onset of puberty, than similar animals not exposed to those experiences.

As adults, the stressed rats showed greater aggression toward other males they met — even ones that were clearly not a threat because they were much smaller or even anesthetized. And the once-stressed animals also showed more signs of depression and anxiety, including a reduced interest in food, lower sociability, and a tendency to give up quickly when faced with a challenge.

Those behavioral changes were accompanied by neurobiological changes in the brain as well. Compared to normal rats, the once-fearful ones had higher levels of the hormone testosterone, which is linked to aggression. They also showed more activity in the amygdala, the part of the brain responsible for emotions such as fear and anxiety, and altered connectivity between the amygdala and a region of the brain involved in decision-making. These brain alterations were also correlated with enhanced expression of the gene for an enzyme known as monoamine oxidase A, or MAOA, providing the scientists with a potential way to reverse the effects of the early traumas. Indeed, treating the rats with an MAOA inhibitor helped to restore normal social behavior and reduce aggression in the formerly stressed animals.

It turns out that the MAOA gene is also related to aggressive behavior in people, and certain inherited variants of the gene have also been linked to aggressive tendencies. Because the new study showed that MAOA inhibitors were effective in treating pathological aggression in rats, Sandi says the findings might suggest a similar drug treatment for humans, too, to complement behavioral therapy.

“What we show in our study is that, regardless of the genetic background, exposure to early life trauma can on its own affect the expression levels of this molecule in the brain,” Sandi says. “Our work is novel in many ways, particularly because it provides concrete neurobiological pathways that link early trauma with pathological aggression.”

Why would early traumatic experiences create permanent changes in the brain? Evolutionarily, such brain changes may have helped us to survive a harsh and cruel environment, by keeping us on edge and ready to confront any possible threats, Sandi says. Today, however, those same changes may do more harm than good, leading some victims of abuse to slip into a vicious cycle, seeing threats where none exist, and overreacting to situations, often with violence. It’s possible that some people may be genetically more sensitive to the changes triggered by painful experiences, and therefore more likely to benefit from treatments that can address those genetic differences. Better understanding of why vicious cycles of violence exist may help researchers to find ways to break them.