If you want to skip the misery that comes with fighting a seasonal cold or flu, new research explains why sleep is some of the best preventive medicine.
We already knew that not getting enough sleep can lead to an increased risk of getting sick, but Nathaniel Watson, a neurologist and sleep specialist at the University of Washington School of Medicine, said this new research helps explain why.
Sleeping poorly can block specific genetic processes in the cells that make up your immune system, which is responsible for fighting off infections and disease, according to the new study.
“Your immune system is not functioning the way it was meant to when you’re sleep deprived,” Watson said.
This study is the first one that Watson and his colleagues are aware of that looks at what happens to the immune system’s DNA when you’re not getting adequate sleep.
“It’s further evidence of how important sleep is to human health and physiology,” Watson said.
Just an hour of lost sleep can cause cellular damage
The researchers followed 11 pairs of identical twins for the study. One twin reported sleeping at least seven hours per night, while the other slept approximately one hour less per night.
Looking at identical twins helped control for the fact that sleep needs vary by person, Watson explained. Genes account for about 50 percent of our sleep needs, meaning identical twins are the best-case scenario for getting a good comparison.
Each study participant wore a movement-tracking sleep monitor for two weeks, which confirmed that one twin in each pair slept, on average, one hour less than the other. (Total sleep time also included any daytime napping.)
The researchers took blood samples at the end of the study, which revealed that the immune system of the twin who slept less was less active than that of the twin who slept more. The shorter sleepers were actually making fewer proteins, the molecules that our bodies run on.
“They had an underperforming immune system,” Watson said of the shorter sleepers, “which would put them at higher risk of getting sick.”
To control for other potential factors that could affect sleep need and immune health, the researchers excluded people from the study who had diabetes, depression or other mental health problems, or sleep disorders. They also left out shift workers, smokers, drug users and drinkers.
The big takeaway for individuals is that getting good sleep ― quality as well as quantity ― is a really important element of human health, Watson said.
“Add risk of infection to the myriad reasons why sleep deprivation is bad for you,” he said ― a list that already includes such issues as reduced performance during the day, depression, hypertension, cardiovascular disease, diabetes and irritability.
1. The Hole We Poured Our Hearts Into
A few times in life, if we are lucky, we meet an opportunity that seems so right for us that we will pour our hearts out to get it. A life-changing something that will keep us up late, get us up early and make us dig deep to give our best 100 percent—and then go back to give more, just for insurance. When these big things don’t pan out, when we don’t get into the college, or make the team, or get the grant, or win the part, we are left with a wound in the shape of the thing we made a part of us, just by wanting it so badly.
But the hole fills—eventually, in ways we can’t expect. We learn so much by giving our all that we are forever changed. We learn our limits and how to push through them—if only to help us with our next dream.
2. The Bruise of Rejection
As a kid, I was a train wreck as an athlete. I was gangly, uncoordinated and nervous. And I fell…a lot. By two weeks into the school year, everyone had learned the truth, and I became the very LAST kid to be picked—every time. Ouch. Ouch. Ouch.
My husband says that’s nothing compared to asking a girl on a date and getting turned down or, worse, laughed at.
Part of life as a grown-up is being turned down and turned away. But early rejection can batter our ever-fragile egos in ways that could leave us playing injured or sitting on the bench throughout our lives. Being picked last for the fourth-grade kickball team, not being picked for the first summer job, or being dumped by the love of a lifetime are differently shaped bruises, but they hurt in the same place.
In living to tell about it, though, we learn not only how to withstand rejection but also how to hold and extend empathy for others who have been dismissed, overlooked or otherwise rejected.
3. The Coulda-Woulda-Shoulda Mistake We Can’t Forgive Ourselves For
In our memory, it was perfect: that job, that relationship, that apartment. Except we goofed. We made a tiny error (on something we’d gotten right a thousand times before!). Or we flamed out massively due to one small miscalculation (if we’d just waited five more minutes…). And it aches every time we think of what we should have, could have and would have done differently to get and keep it.
The loss can pack a sting for years to come. Because, it seems, when we get something similar to the perfect thing that got away, we often can’t help but mess it up. We compare the real, wonderful thing we actually have to the imaginary perfect thing that got away, and we feel shortchanged.
This is a wound that hurts not only us but also the unsuspecting people who care about us. Imagine how it feels to be compared with the one that got away…being told how much better it was or could have been. This is one of those wounds it helps to be aware of so we can be sure to treat the people in our lives like grand prizes and first choices—and not like backup prom dates.
4. The Secret Under the Band-Aid
A neighbor friend often has a Band-Aid on the back of her hand, even though she has no sore there. “A bandage reminds me to protect my wounds. Today, my sore spot is my sister, who is fighting cancer,” she told me. “Wow,” I thought. Instead of trudging through the day with free-floating pain, wondering why she felt a bit fragile and tender, she just marked the wound and treated herself with TLC.
“And maybe it’s all in my head,” she said, “but wearing a bandage also makes me feel like other people are cutting me a little more slack.”
There it is, friends. How different might the world be if we saw one another’s sore spots? What if we remembered—even without seeing a bandage—that we are all walking wounded, and that a little attention, kindness and caretaking is always just what the doctor ordered?
(Reuters Health) – Following a diet that mimics fasting may reduce risk factors for disease in generally healthy people, according to a small study.
Dr. Min Wei of the University of Southern California’s Longevity Institute and colleagues tested the effects of the fasting-mimicking diet on various risk factors for diabetes, heart disease, cancer or other conditions.
The diet (FMD; brand name ProLon) is low in calories, sugars and protein but high in unsaturated fats. Forty-eight study participants ate normally for three months, while 52 followed the FMD for five days each month and ate normally the rest of the time. After three months, the groups switched regimens. Although all participants were considered healthy, some had high blood pressure, low levels of “good” cholesterol, and other risk factors.
A total of 71 people completed the study, which was published in Science Translational Medicine. Body mass index, blood pressure, blood sugar and cholesterol improved with FMD, but mainly for those who were already at risk. Side effects were mild, including fatigue, weakness and headaches.
Wei and Dr. Valter Longo of the University of Southern California said in an interview published in the journal that while “the great majority” of participants had one or more risk factors for diseases such as diabetes, heart disease or cancer, “FDA trials will be necessary to demonstrate whether periodic FMD is effective in disease prevention and treatment.”
Dr. Joseph Antoun, CEO of L-Nutra, Inc., which produces FMD, told Reuters Health by email that FMD “is intended for use by individuals who want to optimize their health and wellbeing, by overweight or obese individuals who want to manage their weight in an easy and healthy way, and by people who have abnormal levels of biomarkers for aging and age-related conditions.”
That said, Antoun acknowledged that if you have common conditions associated with overweight and obesity such as diabetes, cardiovascular disease and cancer, you should not use FMD without a doctor’s approval.
The product also should not be used by children under 18 or pregnant or nursing women. And it’s not for you if you have certain metabolic diseases, liver or kidney disorders that may be affected by the very low glucose and protein content of the diet, or if you have nut or soy allergies. What’s more, it “should never be combined with glucose-lowering drugs, such as metformin or insulin,” according to Antoun.
Registered dietitian Ashlea Braun of the Ohio State University Wexner Medical Center in Columbus pointed out that researchers compared the fasting-mimicking diet to participants’ usual diet. “Therefore, we don’t yet know how this diet stands up against long-standing approaches already shown to be beneficial, such as the Mediterranean or DASH Diet.”
“It’s not clear if (FMD) enables individuals to consistently meet all micronutrient requirements,” she told Reuters Health by email. “It’s also not known how this type of restrictive diet affects muscle mass in the long term, and what impact this has on various indicators of health.”
“Although there is some evidence showing these type of restrictive diets can help ‘jump start’ people considering lifestyle changes, more research is definitely needed before this is recommended for individuals,” Braun concluded.
My battle with acne started in middle school. It’s better than it used to be, and I’m much better at concealing it, but breakouts still happen even in my late 20s. While I’ve tried a lot of products aimed at zapping my zits, and some have worked more successfully than others (salicylic acid face wash, I love you), I’ve never tried to overhaul my diet for better skin.
There’s reason to believe that it would help, especially when it comes to dairy. Let me back up for a second. There’s no ironclad consensus among dermatologists that eliminating dairy will magically give you the complexion of a skincare spokesmodel. Some studies have found that skim milk, specifically, might be an acne contributor. (The exact mechanism isn’t clear, but it could be that skim milk increases a hormone that then ramps up oil production in your skin.) Other, less conclusive research has found links between acne and any type of milk. Dermatologists who do advise blemish-prone patients to eliminate milk say you should go all in, getting rid of any dairy in your diet. For me, if there was even a chance that it would diminish my acne, I was willing to try.
Except, “challenging” doesn’t quite cover how difficult this was going to be. In college, string cheese was basically a food group for me, and though I’ve cut back (the “didn’t you just buy that pack two days ago?” looks from roommates, and now, my fiancé, forced me to re-examine my snack choices), I’m still a frequent cheese, ice cream and sour-cream eater. (My last meal would probably be mozzarella sticks.)
For the sake of my skin, though, I went cold-turkey off dairy for three weeks. Here’s how that went.
T-Minus 1 Day
First, one last pint of cookies and cream ice cream. Then, some research. I have a decent knowledge of nutrition—I can name surprising foods that are full of sugar, and tell you what to eat if you want to add more fiber to your diet. But I’m surprised to see what foods may contain dairy. Salad dressings, granola bars, crackers, bread, deli meats?! I look up no-dairy diet rules online and see butter on some no-no lists, while others say it’s okay. Small amounts of butter are a cooking staple in my apartment, so I decide to allow it (sorry not sorry).
My normally delicious cups of tea in the morning are not so delicious without milk. New resolution for tomorrow: Remember to bring something, anything, nondairy to put in my tea. I also realize the no-dairy thing will make it difficult to dine out or eat anything that I don’t prepare or witness being prepared. I normally buy my breakfast in my office building’s cafeteria, and while I’d never wondered if my daily oatmeal was made with milk instead of water, I’m definitely wondering now. I want to ask one of the cafeteria staff, but I also don’t want to be that person who’s asking whether food complies with their totally non-life-threatening, completely self-imposed dietary restrictions. So I don’t ask. Later, I look up the food-service company that runs our cafeteria to see if I can find nutritional information on their website—no such luck.
I forget the almond milk again. So, I switch to icing my tea, which tastes slightly better than hot milk-free tea. I decide that instead of trying to swap normal dairy sources for nondairy versions (e.g., vegan cheese—I don’t know what it’s made of and I don’t want to find out), I’ll just change my diet accordingly. Milk-free tea? Okay. Pasta without Parmesan cheese grated on top? No fun, but fine.
My fiancé is breading chicken for dinner and I say yes when he asks if he can put just a splash of milk in the egg wash to thin it out. This brings me to one of the complicating factors in this whole restrictive diet thing: My fiancé and I cook and eat dinner together pretty much every night, and this is a guy who never turns down cheese and makes himself a chocolate milk for dessert more nights than not. When I told him that I was doing this story, and that I would understand if he didn’t want to do dairy-free dinners for three weeks with me, I was thrilled when he said he was totally on board. That lasted until about day three, when he decided that he’d support me, but he was done with his own experiment.
Today was a long day and when I get home, the last thing I want to do is cook. On nights like this, I’d normally talk my fiancé into ordering pizza (we don’t eat pizza that often! It’s not that bad! Peer pressure!). Since that’s not an option now, I begrudgingly agree that we should make something. I’m happy about the choice—chicken piccata and roasted potatoes—once the food is in my stomach.
Let’s talk about my skin: It looks great! Any dermatologist, including Rachel Nazarian, MD, assistant clinical professor of dermatology at Mount Sinai in New York, who I asked to weigh in on whether three dairy-free weeks could really change my complexion, will tell you that it’s not enough time for your diet to affect your skin, especially when it comes to acne. For that, you need about three months (can you say, no thanks?). But I swear I’m noticing differences—fewer new zits, and the one or two I have now are small whiteheads, not the large, deep, cystic kind I usually get. I asked Nazarian what could explain what I’m seeing in the mirror, and she pointed to all of the ice cream I used to eat, for starters. Most ice cream is made with whole milk, so the dairy in it probably isn’t an issue, but ice cream is loaded with sugar, and foods that spike your blood sugar (known as high-glycemic-index foods) are definite contributors to acne. I wonder whether this means that I can never eat ice cream again while admiring the fact that I’m using a lot less concealer in the morning than I was a week and a half ago.
My skin still looks awesome but I miss dairy so much. We had Mexican food for dinner last night, and a cheese-less quesadilla is just sad. I’m pretty proud of the fact that I’m relying on healthier foods for snacks, though. Before, I would reach for a bowl of ranch-flavored chips (which contain dairy, by the way) after dinner; now, it’s toasted whole wheat pita with hummus, berries or—gasp—popcorn that I actually made myself on the stove and did not add cheese to. This could also be a reason why I’m noticing skin changes so quickly, said Nazarian. Lots of processed snacks are high on the GI index in addition to containing dairy. I cut them out because of the dairy, but in the process, also got rid of their blood-sugar-spiking effects.
I have family coming into town and we always go out for Italian together. I selfishly don’t want to sit there while everyone else enjoys cheesy dishes, so I make reservations at non-Italian restaurants (the benefit of being the one who makes the reservations!). We’re also having them over to watch football at our apartment, and I know they’re going to want me to make our family’s famous buffalo-chicken dip that’s topped with tons of blue cheese. Send me strength.
Not only do I not eat the dip, I don’t eat the fried-cheese curds we order from a bar up the street, either. I’ve never been so proud of myself.
Word to the wise: If you haven’t eaten dairy in a while, your stomach may stage a full-on rebellion the first time you have some again. It happened to me after dinner last night. As I’ve noticed over these past two-and-a-half weeks, it’s really hard to dine out when you’re dairy free. Even if you think you’re ordering a dairy-free dish, you can’t be sure unless you ask. I ordered a side of crispy Brussels sprouts at dinner. Seemed like a safe bet. They came with a bit of creamy, garlicky sauce on them, even though the menu description said no such thing. I didn’t want to let them go to waste so I had just a few. They were crazy delicious, but about an hour later, even that tiny amount of cream made my stomach revolt on me.
Sweet Freedom! I’m so close to being done that I can actually taste the mushroom and pepperoni pizza I’m going to order immediately afterward. But I can’t ignore the fact that my acne is better than it was three weeks ago, particularly now, at a time of the month when my jawline would normally be dotted with breakouts (thanks, hormones). I’m getting married in the fall and based on the results I saw after three weeks, I’m considering going dairy-free for the three months leading up to the main event. I don’t think I’ll stay 100 percent dairy-free in the meantime, though—it makes it really hard to dine out, and even though I don’t eat out that often, I don’t need it to be a stressful process when I do—but I am going to stick with my dairy-free, healthier snacks. And yes, I’ll even cut back on the ice cream.
What The 2016 Oscar-Nominated Films Can Teach Us About Trauma And Addiction
We can indeed triumph over pain and “beat it.”
“Manchester By The Sea”
“I can’t beat it,” says Casey Affleck’s character in “Manchester by the Sea.” Lee Chandler tries to lead a simple life, but when he is forced to return to his hometown, we realize that he just can’t shake his past demons.
As a former doctor to homeless men and women now overseeing addiction efforts at Rikers Island, I have heard my patients, time and again, express the same heart-wrenching despair. And I’ve routinely witnessed the close relationship between traumatic events — divorce, death, unemployment, physical or sexual assault, poverty, incarceration — and alcohol or drug addiction.
Watching Affleck’s performance, the doctor in me wanted to tell the troubled janitor: “Get help. Go to therapy. Just talk to someone!” It soon becomes clear that there is no simple way to cure his pain. While he appears to be in control, post-traumatic stress regularly rears its ugly head. Lee escapes to a different city and cuts ties to family and friends until another tragedy brings him back. Lee re-experiences the unspeakably tragic death of his children through flashbacks. And he refuses ― or is unable ― to discuss it.
The death of his children in a fire caused by his own drunken hands cripples Lee into silence. He can’t make simple conversation, like chat up the neighborhood mom. Instead, he makes regular visits to the local watering hole, provoking fist fights with strangers. Lee doesn’t know how to cope. He turns, as many have, to liquor and violence. But why?
We don’t know whether he seeks counseling, although it doesn’t seem like he does. We do see, however, how the film skillfully explores the different ways trauma can impact different people. Lee’s ex-wife (played by Michelle Williams), who suffered the same devastating loss, finds a way to move on. She doesn’t forget the tragedy, but she is also not paralyzed by it. She remarries, has a child. Her wounds begin to heal. Maybe she opened up to those around her? Perhaps she went to counseling? What director Kenneth Lonergan shows us is that it IS possible to heal. Different people heal in different ways, at different rates.
Barry Jenkins’ tender masterpiece, “Moonlight,” is a story that beautifully weaves together the interplay of trauma and addiction. We witness firsthand how poverty and a parent’s crack habit can contribute to a child’s developmental trauma. When Naomie Harris’ character, Paula, desperately pleads to her son, “I need money, Chiron,” we can see how a child’s need and desire for love and support become thwarted by a parent who is no longer in control.
Chiron is beaten up by schoolmates. His mother turns tricks to support her habit. And like many of the men around him, Chiron becomes incarcerated, likely for a drug-related offense. He, too, becomes a dealer. And the vicious cycle of addiction-related repercussions and trauma repeats.
Trauma has lasting physical, neurological and emotional consequences ― particularly when endured at a young age. According to the Substance Abuse and Mental Health Services Administration (SAMHSA), trauma results from an event or series of events that is experienced by an individual as physically or emotionally harmful or threatening, with lasting adverse effects on the individual’s well-being.
People who experience trauma can sometimes develop self-destructive coping skills such as substance use, binging/purging, cutting or gambling. Exposure to traumatic events increases the risk of developing a substance use disorder (SUD), according to the National Institute on Drug Abuse.
While the trigger(s) behind Paula’s cocaine addiction were unclear, according to SAMHSA, 55-99 percent of women with substance use issues reported a lifetime history of physical and/or sexual abuse. Young Chiron clearly loved his mother, but he was also ashamed. As were the family members of young Patrick’s alcoholic mother in “Manchester.” In fact, the latter was kept from seeing her son, deprived even of supervised visits. It was decades before either character achieved any sense of recovery: Chiron’s mother went to rehab, and Patrick’s mother found sobriety through religion (though she seemed tormented by anxiety and possibly guilt, unable even to have lunch with her teenage son).
Jackie Kennedy ― and the entire world, through the power of television ― watched as her husband was shot to death, right before her eyes. When a bullet pierces his skull, we see fragments scattered in her lap. Words fail to capture a tragedy of this magnitude.
In the aftermath of John F. Kennedy’s assassination, Jacqueline Kennedy ― played impeccably by Natalie Portman ― was pulled in a million directions. She needed to maintain her composure in the most public of spotlights while grieving her husband’s death. She planned an exquisite funeral procession, stood by as the new president was sworn in, comforted her children (“Daddy is with the angels”) and continued her stoic resolve. How did she do it?
Periodically throughout the film, Jackie is counseled by her priest, played by the late Sir John Hurt. But we also catch her popping pills and consuming alcohol. It was never clear what she was taking (anxiolytics? painkillers?) or for how long. But I think it’s safe to assume that the combination of pills and alcohol calmed her nerves, at least temporarily. In fact, her nightmares, alcohol use and suicidal thoughts were well-documented by biographers. Steely resolve in the face of tragedy comes at a price.
Addiction, according to the American Society of Addiction Medicine, is a chronic medical disease impacting the brain, affecting motivation, memory and judgment. Relapse is expected. People often try to “self-medicate” with substances like drugs or alcohol, or with behaviors (e.g., cutting, gambling, risky sex). These behaviors can affect others, as we saw with Lee Chandler, Paula and Jackie Kennedy.
But what’s underneath the pain? Often it’s sadness, anger, fear, frustration, confusion or emptiness. Addictive substances and behaviors can provide temporary reprieve but long-term, those initially “self-soothing” acts become uncontrollable.
As a physician, I’ve seen how the “system” ― medical, educational, correctional, political ― often stigmatizes addiction. Labels like rape “victim,” “alcoholic,” or “junkie” can thrust people seeking help even deeper into a well of inner turmoil. Language matters. And stigmatizing language not only propagates stereotypes but deters people from seeking care. But treatment can help if it addresses the physiological AND psychological aspects of addiction. In fact, once connected to the appropriate treatment and recovery services, most people get better and go on to lead productive lives. As a doctor who has cared for numerous patients with traumatic pasts and ongoing substance use, I have seen people GET BETTER.
So, what can you do? Lend a compassionate shoulder or a nonjudgmental ear, or offer some words of comfort. These efforts can go a long way toward easing the suffering. Referral to a professional (physician, psychologist, counselor or rehab facility) is always wise. In time, hurt can be replaced by healing and hope. We can indeed triumph over pain and “beat it.”
Physical appearance and fashion choices aside, you might think you’ll be essentially the same person in old age as you were in adolescence.
But the longest-running study ever conducted on human personality challenges this assumption.
The study, the first to test people’s personalities in adolescence and again in old age, shows that compared to their younger selves, most people’s personalities in older adulthood are barely recognizable.
With unprecedented access, psychologists at the University of Edinburgh in the United Kingdom investigated how character traits shift as people get older by following a cohort of Scottish adults from adolescence to old age. The findings, published in the journal Psychology and Aging, significantly challenge the idea of personality as a relative constant throughout life.
“It’s important to appreciate how rare these data are,” Dr. Ian Deary, a professor of differential psychology at the university and one of the study’s authors, told The Huffington Post Wednesday. “The questions are not ideal and the ratings methods are not ideal, but the original sample is amazingly good and the time between ratings is unsurpassed.”
The researchers first accessed data from a study conducted in 1950, in which a group of teachers filled out personality assessments for more than 1,200 14-year-old students. These measured six basic personality traits: self-confidence, conscientiousness, perseverance, desire to excel, originality and stability of moods.
Then, in 2012, the researchers managed to track down students from the 1950 study. Of the 635 participants they were able to locate, 174 agreed to take a personality test similar to the one completed about them 63 years earlier.
The participants, who were now 77 years old on average, each filled out a personality assessment measuring the same six characteristics on which they were rated as teenagers. They also brought along someone close to them, who assessed the participant using the same personality scale.
In comparing the then-and-now test results, the researchers were surprised to find virtually no overlap. The only traits that showed some mild constancy were stability of moods and conscientiousness, but the correlations weren’t strong.
For each person, the younger and older selves seemed to bear no resemblance to each other. It was “as if the second tests had been given to different people,” the study’s authors noted.
“The longer the interval between two assessments of personality, the weaker the relationship between the two tends to be. Our results suggest that, when the interval is increased to as much as 63 years, there is hardly any relationship at all,” the researchers wrote in conclusion. “Personality changes only gradually throughout life, but by older age it may be quite different from personality in childhood.”
Still, the results were unexpected, considering that most other research shows personality traits to be relatively stable even across decades. Personality stability as a psychological construct can be traced back to William James, the father of American psychology, who said in 1890 that after the age of 30, the personality is “set like plaster.” Once we reach adulthood, he believed, our personalities are unlikely to change in any significant way.
Consistent with James’ views, research has shown personality plasticity ― a measure of how much our character traits change ― to decline as a person moves past young adulthood. The scientific evidence generally shows that personality traits are stable over long periods, Deary noted, and one study even found traits to be stable over more than 40 years.
But the new findings hint at the idea that personality may be more malleable than researchers thought. Of course, this data isn’t totally conclusive.
Comparing teacher assessments with later self-testing isn’t nearly as reliable as having the participants themselves take the same personality test at both ages. Plus, the sample size by the end of the study was relatively small. But the dramatic results nonetheless suggest that personality may be more malleable than we’ve acknowledged.
Considering also that our cells are replaced roughly every seven years, it starts to appear that as the decades go by, you really aren’t the person you used to be.