
Gratitude Is the New Willpower

Harvard Business Review:

A fascinating study that shows another benefit to being grateful for one’s good fortune: restraint and willpower.

Originally posted on HBR Blog Network - Harvard Business Review:

Patience is a virtue, especially when it comes to building capital. But as with most virtues, it’s not always easy to muster, since it usually requires resisting temptations for gratification on the sooner side. Should you put the extra $1,000 earned this month in your retirement savings or use it to buy a new suit? Should you approve money from the firm’s “rainy-day” fund to cover travel for senior executives (yourself included) to a lavish conference this summer or let it continue to accrue as a buffer for future challenges? Such decisions – a type referred to by economists as intertemporal choices – are characterized by options that offer different rewards as time unfolds. That is, they contrast smaller pleasures or gains now with larger pleasures or gains later.

Almost everyone – from individual investors to CFOs of large corporations – would probably agree that the best way to choose between…


Map: U.S. Life Expectancy By State

Although the average American lives an impressive 30 years longer than a century ago — about 79.8 years — the U.S. remains middle-of-the-road by global standards despite its great wealth, typically ranking in the mid-thirties, at about the same level as Cuba, Chile, or Costa Rica. Furthermore, life expectancy varies wildly from state to state, as the following map from The Atlantic clearly shows:

Life expectancy by state compared to closest matching country.

There’s profound variation by state, from a low of 75 years in Mississippi to a high of 81.3 in Hawaii. Mostly, we resemble small countries like Kuwait and Barbados. At our worst, we look more like Malaysia or Oman, and at our best, like the United Kingdom. No state approaches the life expectancies of most European countries or some Asian ones. Icelanders can expect to live a long 83.3 years, and even that pales next to the Japanese, who live well beyond 84.

Life expectancy is partly causal: a product of diet, environment, medical care, and education. But it can also be recursive: people who are chronically sick are less likely to become wealthy, and thus less likely to live in affluent areas with access to the great doctors and Whole Foods kale that would have helped them live longer.

It’s worth noting that the life expectancy for certain groups within the U.S. can be much higher—or lower—than the norm. The life expectancy for African Americans is, on average, 3.8 years shorter than that of whites. Detroit has a life expectancy of just 77.6 years, but that city’s Asian Americans can expect to live 89.3 years.

But overall, the map reflects what we’d expect: People in southern states, which generally have lower incomes and higher obesity rates, tend to die sooner, and healthier, richer states tend to foster longevity.

It’s also worth adding that, overall, the U.S. is far less healthy and long-lived than it should be, even when you adjust for wealth, race, and other factors (e.g., young Americans are less healthy than young people in other developed countries, rich Americans are typically less healthy than rich people elsewhere, and so on).

Looking to Get Healthy? Try an Indigenous Diet

While there’s much that can be learned from indigenous peoples — particularly when it comes to herbal medicine and safeguarding the environment — nutrition was never something I had personally considered, until I came across this article from The Guardian. It reports on recent research suggesting that the centuries-old diets of indigenous groups from around the world are nutritionally superior to modern diets, which involve far more processing, refined fats and oils, and simple carbohydrates.

Many traditional and non-processed foods consumed by rural communities, such as millet and caribou, are nutrient-dense and offer healthy fatty acids, micronutrients and cleansing properties widely lacking in diets popular in high- and middle-income countries, say experts.

Indigenous diets worldwide – from forest foods such as roots and tubers in regions of eastern India to coldwater fish, caribou and seals in northern Canada – are varied, suited to local environments, and can counter malnutrition and disease.

“For many tribal and indigenous peoples, their food systems are complex, self-sufficient and deliver a very broad-based, nutritionally diverse diet,” says Jo Woodman, a senior researcher and campaigner at Survival International, a UK-based indigenous advocacy organisation.

Unfortunately, a combination of sociocultural marginalization, assimilation into modern society, and pressures from the global food market has caused indigenous peoples from the Americas to Asia to suffer from the same chronic health problems that bedevil most people in the developed world; in the process, they’re also losing the knowledge and practices that made their traditional diets relatively healthy.

“There is a deep irony in the fact that many dietitians are advocating [traditional and indigenous foods and diets] and yet [the] modern [western] diet is what is being pushed on tribal peoples around the world, with devastating results,” Woodman says.

“We have lost our primary relationship with our world around us,” says Dr Martin Reinhardt, assistant professor of Native American studies at Northern Michigan University.

Native American elders historically planned seven generations ahead when creating food systems, teaching each generation that it was their responsibility to ensure the survival of the seventh, says Reinhardt, an Anishinaabe Ojibway citizen of the Sault Ste. Marie Chippewa Native American people in Michigan state. They did this by hunting and gathering only what they needed, conserving resources such as wood and water, and protecting food biodiversity.

But when Native Americans were forced to assimilate, historical access to this nutritional knowledge was lost, Reinhardt points out. According to the special diabetes programme for Indians, run by the US federal government’s Indian health service, the 566 registered indigenous peoples in the US have a diabetes rate nine times higher than the national average.

Similarly, rates of the disease among First Nations and Inuit groups in Canada are up to five times higher than the countrywide average, according to the government’s federal health department.

In Laos, northern highland minorities such as the Yawa, Htin and Khmu traditionally eat forest-based diets, including wild pigs, birds, bamboo shoots, banana flowers and yams rich in vitamin C. But in recent decades the Laos government has moved thousands of people from the highlands to towns for economic reasons, documented in a 2012 report by the International Fund for Agricultural Development.

I encourage you to read the whole article, as it captures the complex intersection of several global issues: the pressure of feeding a growing population (particularly one with more resources to consume more food), the environmental degradation brought about by the demand for grain (which is also crowding out other, more diverse and healthier food sources), and the continued destruction of traditional rural communities, whose loss also means the loss of valuable knowledge.

U.S. Among the Sickest of Developed Nations

According to a report by the Centers for Disease Control and Prevention, 75 percent of U.S. healthcare spending — already the highest in the world — goes toward treating people with chronic conditions. In fact, almost half of American adults had at least one chronic condition as of 2005.

Chronic conditions — a category that includes everything from autoimmune diseases like arthritis and lupus, to obesity, heart disease, and diabetes — are not only the number one cause of death in the U.S., they’re compromising Americans’ quality of life and disabling people for long periods of time. For example, arthritis affects 20 percent of adults, and is the most common cause of disability in America. Those afflicted are projected to increase from 46 million to 67 million by 2030, and 25 million of these individuals will have limited activity as a result.

Not only are Americans as a whole getting sicker; so, increasingly, are young people. A 2013 report by the National Research Council and Institute of Medicine (NRC/IOM) found that “For many years, Americans have been dying at younger ages than people in almost all other high income countries.” Their data showed that women born in the United States are less likely to live to age 50 than women born in other high-income countries; in the 1980s, the U.S. was in the middle of the pack for survival of women to age 50, but since then it has not only fallen in the rankings, it has fallen off the chart.

Note that even when adjusting for race and socioeconomic status, the results are the same: rich Americans die earlier than rich people in other countries, college-educated Americans die earlier than college-educated people in other countries, and Americans as a whole are sicker and shorter-lived than people in comparable developed nations.

In fact, a recent report by the University of Washington’s Institute for Health Metrics and Evaluation says that “in some U.S. counties… life expectancies are on par with countries in North Africa and Southeast Asia.”

To learn more about this issue, and some of the complex and multidimensional factors behind it, click here.

The Post-Antibiotic Era

Wired Magazine offers a disturbing glimpse into a future where antibiotic resistance — caused largely by the overuse of such treatments — has led to the proliferation of powerful diseases.

If we really lost antibiotics to advancing drug resistance — and trust me, we’re not far off — here’s what we would lose. Not just the ability to treat infectious disease; that’s obvious.

But also: The ability to treat cancer, and to transplant organs, because doing those successfully relies on suppressing the immune system and willingly making ourselves vulnerable to infection. Any treatment that relies on a permanent port into the bloodstream — for instance, kidney dialysis. Any major open-cavity surgery, on the heart, the lungs, the abdomen. Any surgery on a part of the body that already harbors a population of bacteria: the guts, the bladder, the genitals. Implantable devices: new hips, new knees, new heart valves. Cosmetic plastic surgery. Liposuction. Tattoos.

We’d lose the ability to treat people after traumatic accidents, as major as crashing your car and as minor as your kid falling out of a tree. We’d lose the safety of modern childbirth: Before the antibiotic era, 5 women died out of every 1,000 who gave birth. One out of every nine skin infections killed. Three out of every 10 people who got pneumonia died from it.

And we’d lose, as well, a good portion of our cheap modern food supply. Most of the meat we eat in the industrialized world is raised with the routine use of antibiotics, to fatten livestock and protect them from the conditions in which the animals are raised. Without the drugs that keep livestock healthy in concentrated agriculture, we’d lose the ability to raise them that way. Either animals would sicken, or farmers would have to change their raising practices, spending more money when their margins are thin. Either way, meat — and fish and seafood, also raised with abundant antibiotics in the fish farms of Asia — would become much more expensive.

And it wouldn’t be just meat. Antibiotics are used in plant agriculture as well, especially on fruit. Right now, a drug-resistant version of the bacterial disease fire blight is attacking American apple crops. There’s currently one drug left to fight it. And when major crops are lost, the local farm economy goes too.

Needless to say, this is pretty concerning stuff, especially since it hasn’t received all that much attention. For more information, read this lengthy and detailed report, “Imagining a Post-Antibiotics Future”, and spread the word.

Do Antibiotics Cause Obesity?

It seems there is no shortage of factors contributing to the growing obesity epidemic (recently declared a pandemic by the World Health Organization). But Mother Jones presents research that purports to have found the most unusual culprit thus far: the use of antibiotics, which has similarly been on the rise. First, consider these two maps:

As you can see, there’s a pretty strong correlation between antibiotic use and obesity rates. But that immediately leads to the question of whether this relationship reflects causation. As the article notes:

When we mashed up the data behind these maps, we confirmed the strong correlation between obesity and antibiotic prescription rates (we got an r of 0.74, for the statistically inclined). We also found a correlation between the states’ median household incomes and antibiotic prescription rates: States with below-average median incomes tend to have higher antibiotic prescription rates. This makes sense, considering that high obesity rates correlate with low income levels. (You can see the data sets for antibiotic prescription rate, obesity, and median household income level here.)

Hicks and her team can’t yet explain the connection between obesity and high rates of antibiotic prescription. “There might be reasons that more obese people need antibiotics,” she says. “But it also could be that antibiotic use is leading to obesity.”

Indeed, both factors could be due to a shared cause, such as socioeconomic status: most of these states have high rates of poverty, and poorer Americans tend to be more susceptible to both illness and obesity (due in part to poor access to medical care and nutritious food).
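
As an aside, the “r of 0.74” mentioned in the excerpt is a Pearson correlation coefficient, a standard measure of how strongly two series move together. Here is a minimal sketch of how such a figure is computed from two state-level series; the numbers below are made up purely for illustration and are not the actual data sets behind the maps.

```python
# Illustrative only: how a Pearson correlation like the r = 0.74 above is
# computed. These values are invented, not the data sets the article used.
from statistics import mean, pstdev

# Hypothetical antibiotic prescriptions per 1,000 people and adult obesity
# rates (%) for five states.
prescriptions = [1300, 1100, 950, 820, 700]
obesity_rates = [34.5, 32.0, 29.8, 27.1, 24.3]

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    covariance = mean([(x - mx) * (y - my) for x, y in zip(xs, ys)])
    return covariance / (pstdev(xs) * pstdev(ys))

print(f"r = {pearson_r(prescriptions, obesity_rates):.2f}")
```

An r close to 1 means the two series rise and fall together, while an r near 0 means little linear relationship; as the article is careful to note, even a strong correlation says nothing by itself about which way causation runs.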

However, it seems as if antibiotics play a direct role in obesity.

Indeed, a growing body of evidence suggests that antibiotics might be linked to weight gain. A 2012 New York University study found that antibiotic use in the first six months of life was linked with obesity later on. Another 2012 NYU study found that mice given antibiotics gained more weight than their drug-free counterparts. As my colleague Tom Philpott has noted repeatedly, livestock operations routinely dose animals with low levels of antibiotics to promote growth.

No one knows exactly how antibiotics help animals (and possibly humans) pack on the pounds, but there are some theories. One is that antibiotics change the composition of the microbiome, the community of microorganisms in your body that scientists are just beginning to understand. (For a more in-depth look at the connection between bacteria and weight loss, read Moises Velasquez-Manoff’s piece on the topic.)

As with any good study, there’s a caveat to keep in mind:

Hicks says that more research is needed on the potential connection between antibiotics and obesity. But there are other reasons for doctors to change the way they prescribe antibiotics. As I noted a few weeks back, a recent study in JAMA Internal Medicine found that doctors commonly prescribe antibiotics for symptoms such as sore throat and bronchitis—which don’t usually require the drugs. Considering that bacteria are already evolving to withstand many antibiotics, it’s probably time to figure out how to use them more prudently.

One thing seems certain — the causes of obesity go far beyond overeating and being too sedentary. There seem to be many more factors at work, even some unlikely ones.

Coffee vs. Smoothie: Which is Healthier?

Two of the world’s most popular drinks go head-to-head in the battle over which is healthier. As the BBC reports, the conclusion may surprise you, and may well be met with much skepticism and discord.

Let’s start with coffee, which has long been regarded as unhealthy except in the most moderate amounts. As the article notes, the widespread misconception that coffee leads to many health problems stems from methodological flaws in past studies of coffee drinkers:

These claims have been largely based on case control studies, where you take a group of people who drink coffee and compare them with another matched group who don’t.

The problem with this approach is that coffee drinkers are more likely than non-coffee drinkers to have other “bad” habits, like drinking alcohol or smoking, so it is hard to tease apart what is really doing the harm.

A more reliable way to get at the truth is to do what is called a prospective cohort study. You take a group of disease-free individuals, collect data about them, then follow them for a large number of years to see what happens.

So what did those more reliable studies reveal?

When scientists collected data on the coffee drinking habits of 130,000 men and women and then followed them for over 20 years they found that coffee is rather a good thing (The Relationship of Coffee Consumption with Mortality, Annals of Internal Medicine, June 2008).

They crunched the numbers and concluded that “regular coffee consumption was not associated with an increased mortality rate in either men or women”.

In fact, data from this study suggests that moderate coffee consumption is mildly protective, leading to slightly lower all-cause mortality in coffee drinkers than non-coffee drinkers. Based on this and other studies the most effective “dose” is two to five cups a day. More than that and any benefits drop off. There are hundreds of different substances in coffee, including many different flavonoids (compounds widely found in plants that have antioxidant effects). Which of these ingredients is beneficial, we simply don’t know.

But wait, there’s more! Coffee even confers some psychological benefits, if you can believe it:

In research recently published in the World Journal of Biological Psychiatry (July 2013) they found that people who drank two to four cups of caffeinated coffee a day were half as likely to commit suicide as those who either drank decaff or fewer than two cups a day. This research pulled together data from three studies that had followed more than 200,000 people for more than 14 years, so it’s pretty reliable. It is also supported by a number of other studies, which makes this claim even more plausible.

One reason why caffeine may be a mild anti-depressant is that as well as making you more alert, it increases levels of neurotransmitters in the brain, like dopamine and serotonin, that are known to improve mood.

As anyone who’s ever needed to get through a rough workday can attest, coffee does indeed do wonders for one’s mood. As a recent nine-to-fiver myself, I’ve been won over by coffee out of sheer necessity. Few other things help get me going, except tea (and even then, coffee is the secret weapon for when even tea won’t suffice).

Of course, as with any good study (or most things in life, for that matter), there are important caveats to keep in mind:

The researchers don’t recommend going overboard, noting that “there is little further benefit for consumption above two to three cups”.

One note of caution is that these trials began many years ago so the sort of coffee consumption being tested is almost certainly good, old-fashioned coffee.

A simple mug of coffee delivers somewhere between zero and 60 calories, depending on whether it is black, white or white with one sugar. Cappuccinos, lattes and mochas contain coffee but they also contain a lot of calories — anything between 100 and 600 — so when it comes to fancy coffees I limit myself to the occasional tall, skinny cappuccino (70 calories).

In short, as with most dietary advice: keep it in moderation and be mindful of what kind of coffee you’re drinking. It seems simple enough, especially given how substantial the benefits appear to be.

So what about that other popular trend, smoothies?

In a study published in August 2013 in the British Medical Journal (Fruit Consumption and risk of type 2 diabetes) they found that while eating fruit cuts your risk of developing diabetes, drinking it appears to increase the risk.

This was another big study involving lots of people followed for many years. An interesting finding was that different fruits gave different levels of benefit. Three servings of blueberries, for example, cut the risk of diabetes by 26%, while eating apples, pears, bananas and grapefruits also had a positive, albeit much smaller, effect.

Overall those who ate fruit cut their risk of developing diabetes by 2%, while those who drank it (more than three glasses of fruit juice a week) increased their risk by 8%.

More bad news for fruit juice drinkers comes from a case-controlled study done in Western Australia that examined the daily diets of more than 2,000 people. They found that eating some types of fruit and vegetables (cabbage, broccoli, cauliflower and apples) cuts your risk of colorectal cancer, while drinking fruit juice was associated with an increased risk of rectal cancer. Sugary drinks lead to raised levels of the hormone insulin and persistently high levels of insulin are associated with increased risk of some cancers. The researchers point out that many things that protect against bowel cancer, such as antioxidants and fibre, are lost or diminished during the juicing process.

In other words, not only do juices and smoothies strip away many of the health benefits of whole fruits and veggies, but they may actively worsen your health because of their added and concentrated sugar. But again, here’s the caveat:

None of these studies specifically looked at the health benefits or otherwise of fruit smoothies, which are a relatively recent phenomenon, nor did they look at the impact of different types of juice – for instance, whether it was freshly squeezed or from concentrate, homemade or shop-bought. I would assume, for example, that drinking a homemade vegetable smoothie is going to be a lot better for you than a commercial fruit smoothie.

And I very much doubt that the occasional fruit juice or fruit smoothie is going to do any harm.

There you have it, folks. Just watch your intake and your source. Coffee, fruits, and veggies are fine so long as you’re not adding too much sugar, milk, processing, and the like. It seems like a reasonable enough conclusion. What say you all?

We Sleep to Clean Our Minds

That’s according to researchers seeking to uncover the mystery of why humans and most animals sleep. Like the rest of our body, our brain regularly produces waste, known collectively as metabolites, as a byproduct of its day-to-day functions. These metabolites can build up over time and interfere with the health and function of the organ, which is why our body has an entire system devoted to flushing such waste out: the sadly underrated lymphatic system. But things work differently for our central command center:

But while brain cells burn up a vast amount of fuel and are highly sensitive to a build-up in their own metabolites, the brain has a trash-removal process that is far less straightforward than that by which wastes are removed from the rest of the body. The lymph system collects metabolites from tissues throughout the body and dumps them into the bloodstream, where they’re carried to the liver for breakdown and removal. The brain’s metabolic waste concentrates in interstitial fluid present in all corners of the brain. A second slurry — cerebrospinal fluid — circulates throughout the brain, and where the two fluids flow together, the metabolic byproducts are carried away by the cerebrospinal fluid.

This is part of the reason why the brain sits in a pool of fluids — among other things, this biological concoction serves as a plumbing system. It’s interesting to note that the excessive build-up of such waste coincides with the development of various dementias, including Alzheimer’s disease. That in turn may help explain why a lack of sleep, especially over a long period of time, leads to a range of behavioral and cognitive problems, including hallucinations, mood swings, and depression, and in extreme cases even death. When it comes to the brain, sleep seems to be the only time it gets a good cleaning:

In a new study, scientists from University of Rochester Medical Center and New York University found that the brains of mice — whether they are sleeping or anesthetized — showed more activity and volume at the “transfer stations,” where interstitial and cerebrospinal fluid meet, than did mice who were awake and active. The result was that by the end of a sleep period — around early evening — mouse brains had their lowest concentration of neural refuse of the day. By the time they were ready to sleep again, those concentrations had reached their peak.

It wasn’t just the mouse circadian schedule that initiated the trash removal: even when researchers used the powerful sedative ketamine to put the mice to sleep, they saw evidence of a sudden increase in traffic at the brain’s transfer stations.

Noting the link between sleep deprivation or disruption and neurodegenerative disease, the authors suggest that neural trash removal must be one of sleep’s major benefits. Indeed, they surmised, it could even be that the build-up of brain refuse may be one of the cues that drives us to bed, and that an empty trash bin may help signal us to wake and initiate another day of mental activity and its inevitable byproduct, brain trash.

While we’ve always known that sleep serves as a way of recharging the brain, the precise way in which it does so was, until now, little known. This might also explain why people vary widely in the amount of sleep they need. Sleepiness may depend less on how much you tire your body out and more on how much “neural trash” you build up.

The Obesity Pandemic

The following map shows obesity rates worldwide as of 2008, according to the World Health Organization. As you can see, it’s an issue hardly limited to the United States or the developed world.


Roughly 500 million adults worldwide are obese (defined as a body mass index, or BMI, of 30 or higher), representing 10 percent of men and 14 percent of women – nearly double the rate of obesity in 1980. Another 1 billion adults are overweight (BMI of 25 or higher), and the number of obese people is projected to double to 1 billion by 2030. Data for children are scarcer, but one global estimate finds that, as of 2010, about 43 million preschool children were overweight or obese.
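
For reference, BMI is simply weight in kilograms divided by the square of height in meters. Here is a quick, purely illustrative sketch of that arithmetic using the cutoffs cited above; the person in the example is hypothetical.

```python
# Illustrative only: BMI = weight (kg) / height (m)^2, using the cutoffs
# cited above (25+ = overweight, 30+ = obese). The example person is made up.
def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2

def classify(value: float) -> str:
    if value >= 30:
        return "obese"
    if value >= 25:
        return "overweight"
    return "not overweight"

example = bmi(95, 1.75)   # a hypothetical 95 kg, 1.75 m adult
print(f"BMI {example:.1f} -> {classify(example)}")   # BMI 31.0 -> obese
```

So a 95 kg adult who is 1.75 m tall has a BMI of about 31, which falls just inside the obese range under those definitions.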

Though some countries clearly suffer from the issue more than others, almost every nation has recorded a rise in overweight and obesity, especially in the developing world, North America, and the Middle East; notably, the increase has been especially sharp among poorer people and societies. There is also ongoing controversy regarding the reliability of BMI as a measure and whether the issue merits the label of a pandemic.

Read more about this issue — including an in-depth background for each region — here.

On the Casual Use of Psychiatric Terms

The following excerpt is from a post on Brute Reason discussing the problems with using psychiatric terms in a colloquial and metaphorical sense.

These words are used so casually that our conception of their meaning gradually shifts without our even noticing it. It’s like a boy-who-cried-wolf type of situation in that regard. If nine different friends joke to you about how they’re ‘sooooo OCD’ because they like all their books organized just so on their shelf (a situation familiar to just about every bibliophile, honestly), then the tenth friend who comes to you and tells you that they have OCD is probably going to evoke that mental image, rather than one of someone who actually can’t stop obsessing over particular little things and carrying out rituals that interfere with that person’s normal functioning, perhaps to the point of triggering comorbid disorders like depression. This may be a person who washes their hands until they are raw and hurting, someone who has to flick the light switch on and off seven times every time they leave a room, or someone who has recurring, uncontrollable thoughts about hurting someone they love even though they have no actual desire to do that.

Well, that sounds a little different than insisting that your books be categorized by subject and then alphabetized by author, no?

Likewise, if your friends are constantly telling you they’re ‘depressed’ because their team lost or because they got a bad grade, only to return to their normal, cheerful selves within a few hours, the next person who tells you that they are “depressed” might elicit a reaction of, ‘Come on, get over it! You’ll feel better if you go out with us.’

And so the meanings of words change.

We must either change the way these words are used, or at the very least recognize the nuance in their meaning — not everyone who says they have anxiety or depression actually does, in the clinical sense; moreover, those who do make a serious claim to such conditions should be given the benefit of the doubt, and not assumed to be displaying mere personality quirks or the like.

Thoughts?