The Bootstraps Myth

From Melissa McEwan of the blog Shakesville:

The Myth of Bootstraps goes something like this: I never got any help from anyone. I achieved my American Dream all on my own, through hard work. I got an education, I saved my money, I worked hard, I took risks, and I never complained or blamed anyone else when I failed, and every time I fell, I picked myself up by my bootstraps and just worked even harder. No one helped me.

This is almost always a lie.

There are vanishingly few people who have never had help from anyone—who never had family members who helped them, or friends, or colleagues, or teachers. 

Who never benefited from government programs that made sure they had electricity, or mail, or passable roads, or clean drinking water, or food, or shelter, or healthcare, or a loan. 

Who never had any kind of privilege from which they benefited, even if they didn’t actively try to trade on it. 

Who never had an opportunity they saw as luck which was really someone, somewhere, making a decision that benefited them. 

Who never had friends to help them move, so they didn’t have to pay for movers. Who never inherited a couch, so they didn’t have to pay for a couch. Who never got hand-me-down clothes from a cousin, so their parents could afford piano lessons. Who never had shoes that fit and weren’t leaky, when the kid down the street didn’t.

Most, maybe all, of the people who say they never got any help from anyone are taking a lot of help for granted.

They imagine that everyone has the same basic foundations that they had—and, if you point out to them that these kids over here live in an area rife with environmental pollutants that have been shown to affect growth or brain function or breathing capacity, they will simply sniff with indifference and declare that those things don’t matter. That government regulations which protect some living spaces and abandon others to poisons aren’t help.

The government giving you money to eat is a hand-out. The government giving you regulations that protect the air you breathe is, at best, nothing of value—and, at worst, a job-killing regulation that impedes the success of people who want to get rich dumping toxins into the ground where people getting hand-outs live.

What are your thoughts?

The Way We Treat Children

If my perceptions are correct, there seems to be a growing sentiment (perhaps typical of each older generation) that today’s youth are needlessly coddled and “wussified” (to use the kinder terminology). But this apparently prevailing notion that kids nowadays are excessively spoiled is dangerously overstated, according to a recent article in AlterNet by Paul L. Thomas, a doctor of education and long-time teacher.

After recalling a few anecdotes regarding personal or observed mistreatment of kids (mostly in the context of school), he makes the following point:

A day or so ago, I received an email from Alfie Kohn about his new book, The Myth of the Spoiled Child. I noticed it was similar to a book I am co-editing, Pedagogies of Kindness and Respect: On the Lives and Education of Children. I also noted that our perspectives on children—on how parents, teachers, and society treat children—appear to be a minority view.

I have been mulling, or more likely stewing, about this for some time: What makes adults—even the ones who choose to spend their lives with children—so damned negative and hateful about those children? That is the source of my palpable anger at the “grit,” “no excuses,” and “zero tolerance” narratives and policies. I grew up and live in the South, where the default attitude toward children remains that they are to be seen and not heard, that a child’s role is to do as she/he is told. If a child crosses those lines, then we must teach her/him a lesson, show her/him who is boss—rightfully, we are told, by hitting that child: spare the rod, spoil the child. I find that same deficit view of children is not some backwoods remnant of the ignorant South; it is the dominant perspective on children throughout the U.S.

As Barbara Kingsolver explains in “Everybody’s Somebody’s Baby”:

For several months I’ve been living in Spain, and while I have struggled with the customs office, jet lag, dinner at midnight and the subjunctive tense, my only genuine culture shock has reverberated from this earthquake of a fact: People here like kids. They don’t just say so, they do. Widows in black, buttoned-down c.e.o.’s, purple-sneakered teen-agers, the butcher, the baker, all have stopped on various sidewalks to have little chats with my daughter. Yesterday, a taxi driver leaned out his window to shout “Hola, guapa!” My daughter, who must have felt my conditioned flinch, looked up at me wide-eyed and explained patiently, “I like it that people think I’m pretty.”

With a mother’s keen myopia, I would tell you, absolutely, my daughter is beautiful enough to stop traffic. But in Santa Cruz de Tenerife, I have to confess, so is every other person under the height of one meter. Not just those who agree to be seen and not heard. When my daughter gets cranky in a restaurant (and really, what do you expect at midnight?), the waiters flirt and bring her little presents and nearby diners look on with that sweet, wistful gleam of eye that before now I have only seen aimed at the dessert tray. Children are the meringues and eclairs of this culture. Americans, it seems to me now, sometimes regard children as a sort of toxic-waste product: a necessary evil, maybe, but if it’s not their own they don’t want to see it or hear it or, God help us, smell it.

I’ve often noticed — and frankly even related to — the contradictory ways in which we regard children: they’re cute and enlivening on the one hand, but also irritating and burdensome on the other. Their easily exploitable and powerless status also makes them a tempting target for venting one’s frustration or sense of inadequacy, which perhaps explains why children — along with women and the elderly — are frequently the victims of abuse in households and care centers.

Thomas also notes how the overall negative treatment of children intersects with racist and classist sentiments:

A child is not a small adult, not a blank slate to be filled with our “adult weariness,” or a broken human that must be repaired. It is also true that children are not angels; they are not pure creatures suited to be set free to find the world on their own. Seeing children through deficit or ideal lenses does not serve them—or anyone—well.

Within the U.S. culture there is a schizophrenia around kids—we worship young adulthood in popular media, but seem to hate children—that is multiplied exponentially by a lingering racism and classism that compounds the deficit view of childhood. Nowhere is this more evident than in the research showing how people view children of color:

Asked to identify the age of a young boy that committed a felony, participants in a study routinely overestimated the age of black children far more than they did white kids. Worse: Cops did it, too… The correlation between dehumanization and use of force becomes more significant when you consider that black boys are routinely estimated to be older than they are… The less the black kids were seen as human, the less they were granted “the assumption that children are essentially innocent.” And those officers who were more likely to dehumanize black suspects overlapped with those who used more force against them.

In the enduring finger-pointing dominant in the U.S.—blaming the poor for their poverty, blaming racial minorities for the burdens of racism, blaming women for the weight of sexism—we maintain a gaze that blinds us to ourselves, and allows us to ignore that in that gaze are reflections of the worst among us.

Why do the police sweep poor African American neighborhoods and not college campuses in search of illegal drugs? Why do we place police in the hallways of urban high schools serving mostly poor African American and Latino students, demanding “zero tolerance”? Why are “grit” narratives and “no excuses” policies almost exclusively targeting high-poverty, majority-minority schools (often charter schools with less public oversight)?

Here’s the crux of Thomas’s point.

Children are not empty vessels to be filled, blank hard drives upon which we save the data we decide they should have. Nor are children flawed or wild; they do not need us to repair or break them. Neither are they to be coddled or worshipped. They are children, and they are all our children. Yes, there are lessons to be taught, lessons to be learned. But those driven by deficit or idealized views are corrupted and corrupting lessons. Each and every child—as all adults—deserves to have her/his basic dignity respected, first, and as adults charged with the care of any child, our initial question before we do anything with or to a child must be about ourselves. In 31 years of teaching, I can still see and name the handful of students I mis-served in my career, like Billy above. Those faces and names today serve as my starting point: with any child, first do no harm.

What do you think?

Men Increasingly Struggling With Body-Image Issues

It’s long been documented, even accepted, that women suffer from pervasive body-image problems. It says a lot about our society that we take it as a given that women are inherently concerned about their weight and appearance. While that sadly hasn’t changed, despite coming under increasing scrutiny, the problem seems to have caught up with men as well, as reported in The Atlantic:

A new study of a national sample of adolescent boys, published in the January issue of JAMA Pediatrics, reveals that nearly 18 percent of boys are highly concerned about their weight and physique. They are also at increased risk for a variety of negative outcomes: Boys in the study who were extremely concerned about weight were more likely to be depressed, and more likely to engage in high-risk behaviors such as binge drinking and drug use.

The trend toward weight obsession among boys is cause for worry, says Dr. Alison Field, an associate professor of pediatrics at Boston Children’s Hospital and the lead author of the study. “You want people to be concerned enough about their weight to make healthy decisions,” she says, “but not so concerned that they’re willing to take whatever means it takes—healthy or unhealthy—to achieve their desired physique.”

Of the boys who were highly concerned with their weight, about half were worried only about gaining more muscle, and approximately a third were concerned with both thinness and muscularity simultaneously. Meanwhile, less than 15 percent were concerned only with thinness. Those statistics reflect a major difference between boys and girls when it comes to weight concerns: whereas girls typically want to be thinner, boys are as likely to feel pressure to gain weight as to lose it.

“There are some males who do want to be thinner and are focused on thinness,” Field says, “but many more are focused on wanting bigger or at least more toned and defined muscles. That’s a very different physique.”

The culprit, as with women, is media, particularly entertainment media:

If boys are increasingly concerned about weight, changing representations of the male form in the media over the last decade or two are at least partly to blame. “We used to really discriminate—and we still do—against women” in terms of media portrayals, says Dr. Raymond Lemberg, a Prescott, Arizona-based clinical psychologist and an expert on male eating disorders. “If you look at the Miss America pageant winners or the Playboy centerfolds or the runway models over the years, there’s been more and more focus on thinness.”

But while the media pressure on women hasn’t abated, the playing field has nevertheless leveled in the last 15 years, as movies and magazines increasingly display bare-chested men with impossibly chiseled physiques and six-pack abs. “The media has become more of an equal opportunity discriminator,” says Lemberg. “Men’s bodies are not good enough anymore either.”

Even toys contribute to the distorted messages youngsters receive about the ideal male form. Take action figures, for example, which Lemberg suggests are the male equivalent of Barbie dolls in terms of the unrealistic body images they set up for young boys. In the last decade or two, action figures have lost a tremendous proportion of fat and added a substantial proportion of muscle. “Only 1 or 2 percent of [males] actually have that body type,” says Lemberg. “We’re presenting men in a way that is unnatural.”

I’m not sure if there’s any solid research linking idealized toys with warped views of body image, but the overall point remains: we’re surrounded by increasingly idealized and unrealistic standards of beauty that are amplified by modern media and exploited (if not further amplified) by special interests seeking to profit from people’s desire to do whatever they can to meet these images. The problem is made worse by the widespread mentality that “real men” aren’t supposed to have body-image issues — that it’s a girl’s thing. And even when the problem is recognized, there’s a misconception about how differently males suffer from it.

Although awareness of the risk of weight disorders among males is growing, there is still a problem with under-recognition, Field says, primarily because of the assumption that the disorders look the same in males as they do in females. Current assessments for eating disorders focus on the classical presentation typical of females, but since young men are often more concerned with gaining muscle than becoming thin, they typically don’t present as underweight, as girls often do. They’re also not as likely to starve themselves, use laxatives or induce vomiting; instead, they’re much more likely to engage in excessive amounts of exercise and steroid abuse. “Instead of wanting to do something unhealthy to get smaller, they’re using unhealthy means to become larger,” Field says.

But though the presentation might be different, excessive worries about weight, especially in combination with high-risk behaviors, are no less concerning in males than in females. According to Field, it’s time to sit up and take note of the boys. “Pediatricians and adolescent medicine docs and parents [need] to become aware that they should be listening as much to their sons’ conversations about weight as their daughters’.”

Having grown up obese, and now struggling with feeling too thin and out of shape, I can certainly relate to the low self-esteem and sense of personal failure that comes with not looking a certain way. Indeed, I imagine just about everyone shares this sentiment to some degree. While idealized standards of beauty have always existed, today’s world makes it far harder to avoid the pressure, especially when we’re equally bombarded with commercial solutions that supposedly help us.

Perhaps this development ties in with the increasing incidence of anxiety that is starting to characterize modern society, especially among younger people. Thoughts?

The Problem With ‘White History Month’

As Americans enter February, which is Black History Month, many of us will inevitably hear (or consider for ourselves) why there’s a month dedicated to blacks (and for that matter women and Hispanics) but not to whites. Setting aside the fact that minority views are often underrepresented or marginalized in mainstream history and culture — hence the effort to highlight these perspectives with their own dedicated events and institutions — Mary-Alice Daniel of Salon offers another good reason, one which explores the U.S.’s unusual, complex, and largely unknown history of racial identity.

The very notion of whiteness is relatively recent in our human history, linked to the rise of European colonialism and the Atlantic slave trade in the 17th century as a way to distinguish the master from the slave. From its inception, “white” was not simply a separate race, but the superior race. “White people,” in opposition to non-whites or “colored” people, have constituted a meaningful social category for only a few hundred years, and the conception of who is included in that category has changed repeatedly. If you went back to even just the beginning of the last century, you’d witness a completely different racial configuration of whites and non-whites. The original white Americans — those from England, certain areas of Western Europe, and the Nordic States — excluded other European immigrants from that category to deny them jobs, social standing, and legal privileges. It’s not widely known in the U.S. that several ethnic groups, such as Germans, Italians, Russians and the Irish, were excluded from whiteness and considered non-white as recently as the early 20th century.

Members of these groups sometimes sued the state in order to be legally recognized as white, so they could access a variety of rights available only to whites — specifically American citizenship, which was then limited, by the U.S. Naturalization Law of 1790, to “free white persons” of “good character.” Attorney John Tehranian writes in the Yale Law Journal that petitioners could present a case based not on skin color, but on “religious practices, culture, education, intermarriage and [their] community’s role,” to try to secure their admission to this elite social group and its accompanying advantages.

More than color, it was class that defined race. For whiteness to maintain its superiority, membership had to be strictly controlled. The “gift” of whiteness was bestowed on those who could afford it, or when it was politically expedient. In his book “How the Irish Became White,” Noel Ignatiev argues that Irish immigrants were incorporated into whiteness in order to suppress the economic competitiveness of free black workers and undermine efforts to unite low-wage black and Irish Americans into an economic bloc bent on unionizing labor. The aspiration to whiteness was exploited to politically and socially divide groups that had more similarities than differences. It was an apple dangled in front of working-class immigrant groups, often as a reward for subjugating other groups.

A lack of awareness of these facts has lent credence to the erroneous belief that whiteness is inherent and has always existed, either as an actual biological difference or as a cohesive social grouping. Some still claim it is natural for whites to gravitate to their own and that humans are tribal and predisposed to congregate with their kind. It’s easy, simple and natural: White people have always been white people. Thinking about racial identity is for those other people.

Those who identify as white should start thinking about their inheritance of this identity and understand its implications. When what counts as your “own kind” changes so frequently and is so susceptible to contemporaneous political schemes, it becomes impossible to argue an innate explanation for white exclusion. Whiteness was never about skin color or a natural inclination to stand with one’s own; it was designed to racialize power and conveniently dehumanize outsiders and the enslaved. It has always been a calculated game with very real economic motivations and benefits.

This revelation should not function as an excuse for those in groups recently accepted as white to claim to understand racism, to absolve themselves of white privilege or to deny that their forefathers, while not considered white, were still, in the hierarchy created by whites, responsible in turn for oppressing those “lower” on the racial scale. During the Civil War, Irish immigrants were responsible for some of the most violent attacks against freedmen in the North, such as the wave of lynchings during the 1863 Draft Riots, in which “the majority of participants were Irish,” according to Eric Foner’s book “Reconstruction: America’s Unfinished Revolution, 1863-1877” and various other sources. According to historian Dominic Pacyga, Polish American groups in Chicago and Detroit “worked to prevent the integration of blacks into their communities by implementing rigid housing segregation” out of a fear that black people would “leap over them into a higher social status position.”

Behind every racial conversation is a complex history that extends to present-day interactions and policies, and we get nowhere fast if large swaths of our population have a limited frame of reference. An understanding of whiteness might have prevented the utter incapability of some Americans to realize that “Hispanic” is not a race — that white Hispanics do exist, George Zimmerman among them. This knowledge might have lessened the cries that Trayvon Martin’s murder could not have been racially motivated and might have led to, if not a just verdict, a less painfully ignorant response from many white Americans.

As for how all this ties into why a white history month would be wrongheaded and beside the point:

If students are taught that whiteness is based on a history of exclusion, they might easily see that there is nothing in the designation as “white” to be proud of. Being proud of being white doesn’t mean finding your pale skin pretty or your Swedish history fascinating. It means being proud of the violent disenfranchisement of those barred from this category. Being proud of being black means being proud of surviving this ostracism. Be proud to be Scottish, Norwegian or French, but not white.

Above all, such an education might help answer the question of whose problem modern racism really is. The current divide is a white construction, and it is up to white people to do the necessary work to dismantle the system borne from the slave trade, instead of ignoring it or telling people of color to “get over” its extant legacy. Critics of white studies have claimed that this kind of inquiry leads only to self-hatred and guilt. Leaving aside that avoiding self-reflection out of fear of bad feelings is the direct enemy of personal and intellectual growth, I agree that such an outcome should be resisted, because guilt is an unproductive emotion, and merely feeling guilty is satisfying enough for some. My hope in writing this is that white Americans will discover how it is they came to be set apart from non-whites and decide what they plan to do about it.

What do you think?

Map: U.S. Life Expectancy By State

Although the average American lives an impressive 30 years longer than a century ago — about 79.8 years in all — the U.S. still remains middle-of-the-road by global standards despite its great wealth; we typically rank in the mid-thirties among countries, on about the same level as Cuba, Chile, or Costa Rica. Furthermore, life expectancy varies wildly from state to state, as the following map from The Atlantic clearly shows:

Life expectancy by state compared to closest matching country.

There’s profound variation by state, from a low of 75 years in Mississippi to a high of 81.3 in Hawaii. Mostly, we resemble tiny, equatorial hamlets like Kuwait and Barbados. At our worst, we look more like Malaysia or Oman, and at our best, like the United Kingdom. No state approaches the life expectancies of most European countries or some Asian ones. Icelandic people can expect to live a long 83.3 years, and that’s nothing compared to the Japanese, who live well beyond 84.

Life expectancy can be causal, a factor of diet, environment, medical care, and education. But it can also be recursive: People who are chronically sick are less likely to become wealthy, and thus less likely to live in affluent areas and have access to the great doctors and Whole-Foods kale that would have helped them live longer.

It’s worth noting that the life expectancy for certain groups within the U.S. can be much higher—or lower—than the norm. The life expectancy for African Americans is, on average, 3.8 years shorter than that of whites. Detroit has a life expectancy of just 77.6 years, but that city’s Asian Americans can expect to live 89.3 years.

But overall, the map reflects what we’d expect: People in southern states, which generally have lower incomes and higher obesity rates, tend to die sooner, and healthier, richer states tend to foster longevity.

It’s also worth adding that overall, the U.S. is far less healthy and long-lived than it should be, even when you adjust for wealth, race, and other factors (e.g. young Americans are less healthy than young people in other developed countries, rich Americans are typically less healthy than rich people elsewhere, etc.).

When Mega-Cities Rule the World

The United States has always stood out among developed nations for its sheer size, in terms of territory, population, and urban centers. So perhaps it’s no surprise that we’ve seen the organic emergence of “mega-regions”: sprawling urban areas that span multiple countries, states, and municipalities, often for hundreds of miles. Needless to say, these megalopolises dominate (or even completely consume) their respective regions, and together they drive the nation’s economic, cultural, social, and political direction.

The following is a map created by the Regional Plan Association, an urban research institute in New York, identifying the eleven main ‘mega-regions’ that are transcending conventional cities and possibly even states.

To reiterate, the areas are Cascadia, Northern and Southern California, the Arizona Sun Corridor, the Front Range, the Texas Triangle, the Gulf Coast, the Great Lakes, the Northeast, Piedmont Atlantic, and peninsular Florida, my home state (and the only one that is almost entirely consumed by its own distinct mega-region).

Also note how some of these mega-regions spill over into neighboring Mexico and Canada, a transnational blending of urban regions that can be seen in many other developed countries (most notably in Europe, and the E.U. specifically). I’d be curious to see a similar map for other parts of the world, especially since developing countries such as China, India, and Brazil are leading the global trend of mass urbanization.

This intriguing map is part of the Regional Plan Association’s America 2050 project, which proposes that we stop viewing urban areas as distinct metropolitan areas and instead treat them as interconnected “megaregions” that act as distinct economic, social, and infrastructure zones in their own right.

These are the areas in which residents and policymakers are the most likely to have shared interests and policy goals, and would benefit most from cooperation with each other. This is especially important because, as the Regional Plan Association notes, “Our competitors in Asia and Europe are creating Global Integration Zones by linking specialized economic functions across vast geographic areas and national boundaries with high-speed rail and separated goods movement systems.”

Concentrating investment in these regions and linking them with improved infrastructure gives such megaregions competitive advantages in efficiency, time savings, and mobility.

The U.S., however, has long focused on individual metro areas, and the result has been a “limited capacity” to move goods quickly — a major liability threatening long-term economic goals. And while U.S. commuters are opting to drive less, public transportation isn’t even close to meeting their needs.

The Regional Plan Association proposes aggressive efforts to promote new construction, and finds that even existing lines are in desperate need of large-scale repairs or updates to improve service. In particular, they say the emerging megaregions need transportation modes that work at distances of 200–500 miles, such as high-speed rail.

While this makes sense, what are the consequences of having such potent sub-national entities emerging separately from already-established state and city limits? Should we, or will we, have to re-draw the map? Will these megaregions become the new powerhouses that influence the political and economic systems of the country at the expense of current representative structures? Will they coalesce into distinct interests that have their own separate political demands from the individual local and state governments that are wholly or partly covered by them?

Interesting questions to consider, especially in light of this being an accelerating global trend with little sign of stopping, let alone reversing. I’m reminded of Parag Khanna’s article, “When Cities Rule the World,” which argued that urban regions will come to dominate the world, ahead of — and often at the expense of — nation-states:

In this century, it will be the city—not the state—that becomes the nexus of economic and political power. Already, the world’s most important cities generate their own wealth and shape national politics as much as the reverse. The rise of global hubs in Asia is a much more important factor in the rebalancing of global power between West and East than the growth of Asian military power, which has been much slower. In terms of economic might, consider that just forty city-regions are responsible for over two-thirds of the total world economy and most of its innovation. To fuel further growth, an estimated $53 trillion will be invested in urban infrastructure in the coming two decades.

Given what we’ve seen with America’s megaregions, the prescient Mr. Khanna (who wrote this article three years ago) has a point. Here are some of his highlights regarding this trend and its implications:

Mega-cities have become global drivers because they are better understood as countries unto themselves. 20 million is no longer a superlative figure; now we need to get used to the nearly 100 million people clustered around Mumbai. Across India, it’s estimated that more than 275 million people will move into India’s teeming cities over the next two decades, a population equivalent to the U.S. Cairo’s urban development has stretched so far from the city’s core that it now encroaches directly on the pyramids, making them and the Sphinx commensurately less exotic. We should use the term “gross metropolitan product” to measure their output and appreciate the inequality they generate with respect to the rest of the country. They are markets in their own right, particularly when it comes to the “bottom of the pyramid,” which holds such enormous growth potential.

As cities rise in power, their mayors become ever more important in world politics. In countries where one city completely dominates the national economy, to be mayor of the capital is just one step below being head of state—and more figures make this leap than is commonly appreciated. From Willy Brandt to Jacques Chirac to Mahmoud Ahmadinejad, mayors have gone on to make their imprint on the world stage. In America, New York’s former mayor Rudy Giuliani made it to the final cut among Republican presidential candidates, and Michael Bloomberg is rumored to be considering a similar run once his unprecedented third term as Giuliani’s successor expires. In Brazil, José Serra, the governor of the São Paulo municipal region, lost the 2010 presidential election in a runoff vote. Serra rose to prominence in the early 1980s as the planning and economy minister of the state of São Paulo, and made his urban credentials the pillar of his candidacy.

It is too easy to claim, as many city critics do, that the present state of disrepair and pollution caused by many cities means suburbs will be the winner in the never-ending race to create suitable habitats for the world’s billions. In fact, it is urban centers—without which suburbs would have nothing to be “sub” to—where our leading experiments are taking place in zero-emissions public transport and buildings, and where the co-location of resources and ideas creates countless important and positive spillover effects. Perhaps most importantly, cities are a major population control mechanism: families living in cities have far fewer children. The enterprising research surrounding urban best practices is also a source of hope for the future of cities. Organizations like the New Cities Foundation, headquartered in Geneva, connect cities by way of convening and sharing knowledge related to sustainability, wealth creation, infrastructure finance, sanitation, smart grids, and healthcare. As this process advances and deepens, cities themselves become nodes in our global brain.

While most visions of the future imagine mega-corporations to be the entities that transcend nations and challenge them for supremacy, it may be these mega-regions or mega-cities that will be the true powerhouses of the world. In fact, we may even see something of a three-way struggle between all of these globalizing behemoths, as many nation-states also begin to band together to form more powerful blocs.

One thing is for certain: the future will be an interesting experiment in testing humanity’s organizational and technological prowess, especially in the midst of worsening environmental conditions and strained national resources, which such mega-regions will no doubt need to overcome. What are your thoughts?

Hat tip to my friend Will for sharing this article with me.

The Double Standard of Drug Addiction

Yesterday, Philip Seymour Hoffman — like, sadly, many other talented actors — died of an accidental drug overdose after years of struggles and relapses. His death has been universally mourned, including by yours truly. But like most high-profile deaths related to drugs, it exposes an even bigger tragedy: the unusual and ultimately counter-productive way in which society treats the subject of drug use. As Simon Jenkins of The Guardian succinctly observes:

Does the law also mourn? It lumps Hoffman together with thousands found dead and friendless in urban backstreets, also with needles in their arms. It treats them all as outlaws. Such is the double standard that now governs the regulation of addictive substances that we have had to develop separate universes of condemnation.

We cannot jail or otherwise hurl beyond the pale all who use drugs. We therefore treat some as “responsible users” and when something goes wrong mourn the tragedy. Offices, schools, hospitals, prisons, even parliament, are awash in illegal drug use. Their illegality is no deterrent. The courts could not handle proper enforcement, the prisons could not house the “criminals”. In Hoffman’s case his friends clearly knew that he was a drug addict. The police would have done nothing had they known.

So what do we do? We turn a blind eye to an unworkable law and assume it does not apply to people like us. We then relieve the implied guilt by taking draconian vengeance on those who supply drugs to those who need them, but who lack the friends and resources either to combat them or to avoid the law. Hospitals and police stations are littered each night with the wretched results.

There are no winners in the illegality of drugs, except the lucky ones who make money from it without getting caught. The only hope is that high-profile casualties such as Hoffman’s might lead a few legislators to see the damage done by these laws and correct their ways. At least in some American states the door of legalisation is now ajar. Not so in Britain, where the most raging addiction is inertia.

What do you think? Are drug-related deaths like Hoffman’s (among so many less visible ones) at least partly the result of a legal culture that criminalizes drugs and, by extension, their victims? Would legalizing or decriminalizing once-illicit substances help turn drug abuse into a public health problem to be addressed, rather than a crime to be unequally and ineffectively enforced? Evidence from some U.S. states, as well as countries around the world, suggests these steps would help to some extent. But what do you think?

Five Accurate Predictions By Karl Marx

Since the recession, and especially over the last couple of years, there’s been a flurry of articles that discuss (and, sure enough, reflect) the growing interest in Marx and his theories. Most of them explore whether Marxist ideology is still relevant, and/or discuss the increasing sympathy he’s garnering among people across the world.

Regardless of what you think of Marx or his ideas, it’s pretty interesting to see this once widely dismissed and highly controversial (at least in the U.S.) figure gain so much coverage, even from the likes of free-market or pro-capitalist publications. By the same token, a lot of once-fringe libertarian economists and thinkers are seeing their stars rise as well, in conjunction with the growth of libertarian movements in the U.S. (and, to a lesser degree, in other parts of the world).

Needless to say, desperate times have called for desperate measures, so to speak, as more and more people show a willingness to explore alternatives once seen as unnecessary or unthinkable. To a large extent, this is typical of any period of crisis, as people are shaken out of the complacency of a once-stable status quo and begin to ponder whether a better way is possible.

But that’s a discussion for another day. For now, consider this contentious post in Rolling Stone, which shares five of Marx’s observations that remain as relevant today as they were in his time (the nascent Industrial Revolution of the 19th century), if not more so.

1. The Great Recession (Capitalism’s Chaotic Nature)

The inherently chaotic, crisis-prone nature of capitalism was a key part of Marx’s writings. He argued that the relentless drive for profits would lead companies to mechanize their workplaces, producing more and more goods while squeezing workers’ wages until they could no longer purchase the products they created. Sure enough, modern historical events from the Great Depression to the dot-com bubble can be traced back to what Marx termed “fictitious capital” – financial instruments like stocks and credit-default swaps. We produce and produce until there is simply no one left to purchase our goods, no new markets, no new debts. The cycle is still playing out before our eyes: Broadly speaking, it’s what made the housing market crash in 2008. Decades of deepening inequality reduced incomes, which led more and more Americans to take on debt. When there were no subprime borrowers left to scheme, the whole façade fell apart, just as Marx knew it would.

2. The iPhone 5S (Imaginary Appetites)

Marx warned that capitalism’s tendency to concentrate high value on essentially arbitrary products would, over time, lead to what he called “a contriving and ever-calculating subservience to inhuman, sophisticated, unnatural and imaginary appetites.” It’s a harsh but accurate way of describing contemporary America, where we enjoy incredible luxury and yet are driven by a constant need for more and more stuff to buy. Consider the iPhone 5S you may own. Is it really that much better than the iPhone 5 you had last year, or the iPhone 4S a year before that? Is it a real need, or an invented one? While Chinese families fall sick with cancer from our e-waste, megacorporations are creating entire advertising campaigns around the idea that we should destroy perfectly good products for no reason. If Marx could see this kind of thing, he’d nod in recognition.

3. The IMF (The Globalization of Capitalism)

Marx’s ideas about overproduction led him to predict what is now called globalization – the spread of capitalism across the planet in search of new markets. “The need of a constantly expanding market for its products chases the bourgeoisie over the entire surface of the globe,” he wrote. “It must nestle everywhere, settle everywhere, establish connections everywhere.” While this may seem like an obvious point now, Marx wrote those words in 1848, when globalization was over a century away. And he wasn’t just right about what ended up happening in the late 20th century – he was right about why it happened: The relentless search for new markets and cheap labor, as well as the incessant demand for more natural resources, are beasts that demand constant feeding.

4. Walmart (Monopoly)

The classical theory of economics assumed that competition was natural and therefore self-sustaining. Marx, however, argued that market power would actually be centralized in large monopoly firms as businesses increasingly preyed upon each other. This might have struck his 19th-century readers as odd: As Richard Hofstadter writes, “Americans came to take it for granted that property would be widely diffused, that economic and political power would be decentralized.” It was only later, in the 20th century, that the trend Marx foresaw began to accelerate. Today, mom-and-pop shops have been replaced by monolithic big-box stores like Walmart, small community banks have been replaced by global banks like J.P. Morgan Chase and small farmers have been replaced by the likes of Archer Daniels Midland. The tech world, too, is already becoming centralized, with big corporations sucking up start-ups as fast as they can. Politicians give lip service to what minimal small-business lobby remains and prosecute the most violent of antitrust abuses – but for the most part, we know big business is here to stay.

5. Low Wages, Big Profits (The Reserve Army of Industrial Labor)

Marx believed that wages would be held down by a “reserve army of labor,” which he explained simply using classical economic techniques: Capitalists wish to pay as little as possible for labor, and this is easiest to do when there are too many workers floating around. Thus, after a recession, using a Marxist analysis, we would predict that high unemployment would keep wages stagnant as profits soared, because workers are too scared of unemployment to quit their terrible, exploitative jobs. And what do you know? No less an authority than the Wall Street Journal warns, “Lately, the U.S. recovery has been displaying some Marxian traits. Corporate profits are on a tear, and rising productivity has allowed companies to grow without doing much to reduce the vast ranks of the unemployed.” That’s because workers are terrified to leave their jobs and therefore lack bargaining power. It’s no surprise that the best time for equitable growth is during times of “full employment,” when unemployment is low and workers can threaten to take another job.

As always, I ask: what are your thoughts?

DNA Tests Reveal Ancient Europeans To Be Dark-Skinned

We’re accustomed to seeing portrayals of early humans (aka cavemen) as slightly tanned but otherwise mostly European-looking. But a recent study reported by NBC challenges that assumption, finding that as recently as 7,000 years ago, Europeans were as dark-skinned as Africans.

A 7,000-year-old European man from a transitional time known as the Mesolithic Period (from 10,000 to 5,000 years ago) whose bones were left behind in a Spanish cave had the dark skin of an African, but the blue eyes of a Scandinavian. He was a hunter-gatherer who ate a low-starch diet and couldn’t digest milk well — which meshes with the lifestyle that predated the rise of agriculture. But his immune system was already starting to adapt to a new lifestyle.

Researchers found all this out not from medical records, or from a study of the man’s actual skin or eyes, but from an analysis of the DNA extracted from his tooth.

The remains of the Mesolithic male, dubbed La Braña 1, were found in 2006 in the La Braña-Arintero cave complex in northwest Spain. In the Nature paper, the researchers describe how they isolated the ancient DNA, sequenced the genome and looked at key regions linked to physical traits — including lactose intolerance, starch digestion and immune response.

The biggest surprise was that the genes linked to skin pigmentation reflected African rather than modern European variations. That indicates that the man had dark skin, “although we cannot know the exact shade,” Carles Lalueza-Fox, a member of the research team from the Spanish National Research Council, said in a news release.

Meanwhile, The Guardian gets to another big, social implication:

Another surprise finding was that the man had blue eyes. That was unexpected, said Lalueza-Fox, because the mutation for blue eyes was thought to have arisen more recently than the mutations that cause lighter skin colour. The results suggest that blue eye colour came first in Europe, with the transition to lighter skin ongoing through Mesolithic times.

On top of the scientific impact, artists might have to rethink their drawings of the people. “You see a lot of reconstructions of these people hunting and gathering and they look like modern Europeans with light skin. You never see a reconstruction of a mesolithic hunter-gatherer with dark skin and blue eye colour,” Lalueza-Fox said. Details of the study are published in the journal, Nature.

It’s no secret (though perhaps underplayed) that modern humans originate from Africa, and thus would have had similar pigmentation and physiology to indigenous Africans (although note that Sub-Saharan Africa is the most genetically diverse region in the world, so there is no quintessential African look, and many different skin shades and phenotypes are represented).

However, the revelation that Europeans — fairly recently, by evolutionary standards — were indistinguishable from many modern Africans challenges popular attitudes towards race and human identity. We have a tendency to apply our modern biases to historical retrospection, and to over-emphasize physical differences that are superficial and ultimately artificial. Notions of race, nationhood, and what constitutes “European” or “African” are all social constructs of our very recent making.

Granted, this doesn’t mean that such concepts are worthless or negative, per se — although, needless to say, the potential for harm is great — but it does cloud the facts about humanity’s origins and history, and overlook how fundamentally arbitrary and transient our racial and national identities are.