Unfortunately, I’m working this Presidents Day — which honors the birthdays of both George Washington and Abraham Lincoln — so I’ve decided to just share this interesting article from HuffPost that offers at least one quirky fact about each president (Cleveland gets two, since he is the only president to have served two non-consecutive terms — there’s a fun fact!). Here are some of my favorites:
- Andrew Jackson had a pet parrot that he taught how to swear.
- Supposedly, President Van Buren popularized one of the most commonly used expressions in the English language: “OK”, or “Okay”. Van Buren was from Kinderhook, NY, which was also called “Old Kinderhook”. His supporters’ clubs came to be known as “O.K. Clubs”, and the term OK came to mean “all right”.
- When Abe Lincoln moved to New Salem, Illinois in 1831, he ran into a local bully named Jack Armstrong. Armstrong challenged Lincoln to a wrestling match outside of Denton Offutt’s store, where Lincoln was a clerk, and townspeople gathered to watch and wager on it. Lincoln won.
- Andrew Johnson was drunk during his inauguration (go figure, he’s considered one of the worst presidents in U.S. history).
- After leaving office, William Taft became the only ex-president to serve as Chief Justice of the Supreme Court, effectively becoming the only person to serve as the head of two branches of government. In doing so, he swore in both Calvin Coolidge and Herbert Hoover to the presidency. (On an unrelated note, he also lost 150 pounds after leaving office.)
- Woodrow Wilson remains the only president to have held a doctorate, making him the most highly educated president in U.S. history. He was awarded the degree in Political Science and History from Johns Hopkins University. He also passed the Georgia Bar Exam despite not finishing law school.
Enjoy and have a safe and happy Presidents Day!
As Americans enter February, which is Black History Month, many of us will inevitably hear (or consider for ourselves) why there’s a month dedicated to blacks (and for that matter women and Hispanics) but not to whites. Setting aside the fact that minority views are often underrepresented or marginalized in mainstream history and culture — hence the effort to highlight these perspectives with their own dedicated events and institutions — Mary-Alice Daniel of Salon offers another good reason, one which explores the U.S.’s unusual, complex, and largely unknown history of racial identity.
The very notion of whiteness is relatively recent in our human history, linked to the rise of European colonialism and the Atlantic slave trade in the 17th century as a way to distinguish the master from the slave. From its inception, “white” was not simply a separate race, but the superior race. “White people,” in opposition to non-whites or “colored” people, have constituted a meaningful social category for only a few hundred years, and the conception of who is included in that category has changed repeatedly. If you went back to even just the beginning of the last century, you’d witness a completely different racial configuration of whites and non-whites. The original white Americans — those from England, certain areas of Western Europe, and the Nordic States — excluded other European immigrants from that category to deny them jobs, social standing, and legal privileges. It’s not widely known in the U.S. that several ethnic groups, such as Germans, Italians, Russians and the Irish, were excluded from whiteness and considered non-white as recently as the early 20th century.
Members of these groups sometimes sued the state in order to be legally recognized as white, so they could access a variety of rights available only to whites — specifically American citizenship, which was then limited, by the U.S. Naturalization Law of 1790, to “free white persons” of “good character.” Attorney John Tehranian writes in the Yale Law Journal that petitioners could present a case based not on skin color, but on “religious practices, culture, education, intermarriage and [their] community’s role,” to try to secure their admission to this elite social group and its accompanying advantages.
More than color, it was class that defined race. For whiteness to maintain its superiority, membership had to be strictly controlled. The “gift” of whiteness was bestowed on those who could afford it, or when it was politically expedient. In his book “How the Irish Became White,” Noel Ignatiev argues that Irish immigrants were incorporated into whiteness in order to suppress the economic competitiveness of free black workers and undermine efforts to unite low-wage black and Irish Americans into an economic bloc bent on unionizing labor. The aspiration to whiteness was exploited to politically and socially divide groups that had more similarities than differences. It was an apple dangled in front of working-class immigrant groups, often as a reward for subjugating other groups.
A lack of awareness of these facts has lent credence to the erroneous belief that whiteness is inherent and has always existed, either as an actual biological difference or as a cohesive social grouping. Some still claim it is natural for whites to gravitate to their own and that humans are tribal and predisposed to congregate with their kind. It’s easy, simple and natural: White people have always been white people. Thinking about racial identity is for those other people.
Those who identify as white should start thinking about their inheritance of this identity and understand its implications. When what counts as your “own kind” changes so frequently and is so susceptible to contemporaneous political schemes, it becomes impossible to argue an innate explanation for white exclusion. Whiteness was never about skin color or a natural inclination to stand with one’s own; it was designed to racialize power and conveniently dehumanize outsiders and the enslaved. It has always been a calculated game with very real economic motivations and benefits.
This revelation should not function as an excuse for those in groups recently accepted as white to claim to understand racism, to absolve themselves of white privilege or to deny that their forefathers, while not considered white, were still, in the hierarchy created by whites, responsible in turn for oppressing those “lower” on the racial scale. During the Civil War, Irish immigrants were responsible for some of the most violent attacks against freedmen in the North, such as the wave of lynchings during the 1863 Draft Riots, in which “the majority of participants were Irish,” according to Eric Foner’s book “Reconstruction: America’s Unfinished Revolution, 1863-1877” and various other sources. According to historian Dominic Pacyga, Polish American groups in Chicago and Detroit “worked to prevent the integration of blacks into their communities by implementing rigid housing segregation” out of a fear that black people would “leap over them into a higher social status position.”
Behind every racial conversation is a complex history that extends to present-day interactions and policies, and we get nowhere fast if large swaths of our population have a limited frame of reference. An understanding of whiteness might have prevented the utter incapability of some Americans to realize that “Hispanic” is not a race — that white Hispanics do exist, George Zimmerman among them. This knowledge might have lessened the cries that Trayvon Martin’s murder could not have been racially motivated and might have led to, if not a just verdict, a less painfully ignorant response from many white Americans.
As for how all this ties into why a white history month would be wrongheaded and beside the point:
If students are taught that whiteness is based on a history of exclusion, they might easily see that there is nothing in the designation as “white” to be proud of. Being proud of being white doesn’t mean finding your pale skin pretty or your Swedish history fascinating. It means being proud of the violent disenfranchisement of those barred from this category. Being proud of being black means being proud of surviving this ostracism. Be proud to be Scottish, Norwegian or French, but not white.
Above all, such an education might help answer the question of whose problem modern racism really is. The current divide is a white construction, and it is up to white people to do the necessary work to dismantle the system born of the slave trade, instead of ignoring it or telling people of color to “get over” its extant legacy. Critics of white studies have claimed that this kind of inquiry leads only to self-hatred and guilt. Leaving aside that avoiding self-reflection out of fear of bad feelings is the direct enemy of personal and intellectual growth, I agree that such an outcome should be resisted, because guilt is an unproductive emotion, and merely feeling guilty is satisfying enough for some. My hope in writing this is that white Americans will discover how it is they came to be set apart from non-whites and decide what they plan to do about it.
What do you think?
Although the average American now lives an impressive 30 years longer than a century ago — about 79.8 years — the U.S. remains middle-of-the-road by global standards despite its great wealth: we typically rank in the mid-thirties worldwide, at about the same level as Cuba, Chile, or Costa Rica. Furthermore, life expectancy varies wildly from state to state, as the following map from The Atlantic clearly shows:
There’s profound variation by state, from a low of 75 years in Mississippi to a high of 81.3 in Hawaii. Mostly, we resemble tiny, equatorial hamlets like Kuwait and Barbados. At our worst, we look more like Malaysia or Oman, and at our best, like the United Kingdom. No state approaches the life expectancies of most European countries or some Asian ones. Icelandic people can expect to live a long 83.3 years, and that’s nothing compared to the Japanese, who live well beyond 84.
Life expectancy can be causal: a function of diet, environment, medical care, and education. But it can also be recursive: People who are chronically sick are less likely to become wealthy, and thus less likely to live in affluent areas and have access to the great doctors and Whole-Foods kale that would have helped them live longer.
It’s worth noting that the life expectancy for certain groups within the U.S. can be much higher—or lower—than the norm. The life expectancy for African Americans is, on average, 3.8 years shorter than that of whites. Detroit has a life expectancy of just 77.6 years, but that city’s Asian Americans can expect to live 89.3 years.
But overall, the map reflects what we’d expect: People in southern states, which generally have lower incomes and higher obesity rates, tend to die sooner, and healthier, richer states tend to foster longevity.
It’s also worth adding that overall, the U.S. is far less healthy and long-lived than it should be, even when you adjust for wealth, race, and other factors (e.g., young Americans are less healthy than young people in other developed countries, and rich Americans are typically less healthy than their rich counterparts abroad).
The United States has always stood out among developed nations for its sheer size, in terms of territory, population, and urban centers. So perhaps it’s no surprise that we’ve seen the organic emergence of “mega-regions”: sprawling urban areas that span multiple municipalities, states, and even countries, often for hundreds of miles. Needless to say, these megalopolises dominate (or even completely consume) their respective regions, and together they drive the nation’s economic, cultural, social, and political direction.
The following is a map created by the Regional Plan Association, an urban research institute in New York, identifying the eleven main ‘mega-regions’ that are transcending both conventional cities and possibly even states.
To reiterate, the areas are Cascadia, Northern and Southern California, the Arizona Sun Corridor, the Front Range, the Texas Triangle, the Gulf Coast, the Great Lakes, the Northeast, Piedmont Atlantic, and peninsular Florida, my home state (and the only one that is almost entirely consumed by its own distinct mega-region).
Also note how some of these mega-regions spill over into neighboring Mexico and Canada, a transnational blending of urban regions that can be seen in many other developed countries (most notably in Europe, and the E.U. specifically). I’d be curious to see a similar map for other parts of the world, especially since developing countries such as China, India, and Brazil are leading the global trend of mass urbanization.
This intriguing map is part of the Regional Plan Association’s America 2050 project, which proposes that we stop viewing urban areas as distinct metropolitan centers and instead see them as interconnected “megaregions” that act as distinct economic, social, and infrastructure zones in their own right.
These are the areas in which residents and policymakers are most likely to share common interests and policy goals, and which would benefit most from cooperation with one another. This is especially important because, as the Regional Plan Association notes, “Our competitors in Asia and Europe are creating Global Integration Zones by linking specialized economic functions across vast geographic areas and national boundaries with high-speed rail and separated goods movement systems.”
By concentrating investment in these regions and linking them with improved infrastructure, the country can give such megaregions competitive advantages in efficiency, time savings, and mobility.
The U.S., however, has long focused on individual metro areas, and the result has been a “limited capacity” to move goods quickly — a major liability that threatens long-term economic goals. And while U.S. commuters are opting to drive less, public transportation isn’t even close to meeting their needs.
The Regional Plan Association proposes aggressive efforts to promote new construction, and finds that even existing lines are in desperate need of large-scale repairs or upgrades to improve service. In particular, it says the emerging megaregions need transportation modes that work at distances of 200 to 500 miles, such as high-speed rail.
While this makes sense, what are the consequences of having such potent sub-national entities emerging separately from already-established state and city limits? Should we, or will we, have to re-draw the map? Will these megaregions become the new powerhouses that influence the political and economic systems of the country at the expense of current representative structures? Will they coalesce into distinct interests that have their own separate political demands from the individual local and state governments that are wholly or partly covered by them?
Interesting questions to consider, especially in light of this being an accelerating global trend with little sign of stopping, let alone reversing. I’m reminded of Parag Khanna’s article, “When Cities Rule the World,” which argued that urban regions will come to dominate the world, ahead of — and often at the expense of — nation states:
In this century, it will be the city—not the state—that becomes the nexus of economic and political power. Already, the world’s most important cities generate their own wealth and shape national politics as much as the reverse. The rise of global hubs in Asia is a much more important factor in the rebalancing of global power between West and East than the growth of Asian military power, which has been much slower. In terms of economic might, consider that just forty city-regions are responsible for over two-thirds of the total world economy and most of its innovation. To fuel further growth, an estimated $53 trillion will be invested in urban infrastructure in the coming two decades.
Given what we’ve seen with America’s megaregions, the prescient Mr. Khanna (who wrote this article three years ago) has a point. Here are some of his highlights regarding this trend and its implications:
Mega-cities have become global drivers because they are better understood as countries unto themselves. 20 million is no longer a superlative figure; now we need to get used to the nearly 100 million people clustered around Mumbai. Across India, it’s estimated that more than 275 million people will move into India’s teeming cities over the next two decades, a population equivalent to the U.S. Cairo’s urban development has stretched so far from the city’s core that it now encroaches directly on the pyramids, making them and the Sphinx commensurately less exotic. We should use the term “gross metropolitan product” to measure their output and appreciate the inequality they generate with respect to the rest of the country. They are markets in their own right, particularly when it comes to the “bottom of the pyramid,” which holds such enormous growth potential.
As cities rise in power, their mayors become ever more important in world politics. In countries where one city completely dominates the national economy, to be mayor of the capital is just one step below being head of state—and more figures make this leap than is commonly appreciated. From Willy Brandt to Jacques Chirac to Mahmoud Ahmadinejad, mayors have gone on to make their imprint on the world stage. In America, New York’s former mayor Rudy Giuliani made it to the final cut among Republican presidential candidates, and Michael Bloomberg is rumored to be considering a similar run once his unprecedented third term as Giuliani’s successor expires. In Brazil, José Serra, the governor of the São Paulo municipal region, lost the 2010 presidential election in a runoff vote. Serra rose to prominence in the early 1980s as the planning and economy minister of the state of São Paulo, and made his urban credentials the pillar of his candidacy.
It is too easy to claim, as many city critics do, that the present state of disrepair and pollution caused by many cities means suburbs will be the winner in the never-ending race to create suitable habitats for the world’s billions. In fact, it is urban centers—without which suburbs would have nothing to be “sub” to—where our leading experiments are taking place in zero-emissions public transport and buildings, and where the co-location of resources and ideas creates countless important and positive spillover effects. Perhaps most importantly, cities are a major population control mechanism: families living in cities have far fewer children. The enterprising research surrounding urban best practices is also a source of hope for the future of cities. Organizations like the New Cities Foundation, headquartered in Geneva, connect cities by way of convening and sharing knowledge related to sustainability, wealth creation, infrastructure finance, sanitation, smart grids, and healthcare. As this process advances and deepens, cities themselves become nodes in our global brain.
While most visions of the future imagine mega-corporations to be the entities that transcend nations and challenge them for supremacy, it may be these mega-regions or mega-cities that will be the true powerhouses of the world. In fact, we may even see something of a three-way struggle between all of these globalizing behemoths, as many nation-states also begin to band together to form more powerful blocs.
One thing is certain: the future will be an interesting experiment in testing humanity’s organizational and technological prowess, especially in the midst of worsening environmental conditions and strained national resources, which such mega-regions will no doubt need to overcome. What are your thoughts?
Hat tip to my friend Will for sharing this article with me.
Yesterday, the North African nation of Tunisia, which overthrew its autocratic ruler in 2011 and served as a catalyst for the Arab Spring, passed one of the most progressive constitutions in the world, with only 12 out of its 216 legislators voting against.
The new constitution explicitly guarantees women’s rights and gender equality; mandates environmental protection and water conservation (only the third country in the world to do so); declares healthcare a human right; reaffirms the democratic, secular, and civil nature of its government; officially respects freedom of religion; establishes a right to due process and protection from torture; and promotes workers’ rights. Needless to say, they were keen on celebrating:
Furthermore, the government has agreed to step down in favor of a technocratic caretaker administration that will be in place until proper elections can be held later this year. I can only hope that after two years of instability and tenuous peace, this historic achievement will amount to long-term change for both the country and the wider region.
Show your support for these brave reformers here, and let’s wish them well.
Think Progress has posted about a recent report by Citizens for Tax Justice (available here in PDF) that examines half of the Fortune 500 companies based in the U.S. Unsurprisingly, it found that many industries — including the richest and most profitable — receive the biggest subsidies, starting with financial and energy companies.
In fact, a whopping 56 percent of total tax subsidies went to the following four industries: financial, utilities, telecommunications, and resource extraction (oil, gas, and pipelines). You can get a more detailed picture from the following chart (which shows all the private-sector interests that benefit from tax subsidies versus their effective average tax rates).
While ultimately not surprising, the details are nonetheless upsetting, especially considering that most of these industries are far from troubled enough to warrant any taxpayer support (be it in terms of tax relief or direct financial transfers).
Yesterday, Philip Seymour Hoffman — like too many other talented actors before him — died of an accidental drug overdose after years of struggles and relapses. His death has been universally mourned, including by yours truly. But like most high-profile deaths related to drugs, it exposes an even bigger tragedy: the unusual and ultimately counterproductive way in which society treats the subject of drug use. As Simon Jenkins of The Guardian succinctly observes:
Does the law also mourn? It lumps Hoffman together with thousands found dead and friendless in urban backstreets, also with needles in their arms. It treats them all as outlaws. Such is the double standard that now governs the regulation of addictive substances that we have had to develop separate universes of condemnation.
We cannot jail or otherwise hurl beyond the pale all who use drugs. We therefore treat some as “responsible users” and when something goes wrong mourn the tragedy. Offices, schools, hospitals, prisons, even parliament, are awash in illegal drug use. Their illegality is no deterrent. The courts could not handle proper enforcement, the prisons could not house the “criminals”. In Hoffman’s case his friends clearly knew that he was a drug addict. The police would have done nothing had they known.
So what do we do? We turn a blind eye to an unworkable law and assume it does not apply to people like us. We then relieve the implied guilt by taking draconian vengeance on those who supply drugs to those who need them, but who lack the friends and resources either to combat them or to avoid the law. Hospitals and police stations are littered each night with the wretched results.
There are no winners in the illegality of drugs, except the lucky ones who make money from it without getting caught. The only hope is that high-profile casualties such as Hoffman’s might lead a few legislators to see the damage done by these laws and correct their ways. At least in some American states the door of legalisation is now ajar. Not so in Britain, where the most raging addiction is inertia.
What do you think? Are drug-related deaths like Hoffman’s (among so many less visible ones) at least partly the result of a legal culture that criminalizes drugs, and by extension their victims? Would legalizing or decriminalizing once-illicit substances help turn drug abuse into a public health problem to be addressed, rather than a crime to be unequally and ineffectively enforced? Evidence from some U.S. states, as well as countries around the world, suggests these steps would help to some extent. But what do you think?
Since the recession, and especially over the last couple of years, there’s been a flurry of articles that discuss (and, sure enough, reflect) the growing interest in Marx and his theories. Most of them explore whether Marxist ideology is still relevant, discuss the increasing sympathy he’s garnering among people across the world, or both.
Regardless of what you think of Marx or his ideas, it’s pretty interesting to see this once-widely dismissed and highly controversial (at least in the U.S.) figure gain so much coverage even from the likes of free-market or pro-capitalist publications. By the same token, a lot of once-fringe libertarian economists and thinkers are seeing their stars rise as well, in conjunction with the growth of libertarian movements in the U.S. (and to a lesser degree other parts of the world).
Needless to say, desperate times have called for desperate measures, so to speak, as more and more people show a willingness to explore alternatives once seen as unnecessary or unthinkable. To a large extent, this is typical in any period of crisis, as people are awakened from the once stable status quo that made them complacent and begin to ponder whether a better way is possible.
But that’s a discussion for another day. For now, consider this contentious post in Rolling Stone, which shares five of Marx’s observations that remain as relevant today as they were in his time (the nascent Industrial Revolution of the 19th century), if not more so.
1. The Great Recession (Capitalism’s Chaotic Nature)
The inherently chaotic, crisis-prone nature of capitalism was a key part of Marx’s writings. He argued that the relentless drive for profits would lead companies to mechanize their workplaces, producing more and more goods while squeezing workers’ wages until they could no longer purchase the products they created. Sure enough, modern historical events from the Great Depression to the dot-com bubble can be traced back to what Marx termed “fictitious capital” – financial instruments like stocks and credit-default swaps. We produce and produce until there is simply no one left to purchase our goods, no new markets, no new debts. The cycle is still playing out before our eyes: Broadly speaking, it’s what made the housing market crash in 2008. Decades of deepening inequality reduced incomes, which led more and more Americans to take on debt. When there were no subprime borrowers left to scheme, the whole façade fell apart, just as Marx knew it would.
2. The iPhone 5S (Imaginary Appetites)
Marx warned that capitalism’s tendency to concentrate high value on essentially arbitrary products would, over time, lead to what he called “a contriving and ever-calculating subservience to inhuman, sophisticated, unnatural and imaginary appetites.” It’s a harsh but accurate way of describing contemporary America, where we enjoy incredible luxury and yet are driven by a constant need for more and more stuff to buy. Consider the iPhone 5S you may own. Is it really that much better than the iPhone 5 you had last year, or the iPhone 4S a year before that? Is it a real need, or an invented one? While Chinese families fall sick with cancer from our e-waste, megacorporations are creating entire advertising campaigns around the idea that we should destroy perfectly good products for no reason. If Marx could see this kind of thing, he’d nod in recognition.
3. The IMF (The Globalization of Capitalism)
Marx’s ideas about overproduction led him to predict what is now called globalization – the spread of capitalism across the planet in search of new markets. “The need of a constantly expanding market for its products chases the bourgeoisie over the entire surface of the globe,” he wrote. “It must nestle everywhere, settle everywhere, establish connections everywhere.” While this may seem like an obvious point now, Marx wrote those words in 1848, when globalization was over a century away. And he wasn’t just right about what ended up happening in the late 20th century – he was right about why it happened: The relentless search for new markets and cheap labor, as well as the incessant demand for more natural resources, are beasts that demand constant feeding.
4. Walmart (Monopoly)
The classical theory of economics assumed that competition was natural and therefore self-sustaining. Marx, however, argued that market power would actually be centralized in large monopoly firms as businesses increasingly preyed upon each other. This might have struck his 19th-century readers as odd: As Richard Hofstadter writes, “Americans came to take it for granted that property would be widely diffused, that economic and political power would be decentralized.” It was only later, in the 20th century, that the trend Marx foresaw began to accelerate. Today, mom-and-pop shops have been replaced by monolithic big-box stores like Walmart, small community banks have been replaced by global banks like J.P. Morgan Chase and small farmers have been replaced by the likes of Archer Daniels Midland. The tech world, too, is already becoming centralized, with big corporations sucking up start-ups as fast as they can. Politicians give lip service to what minimal small-business lobby remains and prosecute the most violent of antitrust abuses – but for the most part, we know big business is here to stay.
5. Low Wages, Big Profits (The Reserve Army of Industrial Labor)
Marx believed that wages would be held down by a “reserve army of labor,” which he explained simply using classical economic techniques: Capitalists wish to pay as little as possible for labor, and this is easiest to do when there are too many workers floating around. Thus, after a recession, using a Marxist analysis, we would predict that high unemployment would keep wages stagnant as profits soared, because workers are too scared of unemployment to quit their terrible, exploitative jobs. And what do you know? No less an authority than the Wall Street Journal warns, “Lately, the U.S. recovery has been displaying some Marxian traits. Corporate profits are on a tear, and rising productivity has allowed companies to grow without doing much to reduce the vast ranks of the unemployed.” That’s because workers are terrified to leave their jobs and therefore lack bargaining power. It’s no surprise that the best time for equitable growth is during times of “full employment,” when unemployment is low and workers can threaten to take another job.
As always, I ask: what are your thoughts?
It seems deceptively obvious, doesn’t it? Poverty is the absence of wealth, so the solution is simply to give the poor money. The problem is that, in addition to the misery that comes with scarcity, the poor suffer the added stigma of victim-blaming: their economic state is widely seen as a personal failing, a product of laziness, irresponsibility, or stupidity (especially among Americans).
But if one accepts the fact that poor people are no more or less likely to be savvy with money than the rich, then it simply becomes a matter of boosting their material conditions, albeit in a far less paternalistic and bureaucratic fashion than is typically prescribed. Indeed, traditional approaches to welfare are no more effective than the Right’s contention that the poor would be better off in a freer market (or spurred into action by having their benefits cut). As Bloomberg Businessweek — hardly a leftist source — reports:
A growing number of studies suggest…that just handing over cash even to some of the world’s poorest people actually does have a considerable and long-lasting positive impact on their incomes, employment, health, and education. And that suggests we should update both our attitudes about poor people and our poverty reduction programs.
In 2008, the Ugandan government handed out cash transfers worth $382, about a year’s income, to thousands of poor 16- to 35-year-olds. The money came with few strings—recipients only had to explain how they would use the money to start a trade. Columbia University’s Chris Blattman and his co-authors found that, four years after receiving the cash, recipients were two-thirds more likely to be practicing a trade than non-recipients, and their earnings were more than 40 percent higher. They were also about 40 percent more likely to be paying taxes.
In a second study, Blattman and colleagues looked at a program that gave $150 cash grants to 1,800 of the very poorest women in northern Uganda. Most began some sort of retail operation to supplement their income, and within a year their monthly earnings had doubled and cash savings tripled. The impact was pretty much the same whether or not participants received mentoring; business training added some value, but handing over the money it cost to provide would have added more.
Findings from around the world suggest that giving cash over goods or in-kind transfers is cheaper and more cost-effective, too. Economist Jenny Aker has found that cash transfers are better used than food vouchers in a comparison in the Democratic Republic of the Congo. Unsurprisingly, giving people a food voucher means they purchase more food than they do if you give them cash. But give them cash and they are able to save some of the money and pay school fees, all while consuming as diverse a diet as those who got vouchers. And the cash-transfer program is considerably less expensive to run.
Keep in mind that these are societies where poverty is widespread and endemic, and yet still most recipients knew how to use their money effectively. A presumed “culture of poverty” did nothing to undermine their ability to be self-sufficient when given the opportunity. This success isn’t limited to the developing world either:
Back in the 1970s, the U.S. federal government experimented with a “negative income tax” that guaranteed an income to thousands of randomly selected low-income recipients. (Think of today’s Earned Income Tax Credit, only without the requirement to earn income.) The results suggested that the transfers improved test scores and school attendance for the children of recipients, reduced prevalence of low-birth-weight kids, and increased homeownership. Early analysis of a 2007 cash transfer program in New York City suggested that transfers averaging $6,000 per family conditional on employment, preventative health care, and children’s educational attendance led to reduced poverty and hunger, improved school attendance and grade advancement, reduced health-care hardships, and increased savings.
Canada, too, has experimented with unconditional cash transfers, with similar success. It seems that no matter the culture or society, most individuals will use whatever resources they have at their disposal as effectively as possible (or at least try to, which itself undermines the assumptions made about the competence of the poor).
Most cash-transfer programs do impose conditions—like requiring kids to go to school or get vaccinated, which does improve school attendance and vaccination rates considerably. But Blattman’s research suggests conditions aren’t necessary to improve the quality of life of poor families. In fact, while analysis by the World Bank’s Berk Ozler shows that making cash transfers conditional on kids being in school has a bigger impact than a no-strings-attached check, even “condition-less cash” considerably raises enrollment. Conditional programs increase the odds of a child being in school by 41 percent; unconditional programs, 23 percent. Other studies of cash transfers in developing countries have found a range of impacts that had little or nothing to do with any conditions applied: lower crime rates, improved child nutrition and child health, lower child mortality, improved odds of kids being in school, and declines in early marriage and teenage pregnancy.
So even the fairly successful conditional cash transfers implemented in places like Brazil and Mexico are, in a sense, unnecessary. While they’re definitely great steps, ultimately most poor people don’t need to be told the best way to spend their money. Indeed, as the article concludes:
It is comfortable for richer people to think they are richer because of the moral failings of the poor. And that justifies a paternalistic approach to poverty relief using vouchers and in-kind support. But the big reason poor people are poor is because they don’t have enough money, and it shouldn’t come as a huge surprise that giving them money is a great way to reduce that problem—considerably more cost-effectively than paternalism. So let’s abandon the huge welfare bureaucracy and just give money to those we should help out.
Of course some will inevitably squander it out of greed, negligence, or simple error — and again, they won’t do this any more than many wealthier people do — but by and large, the majority will put it to good, sustainable use. They’ll put it into the economy, which is driven by consumer demand, or into small businesses and education, which will also benefit the economy. In essence, such cash transfers are an investment.
Obviously, such an approach won’t resolve the systemic factors responsible for poverty — the lack of well-paying jobs, an economy driven too much by short-term profit and consumerism, the rising cost of education and healthcare, and so on — but it’s a fairly simple start, and the money wasted on inefficient programs — among other things — might be better spent going straight into the hands of poor people just waiting to tap into their own potential.
What do you think?