The Bootstraps Myth

From Melissa McEwan of the blog Shakesville:

The Myth of Bootstraps goes something like this: I never got any help from anyone. I achieved my American Dream all on my own, through hard work. I got an education, I saved my money, I worked hard, I took risks, and I never complained or blamed anyone else when I failed, and every time I fell, I picked myself up by my bootstraps and just worked even harder. No one helped me.

This is almost always a lie.

There are vanishingly few people who have never had help from anyone—who never had family members who helped them, or friends, or colleagues, or teachers. 

Who never benefited from government programs that made sure they had electricity, or mail, or passable roads, or clean drinking water, or food, or shelter, or healthcare, or a loan. 

Who never had any kind of privilege from which they benefited, even if they didn’t actively try to trade on it. 

Who never had an opportunity they saw as luck which was really someone, somewhere, making a decision that benefited them. 

Who never had friends to help them move, so they didn’t have to pay for movers. Who never inherited a couch, so they didn’t have to pay for a couch. Who never got hand-me-down clothes from a cousin, so their parents could afford piano lessons. Who never had shoes that fit and weren’t leaky, when the kid down the street didn’t.

Most, maybe all, of the people who say they never got any help from anyone are taking a lot of help for granted.

They imagine that everyone has the same basic foundations that they had—and, if you point out to them that these kids over here live in an area rife with environmental pollutants that have been shown to affect growth or brain function or breathing capacity, they will simply sniff with indifference and declare that those things don’t matter. That government regulations which protect some living spaces and abandon others to poisons isn’t help. 

The government giving you money to eat is a hand-out. The government giving you regulations that protect the air you breathe is, at best, nothing of value—and, at worst, a job-killing regulation that impedes the success of people who want to get rich dumping toxins into the ground where people getting hand-outs live.

What are your thoughts?

The James Bond of Philanthropy

In my view, with great wealth comes great responsibility. It gives you the capacity to do far more good or harm in the world than the overwhelming majority of your fellow humans. A little-known Irish-American businessman named Chuck Feeney exemplifies the incredible moral potential that the world’s richest can exercise if they so choose. Forbes did a piece on this amazing philanthropist in 2012, likening him to James Bond for his uniquely low-key and strategic approach to charitable giving:

Over the last 30 years he’s crisscrossed the globe conducting a clandestine operation to give away a $7.5 billion fortune derived from hawking cognac, perfume and cigarettes in his empire of duty-free shops. His foundation, the Atlantic Philanthropies, has funneled $6.2 billion into education, science, health care, aging and civil rights in the U.S., Australia, Vietnam, Bermuda, South Africa and Ireland. Few living people have given away more, and no one at his wealth level has ever given their fortune away so completely during their lifetime. The remaining $1.3 billion will be spent by 2016, and the foundation will be shuttered in 2020. While the business world’s titans obsess over piling up as many riches as possible, Feeney is working double time to die broke.

Feeney embarked on this mission in 1984, in the middle of a decade marked by wealth creation–and conspicuous consumption–when he slyly transferred his entire 38.75% ownership stake in Duty Free Shoppers to what became the Atlantic Philanthropies. “I concluded that if you hung on to a piece of the action for yourself you’d always be worrying about that piece,” says Feeney, who estimates his current net worth at $2 million (with an “m”). “People used to ask me how I got my jollies, and I guess I’m happy when what I’m doing is helping people and unhappy when what I’m doing isn’t helping people.”

What Feeney does is give big money to big problems–whether bringing peace to Northern Ireland, modernizing Vietnam’s health care system or seeding $350 million to turn New York’s long-neglected Roosevelt Island into a technology hub. He’s not waiting to grant gifts after he’s gone nor to set up a legacy fund that annually tosses pennies at a $10 problem. He hunts for causes where he can have dramatic impact and goes all-in. “Chuck Feeney is a remarkable role model,” Bill Gates tells FORBES, “and the ultimate example of giving while living.”

I highly recommend reading the rest of the article, as it goes on to discuss the nuances of Feeney’s character and his rather sophisticated philanthropic methods. The amount of wealth he is donating, in both proportional and absolute terms, is staggering enough even before you factor in his humility and strategic approach.

It is unfortunate that amid ever-higher rates of inequality — best epitomized by the fact that a mere 85 individuals own as much wealth as the poorest half of the world’s population (around 3.5 billion people) — most of the world’s elites aren’t following in Feeney’s footsteps, or at the very least donating more than a token percentage of their assets. There’s a lot of untapped potential out there, and even many of us who are merely comfortably well-off could be doing more.

Low-Wage Work Becomes The New Normal

It’s been well documented that the recession eliminated much of the already-declining stock of well-paying jobs, and that the (still-anemic) growth in employment since then has occurred overwhelmingly in low-paying sectors. Now the latest report from the Bureau of Labor Statistics (BLS), courtesy of Business Insider, further underscores this troubling trend, revealing that nearly all of the ten most common jobs in America are low-paying.

So retail and food (where most cashiers work) represent the lion’s share of new job growth.

To emphasize just how low-paying most of these occupations are, consider how their annual mean pay compares with the overall national mean wage (i.e. the mean across all U.S. occupations combined).

Registered nurses are the only ones doing fairly well, on average. Most of the other common jobs fall well short of the annual mean wage, with the three most common paying around half of it or less. Needless to say, this is a troubling development. While how far a low wage stretches does vary by state, county, or city, in many parts of the country you can’t get by for long on these positions alone (which, by the way, typically lack benefits, paid sick leave, and full-time hours).
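
To put rough numbers on that comparison, here’s a quick sketch using approximate mean annual wages from the May 2013 BLS release; the figures are rounded ballpark values for illustration, not exact data:

```python
# Approximate mean annual wages (May 2013 BLS data, rounded; illustrative only).
national_mean = 46_440  # mean annual wage across all U.S. occupations

common_jobs = {
    "Retail salespersons": 25_370,
    "Cashiers": 20_420,
    "Food prep and serving workers": 18_880,
    "Registered nurses": 68_910,
}

for job, wage in common_jobs.items():
    print(f"{job}: ${wage:,} (~{wage / national_mean:.0%} of the national mean)")
```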

It’s also worth pointing out that, given the decline in the minimum wage’s value — adjusted for inflation, it’s actually worth less today than it was in 1968 — a lot of these menial, currently low-paying positions once offered a decent standard of living (at least relative to what they offer now). With the application of new technology and various administrative changes within businesses (outsourcing, streamlined management, and so on), the economy seems to have reached a point where there just isn’t much demand for workers outside of food, consumer goods, medical care, and the like; and even then, we need a lot more cashiers and cooks than we do doctors, managers, and lawyers — hence all the growth in the former jobs compared to the latter.
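
For a rough sense of that inflation claim, here’s a back-of-the-envelope sketch; the CPI figures below are approximate annual averages, so the result is only a ballpark estimate:

```python
# Back-of-the-envelope inflation adjustment for the federal minimum wage.
min_wage_1968 = 1.60  # nominal federal minimum wage in 1968, dollars/hour
cpi_1968 = 34.8       # approximate CPI-U annual average, 1968
cpi_2013 = 233.0      # approximate CPI-U annual average, 2013

adjusted = min_wage_1968 * (cpi_2013 / cpi_1968)
print(f"1968 minimum wage in ~2013 dollars: ${adjusted:.2f}")  # roughly $10.70
print("Actual 2013 federal minimum wage:    $7.25")
```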

In short, I think we need to re-think the way we pay people and scale back the notion that only “skilled” or technical work deserves a decent, living wage. The fact is, most well-paying professions have been replaced, rendered redundant, or simply aren’t in high enough demand relative to the number of people who need steady work. If companies have the resources to pay people better — and indeed most of the lowest-paying employers are very profitable — they should pay well enough to ensure a decent standard of living for their employees. After all, we can’t sustain an economy without a large market of consumers, who in turn can’t consume if they’re increasingly taking on low-paying work.

I’ll close by noting that this chart comes with plenty of caveats, as many of the commenters on the piece point out. There are claims that the numbers attributed to certain professions don’t square with other data, that the mean annual wage for some jobs has been miscalculated, and so on. I honestly don’t have the time to assess the validity of those criticisms, so I leave it to your best judgment to determine the matter. Of course, always feel free to share your thoughts.

Hat tip to my friend Michael for first sharing this piece.

Twenty-One Children and Their Bedrooms From Around the World

PolicyMic is featuring the engaging work of James Mollison, a Kenyan-born English photographer based in Venice whose 2011 photo book, Where Children Sleep, collects photos of various children and their sleeping quarters. The project was meant to draw attention to each child’s “material and cultural circumstances” and to put into perspective the class, poverty, and diversity of children worldwide.

I strongly suggest you check it out here; it’s well worth your time. Some of these images are quite powerful, highlighting the vast discrepancies in standard of living between (and within) countries around the world. Many of the subjects have a lot of personality and character as well (which is no doubt why they were chosen).

The Problem With Being “Paid What You’re Worth”

In American culture, there is a widespread tendency to judge someone’s social value by how they earn a living and what they get paid. Setting aside the fact that one’s moral worth has nothing to do with the sort of job one ends up with, how much money you make isn’t really reflective of your personal or professional value either.

After all, some of the lowest-paying jobs in the U.S. include such necessary work as custodial services, caregiving, social work, nursing home assistance, and more. Even the fast-food and retail work that is so often looked down upon is valuable, insofar as we rely on these workers for the goods and services we demand (heck, these aren’t among the fastest-growing sectors for nothing).

Former U.S. Secretary of Labor Robert Reich highlights the fallacy of this mentality in his recent piece in Salon, noting just how arbitrary corporate pay structures can be:

Fifty years ago, when General Motors was the largest employer in America, the typical GM worker got paid $35 an hour in today’s dollars. Today, America’s largest employer is Walmart, and the typical Walmart worker earns $8.80 an hour.

Does this mean the typical GM employee a half-century ago was worth four times what today’s typical Walmart employee is worth? Not at all. Yes, that GM worker helped produce cars rather than retail sales. But he wasn’t much better educated or even that much more productive. He often hadn’t graduated from high school. And he worked on a slow-moving assembly line. Today’s Walmart worker is surrounded by digital gadgets — mobile inventory controls, instant checkout devices, retail search engines — making him or her quite productive.

The real difference is the GM worker a half-century ago had a strong union behind him that summoned the collective bargaining power of all autoworkers to get a substantial share of company revenues for its members. And because more than a third of workers across America belonged to a labor union, the bargains those unions struck with employers raised the wages and benefits of non-unionized workers as well. Non-union firms knew they’d be unionized if they didn’t come close to matching the union contracts.

Today’s Walmart workers don’t have a union to negotiate a better deal. They’re on their own. And because fewer than 7 percent of today’s private-sector workers are unionized, non-union employers across America don’t have to match union contracts. This puts unionized firms at a competitive disadvantage. The result has been a race to the bottom.

Juxtapose this fact with the following one:

If you still believe people are paid what they’re worth, take a look at Wall Street bonuses. Last year’s average bonus was up 15 percent over the year before, to more than $164,000. It was the largest average Wall Street bonus since the 2008 financial crisis and the third highest on record, according to New York’s state comptroller. Remember, we’re talking bonuses, above and beyond salaries.

All told, the Street paid out a whopping $26.7 billion in bonuses last year.

According to the Institute for Policy Studies, the $26.7 billion of bonuses Wall Street banks paid out last year would be enough to more than double the pay of every one of America’s 1,085,000 full-time minimum wage workers.
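
As a quick sanity check on that last claim, here’s a back-of-the-envelope sketch; the $7.25 federal minimum wage and the 2,080-hour work year are my own assumptions, not figures from the article:

```python
# Rough check: could the Wall Street bonus pool really more than double the pay
# of every full-time federal-minimum-wage worker?
bonus_pool = 26.7e9           # total Wall Street bonuses, dollars (from the quote)
min_wage_workers = 1_085_000  # full-time minimum-wage workers (from the quote)
hourly_min_wage = 7.25        # assumed federal minimum wage, dollars/hour
hours_per_year = 40 * 52      # assumed full-time, year-round schedule

annual_pay = hourly_min_wage * hours_per_year  # about $15,080 per worker
total_payroll = annual_pay * min_wage_workers  # about $16.4 billion

print(f"Combined annual pay of those workers: ${total_payroll / 1e9:.1f} billion")
print(f"Wall Street bonus pool:               ${bonus_pool / 1e9:.1f} billion")
# A ratio above 1.0 means the bonus pool alone could more than double their pay.
print(f"Bonus pool / payroll:                 {bonus_pool / total_payroll:.2f}")
```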

Does this disparity in compensation truly reflect the worth of these positions and those who work them? Or is it simply the result of power imbalances between employers and employees, as the latter lack the leverage to negotiate better salaries and treatment due to declining unionization and mass unemployment (which drives down labor costs)? Aren’t the subsidies and tax breaks that go to corporations and big financial firms — in conjunction with the erosion of worker protections and the minimum wage — also reflected in the way jobs are paid?

In short, there are many systemic, institutional, and political reasons why people make the amount they do. It has little to nothing to do with their individual or collective worth, and everything to do with the vagaries of our economic and political system, which has become increasingly unequal and unfair in terms of how compensation is allocated (e.g. executives and shareholders receiving massive bonuses during profitable years while average workers receive nothing or even endure cuts).

But what do you think?

Over Half of All U.S. Tax Subsidies Go to Just Four Industries

Think Progress has posted about a recent report by Citizens for Tax Justice (available here in PDF) that examines half of the Fortune 500 companies based in the U.S. Unsurprisingly, it found that many industries — including the richest and most profitable — receive the biggest subsidies, led by financial and energy companies.

In fact, a whopping 56 percent of total tax subsidies went to just four industries: financial, utilities, telecommunications, and resource extraction (oil, gas, and pipelines). You can get a more detailed picture from the following chart (which shows all the private-sector interests that benefit from tax subsidies alongside their average effective tax rates).

While ultimately not surprising, the details are nonetheless upsetting, especially considering that most of these industries are far from troubled enough to warrant any taxpayer support (be it in terms of tax relief or direct financial transfers).

Five Accurate Predictions By Karl Marx

Since the recession, and especially over the last couple of years, there has been a flurry of articles discussing (and, sure enough, reflecting) the growing interest in Marx and his theories. Most of them either explore whether Marxist ideology is still relevant or discuss the increasing sympathy he’s garnering among people across the world.

Regardless of what you think of Marx or his ideas, it’s pretty interesting to see this once widely dismissed and highly controversial (at least in the U.S.) figure gain so much coverage, even from the likes of free-market or pro-capitalist publications. By the same token, a lot of once-fringe libertarian economists and thinkers are seeing their stars rise as well, in conjunction with the growth of libertarian movements in the U.S. (and, to a lesser degree, in other parts of the world).

Needless to say, desperate times have called for desperate measures, so to speak, as more and more people show a willingness to explore alternatives once seen as unnecessary or unthinkable. To a large extent, this is typical in any period of crisis, as people are awakened from the once stable status quo that made them complacent and begin to ponder whether a better way is possible.

But that’s a discussion for another day. For now, consider this contentious post in Rolling Stone, which shares five of Marx’s observations that remain as relevant today as they were in his time (the industrializing 19th century), if not more so.

1. The Great Recession (Capitalism’s Chaotic Nature)

The inherently chaotic, crisis-prone nature of capitalism was a key part of Marx’s writings. He argued that the relentless drive for profits would lead companies to mechanize their workplaces, producing more and more goods while squeezing workers’ wages until they could no longer purchase the products they created. Sure enough, modern historical events from the Great Depression to the dot-com bubble can be traced back to what Marx termed “fictitious capital” – financial instruments like stocks and credit-default swaps. We produce and produce until there is simply no one left to purchase our goods, no new markets, no new debts. The cycle is still playing out before our eyes: Broadly speaking, it’s what made the housing market crash in 2008. Decades of deepening inequality reduced incomes, which led more and more Americans to take on debt. When there were no subprime borrowers left to scheme, the whole façade fell apart, just as Marx knew it would.

2. The iPhone 5S (Imaginary Appetites)

Marx warned that capitalism’s tendency to concentrate high value on essentially arbitrary products would, over time, lead to what he called “a contriving and ever-calculating subservience to inhuman, sophisticated, unnatural and imaginary appetites.” It’s a harsh but accurate way of describing contemporary America, where we enjoy incredible luxury and yet are driven by a constant need for more and more stuff to buy. Consider the iPhone 5S you may own. Is it really that much better than the iPhone 5 you had last year, or the iPhone 4S a year before that? Is it a real need, or an invented one? While Chinese families fall sick with cancer from our e-waste, megacorporations are creating entire advertising campaigns around the idea that we should destroy perfectly good products for no reason. If Marx could see this kind of thing, he’d nod in recognition.

3. The IMF (The Globalization of Capitalism)

Marx’s ideas about overproduction led him to predict what is now called globalization – the spread of capitalism across the planet in search of new markets. “The need of a constantly expanding market for its products chases the bourgeoisie over the entire surface of the globe,” he wrote. “It must nestle everywhere, settle everywhere, establish connections everywhere.” While this may seem like an obvious point now, Marx wrote those words in 1848, when globalization was over a century away. And he wasn’t just right about what ended up happening in the late 20th century – he was right about why it happened: The relentless search for new markets and cheap labor, as well as the incessant demand for more natural resources, are beasts that demand constant feeding.

4. Walmart (Monopoly)

The classical theory of economics assumed that competition was natural and therefore self-sustaining. Marx, however, argued that market power would actually be centralized in large monopoly firms as businesses increasingly preyed upon each other. This might have struck his 19th-century readers as odd: As Richard Hofstadter writes, “Americans came to take it for granted that property would be widely diffused, that economic and political power would be decentralized.” It was only later, in the 20th century, that the trend Marx foresaw began to accelerate. Today, mom-and-pop shops have been replaced by monolithic big-box stores like Walmart, small community banks have been replaced by global banks like J.P. Morgan Chase and small farmers have been replaced by the likes of Archer Daniels Midland. The tech world, too, is already becoming centralized, with big corporations sucking up start-ups as fast as they can. Politicians give lip service to what minimal small-business lobby remains and prosecute the most violent of antitrust abuses – but for the most part, we know big business is here to stay.

5. Low Wages, Big Profits (The Reserve Army of Industrial Labor)

Marx believed that wages would be held down by a “reserve army of labor,” which he explained simply using classical economic techniques: Capitalists wish to pay as little as possible for labor, and this is easiest to do when there are too many workers floating around. Thus, after a recession, using a Marxist analysis, we would predict that high unemployment would keep wages stagnant as profits soared, because workers are too scared of unemployment to quit their terrible, exploitative jobs. And what do you know? No less an authority than the Wall Street Journal warns, “Lately, the U.S. recovery has been displaying some Marxian traits. Corporate profits are on a tear, and rising productivity has allowed companies to grow without doing much to reduce the vast ranks of the unemployed.” That’s because workers are terrified to leave their jobs and therefore lack bargaining power. It’s no surprise that the best time for equitable growth is during times of “full employment,” when unemployment is low and workers can threaten to take another job.

As always, I ask: what are your thoughts?

The Best Way To Solve Poverty: Just Give the Poor Money

It seems deceptively obvious, doesn’t it? Poverty is the absence of wealth, so the solution is simply to give the poor money. The problem is that, in addition to the misery that comes with scarcity, the poor suffer the added stigma of victim-blaming: their economic state is widely seen (especially among Americans) as a personal failing, a product of laziness, irresponsibility, or stupidity.

But if one accepts the fact that poor people are no more or less likely to be savvy with money than the rich, then it simply becomes a matter of boosting their material conditions, albeit in a far less paternalistic and bureaucratic fashion than is typically prescribed. Indeed, traditional approaches to welfare are no more effective than the Right’s preferred alternative of leaving the poor to a freer market (or spurring them into action by cutting benefits). As Bloomberg Businessweek — hardly a leftist source — reports:

A growing number of studies suggest…that just handing over cash even to some of the world’s poorest people actually does have a considerable and long-lasting positive impact on their incomes, employment, health, and education. And that suggests we should update both our attitudes about poor people and our poverty reduction programs.

In 2008, the Ugandan government handed out cash transfers worth $382, about a year’s income, to thousands of poor 16- to 35-year-olds. The money came with few strings—recipients only had to explain how they would use the money to start a trade. Columbia University’s Chris Blattman and his co-authors found that, four years after receiving the cash, recipients were two-thirds more likely to be practicing a trade than non-recipients, and their earnings were more than 40 percent higher. They were also about 40 percent more likely to be paying taxes.

In a second study, Blattman and colleagues looked at a program that gave $150 cash grants to 1,800 of the very poorest women in northern Uganda. Most began some sort of retail operation to supplement their income, and within a year their monthly earnings had doubled and cash savings tripled. The impact was pretty much the same whether or not participants received mentoring; business training added some value, but handing over the money it cost to provide would have added more.

Findings from around the world suggest that giving cash over goods or in-kind transfers is cheaper and more cost-effective, too. Economist Jenny Aker has found that cash transfers are better used than food vouchers in a comparison in the Democratic Republic of the Congo. Unsurprisingly, giving people a food voucher means they purchase more food than they do if you give them cash. But give them cash and they are able to save some of the money and pay school fees, all while consuming as diverse a diet as those who got vouchers. And the cash-transfer program is considerably less expensive to run.

Keep in mind that these are societies where poverty is widespread and endemic, and yet still most recipients knew how to use their money effectively. A presumed “culture of poverty” did nothing to undermine their ability to be self-sufficient when given the opportunity. This success isn’t limited to the developing world either:

Back in the 1970s, the U.S. federal government experimented with a “negative income tax” that guaranteed an income to thousands of randomly selected low-income recipients. (Think of today’s Earned Income Tax Credit, only without the requirement to earn income.) The results suggested that the transfers improved test scores and school attendance for the children of recipients, reduced prevalence of low-birth-weight kids, and increased homeownership. Early analysis of a 2007 cash transfer program in New York City suggested that transfers averaging $6,000 per family conditional on employment, preventative health care, and children’s educational attendance led to reduced poverty and hunger, improved school attendance and grade advancement, reduced health-care hardships, and increased savings.
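
For readers unfamiliar with how a negative income tax actually works, here’s a minimal sketch of the mechanism; the guarantee level and phase-out rate are purely illustrative, not the parameters used in the 1970s experiments:

```python
def negative_income_tax(earned_income, guarantee=10_000, phase_out_rate=0.5):
    """Benefit paid under a simple negative income tax.

    Everyone is guaranteed an income floor; the benefit shrinks by
    phase_out_rate dollars for every dollar earned, reaching zero once
    earnings hit guarantee / phase_out_rate. (Illustrative numbers only.)
    """
    return max(0.0, guarantee - phase_out_rate * earned_income)

for income in (0, 5_000, 10_000, 20_000, 30_000):
    benefit = negative_income_tax(income)
    print(f"earned ${income:>6,} -> benefit ${benefit:>8,.0f} -> total ${income + benefit:>9,.0f}")
```

Unlike the EITC, the benefit here is largest at zero earnings, which is exactly the “without the requirement to earn income” distinction the article draws.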

Canada has also experimented with unconditional cash transfers, with similar success. It seems that no matter the culture or society, most individuals will use whatever resources they have at their disposal as effectively as possible (or at least make the attempt, which itself undermines the usual assumptions about the competence of the poor).

Most cash-transfer programs do impose conditions—like requiring kids to go to school or get vaccinated, which does improve school attendance and vaccination rates considerably. But Blattman’s research suggests conditions aren’t necessary to improve the quality of life of poor families. In fact, while analysis by the World Bank’s Berk Ozler shows that making cash transfers conditional on kids being in school has a bigger impact than a no-strings-attached check, even “condition-less cash” considerably raises enrollment. Conditional programs increase the odds of a child being in school by 41 percent; unconditional programs, 23 percent. Other studies of cash transfers in developing countries have found a range of impacts that had little or nothing to do with any conditions applied: lower crime rates, improved child nutrition and child health, lower child mortality, improved odds of kids being in school, and declines in early marriage and teenage pregnancy.

So even the fairly successful conditional cash transfers implemented in places like Brazil and Mexico are, in a sense, unnecessary. While they’re definitely great steps, ultimately most poor people don’t need to be told the best way to spend their money. Indeed, as the article concludes:

It is comfortable for richer people to think they are richer because of the moral failings of the poor. And that justifies a paternalistic approach to poverty relief using vouchers and in-kind support. But the big reason poor people are poor is because they don’t have enough money, and it shouldn’t come as a huge surprise that giving them money is a great way to reduce that problem—considerably more cost-effectively than paternalism. So let’s abandon the huge welfare bureaucracy and just give money to those we should help out.

Of course some will inevitably squander it out of greed, negligence, or simple error — and again, no more often than many wealthier people do — but by and large, the majority will put it to good, sustainable use. They’ll put it into the economy, which is driven by consumer demand, or into small businesses and education, which also benefit the economy. In essence, such cash transfers are an investment.

Obviously, such an approach won’t resolve the systemic factors responsible for poverty — the lack of well-paying jobs, an economy driven too much by short-term profit and consumerism, the rising cost of education and healthcare, and so on — but it’s a fairly simple start, and the money currently wasted on inefficient programs (among other things) might be better spent going straight into the hands of poor people just waiting to tap into their own potential.

What do you think?

Inequality Isn’t Just a Moral Problem, But a Practical One

Whenever the topic of socioeconomic inequality is brought up, the emphasis is usually placed upon its moral and humanitarian consequences. While this is certainly a valid approach — after all, there’s a lot of human suffering involved — there is another factor to consider: inequality is bad for business, the economy, and the nation’s long-term political stability (one would hope that this would also win over those for whom poverty isn’t a moral problem, but many such individuals typically either don’t recognize inequality as a growing problem, or reject that it has such consequences).

An article by Daniel Altman in Foreign Policy makes the argument fairly succinctly, noting that the practical costs of inequality could actually be just as bad as the social ones, if not worse:

When any market has a shortage, not everyone gets the things they want. But who does get them also matters, because it’s not always the people who value those things the most.

Economists Edward Glaeser and Erzo Luttmer made this point in a 2003 paper about rent control. “The standard analysis of price controls assumes that goods are efficiently allocated, even when there are shortages,” they wrote. “But if shortages mean that goods are randomly allocated across the consumers that want them, the welfare costs from misallocation may be greater than the undersupply costs.” In other words, letting the wrong people buy the scarce goods can be even worse for society than the scarcity itself.

This problem, which economists call inefficient allocation, is present in the market for opportunities as well. It’s best for the economy when the person best able to exploit an opportunity is the one who gets it. By giving opportunities to these people, we make the economic pie as big as possible. But sometimes, the allocation of opportunity is not determined solely by effort or ability.
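
To make the “inefficient allocation” point a bit more concrete, here’s a toy simulation; every number in it (the ability and wealth distributions, their weak correlation, the count of scarce slots) is invented purely for illustration. It compares the total output produced when a fixed number of opportunities goes to the wealthiest people versus the most able ones:

```python
import random

random.seed(42)

N_PEOPLE = 100_000
N_SLOTS = 1_000  # scarce opportunities: elite university seats, offices, and so on

# Each person has an ability score (the output they'd produce if given a slot)
# and a wealth level that is only loosely correlated with that ability.
people = []
for _ in range(N_PEOPLE):
    ability = random.gauss(100, 15)
    wealth = 0.3 * ability + random.gauss(100, 30)  # weak wealth-ability link (made up)
    people.append((ability, wealth))

top_by_ability = sorted(people, key=lambda p: p[0], reverse=True)[:N_SLOTS]
top_by_wealth = sorted(people, key=lambda p: p[1], reverse=True)[:N_SLOTS]

output_merit = sum(a for a, _ in top_by_ability)
output_wealth = sum(a for a, _ in top_by_wealth)

print(f"Output when slots go to the most able:  {output_merit:,.0f}")
print(f"Output when slots go to the wealthiest: {output_wealth:,.0f}")
print(f"Share of potential output lost:         {1 - output_wealth / output_merit:.1%}")
```

The size of the loss depends entirely on the made-up correlation between wealth and ability; the point is simply that any wedge between the two shows up as forgone output, which is the misallocation cost the article describes.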

For example, consider all of the would-be innovators, thinkers, and other social contributors who are precluded from realizing their potential — and benefiting society further — due to a lack of resources. Conversely, what happens when unqualified or immoral people are allowed to amass disproportionate amounts of wealth and resources, and thus gain all the outsized political and economic influence that goes with them? As the article goes on to note, unless you can meet the ever-higher financial demands needed to access the avenues of influence and personal development, you won’t get far in America:

To a great degree, access to opportunity in the United States depends on wealth. Discrimination based on race, religion, gender, and sexual orientation may be on the wane in many countries, but discrimination based on wealth is still a powerful force. It opens doors, especially for people who may not boast the strongest talents or work ethic.

Country club memberships, charity dinners, and other platforms for economic networking come with high price tags decided by existing elites. Their exclusion of a whole swath of society because of something other than human potential automatically creates scope for inefficient allocation. But it’s not always people who do the discriminating; sometimes it’s just the system.

For instance, consider elected office. It’s a tremendous opportunity, both for the implementation of a person’s ideas and, sad to say, for financial enrichment as well. Yet running for office takes money — lots of it — and there are no restrictions on how much a candidate may spend. As a result, the people who win have tended to be very wealthy.

Of course, political life isn’t the only economic opportunity with a limited number of spots. In the United States, places at top universities are so scant that many accept fewer than 10 percent of applicants. Even with need-blind admissions, kids from wealthy backgrounds have huge advantages; they apply having received better schooling, tutoring if they needed it, enrichment through travel, and good nutrition and healthcare throughout their youth.

The fact that money affects access to these opportunities, even in part, implies some seats in Congress and Ivy League lecture halls would have been used more productively by poorer people of greater gifts. These two cases are particularly important, because they imply that fighting poverty alone is not enough to correct inefficient allocations. With a limited number of places at stake, what matters is relative wealth, or who can outspend whom. And when inequality rises, this gap grows.

I would hope it goes without saying that it’s problematic when a majority of society’s policymakers, public officials, academics, corporate executives, and other influential figures come from the same small (and narrowing) economic class. Diversity of experience and background is valuable in informing how society should be run. If whole groups are locked out of the avenues of power for failing to meet some arbitrary requirement (in this case, money and the connections it brings), it bodes ill for our ability to solve pressing problems.

So here comes the tricky and controversial part: how do we solve this problem?

If you believe that poor people are poor because they are stupid or lazy — and that their children probably will be as well — then the issue of inefficient allocation disappears. But if you think that a smart and hardworking child could be born into a poor household, then inefficient allocation is a serious problem. Solving it would enhance economic growth and boost the value of American assets.

There are two options. The first is to remove wealth from every process that doles out economic opportunities: take money out of politics, give all children equal schooling and college prep, base country club admissions on anonymous interviews, etc. This will be difficult, especially since our society is moving rapidly in the other direction. Election campaigns are flooded with more money than ever, and the net price of a college education — after loans and grants — has jumped even as increases in list prices have finally slowed. Poor kids who do make it to college will have to spend more time scrubbing toilets and dinner trays and less time studying.

The other option is to reduce inequality of wealth. Giving poor children a leg up through early childhood education or other interventions might help, but it would also take decades. So would deepening the progressivity of the income tax, which only affects flows of new wealth and not existing stocks. In the meantime, a huge amount of economic activity might be lost to inefficient allocation.

The easiest way to redistribute wealth continues to be the estate tax, yet it is politically unpopular and applies to only about 10,000 households a year. All of this might change, however, as more research estimates the harm caused by inequality through the inefficient allocation of opportunities.

This kind of research is not always straightforward, since it measures things that didn’t happen as well as those that did. Nevertheless, some economists have already shown how value can be destroyed through inheritance and cronyism among the wealthy. Scaled up to the entire economy, the numbers are on the order of billions of dollars.

These costs are not unique to the United States. Even as globalization has reduced inequality between countries, it has often increased inequality within them; the rich are better able to capitalize on its opportunities. Where nepotism and privilege are prevalent, the costs are amplified.

Needless to say, this is a complicated issue, and addressing it will take a lot more than a few changes to regulatory and tax policy. What do you think of the problem, or of the solutions posited?