
Why we shouldn’t judge a country by its GDP

Eupraxsophy:

There is more to human progress and well-being than the sum value of a country’s goods and services. Instead of aiming for constant growth of Gross Domestic Product (GDP), governments, civil society groups, and economists should look to other, more meaningful metrics, as outlined by the Social Progress Imperative.

Originally posted on ideas.ted.com:

Gross Domestic Product has become the yardstick by which we measure a country’s success. But, says Michael Green, GDP isn’t the best way to measure a good society. His alternative? The Social Progress Index, which measures things like basic human needs and opportunity.

Analysts, reporters and big thinkers love to talk about Gross Domestic Product. Put simply, GDP, which tallies the value of all the goods and services produced by a country each year, has become the yardstick by which we measure a country’s success. But there’s a big, elephant-like problem with that: GDP only accounts for a country’s economic performance, not the happiness or well-being of its citizens. With GDP, if your richest 100 people get richer, your GDP rises … but most of your citizens are just as badly off as they were before.

That’s one of the reasons the team that I lead at the Social Progress…


What Scandinavia Can Teach Us About Taxes

While I am on something of a tax kick (and apparently a Vox.com one), the website has another article that questions the very notion that average citizens should have to worry about these details to begin with.

The IRS knows what you make. It knows if you typically take the standard deduction. For a lot of Americans, the IRS could just fill out their taxes for them. It would save billions of dollars in tax preparation fees and hundreds of millions of hours spent filling out tax forms.

This isn’t some wild idea: it was piloted in California, where citizens loved it — 97 percent of those who used it said they would do so again. It’s how taxes work in Denmark, Sweden, and Spain. “No other industrialized country asks its citizens to jump through as many hoops to calculate their taxes as ours,” writes Farhad Manjoo at the New York Times.

This idea has considerable traction across the political spectrum, though it faces powerful opposition:

Intuit, the maker of TurboTax, is a particularly powerful opponent. Such a system “minimizes the taxpayers’ voice and control over the tax process by reducing their role in filing their taxes and getting their own money back,” David Williams, the company’s chief tax officer, told the Times.

But that excuse doesn’t hold much water. Under these automatic systems, no one has to let the IRS fill out their taxes for them. They can continue to do them by hand or with TurboTax, or hire an accountant. Intuit knows, however, that far fewer Americans would do their own taxes under this scenario, and that would be a big hit to Intuit’s bottom line.

Some anti-tax conservatives also hate the idea of the IRS filling out sample returns. Grover Norquist, president of Americans for Tax Reform, warns, “Conservatives, in particular, should see this ploy for what it clearly is: a money-grab by the government.” The easier and more efficient the tax system is, the more money it will raise, and the less public anger there will be for anti-tax conservatives to harness.

For much more on this subject, ProPublica’s investigation of Intuit’s lobbying against automatic tax filing is the best look at why a policy with so much bipartisan support can’t seem to pass Congress, and the Sunlight Foundation has even more lobbying numbers here. Wonks will want to spend some time with economist Austan Goolsbee’s white paper on how automatic filing could work in practice. And you can read Intuit’s case against California’s Ready Return system here.

It seems there is no issue in American politics, however broadly supported or commonsensical, that does not face well-moneyed and powerful lobbyists. I wonder whether the Scandinavian nations had to deal with this?

Nine Charts That Explain Taxes In America

While taxes remain on many Americans’ minds, click here to view nine charts from Vox.com that explain the vagaries and little-known facts about the U.S. tax system.

For an even more extensive guide to taxes — such as how federal income taxes work and why payroll taxes differ from taxes on investment — click here.

Some Consolation This Tax Season

Well, that depends on your point of view. Tax season has come and gone in the United States, but if you are still feeling the sting and contempt wrought by taxes — as so many Americans do year-round — perhaps you will feel more fortunate after seeing recent findings from the Pew Research Center, which show that U.S. citizens are among the least taxed in the developed world.

Here is a quick analysis from the Washington Post:

The graph above shows where Americans rank in terms of average income taxes and mandatory social insurance contributions as a percentage of gross income. It compares the 34 countries in the Organization for Economic Cooperation and Development, excluding Mexico and adding in Bulgaria, Croatia, Latvia, Lithuania, Malta and Romania.

The U.S. consistently ranks toward the bottom of the group, indicating that Americans spend a smaller portion of their income on taxes than people in many advanced countries.

Only South Korea and Chile have a lower tax rate than the U.S. (Mexico, if included, would be dead-last in tax rates).

Here is the more comprehensive assessment of the data by Pew itself, which notes some caveats as well:

Much of the difference in relative tax burdens among different countries is due to the taxes that fund social-insurance programs, such as Social Security and Medicare in the U.S. These taxes tend to be higher in other developed nations than they are in the U.S. Take that married couple referred to above: In 20 of the 39 countries studied, they paid more in social-insurance taxes than in income taxes. The U.S. had the 11th-lowest social-insurance tax rate for such couples among the countries we examined.

Like pretty much anything about taxes, there are caveats with the OECD data. The biggest caveat, of course, is that our comparisons don’t take into account what citizens receive from their governments in either direct or indirect benefits as a result of these different tax structures. We’re only looking at what citizens pay into the system – and even then, just a portion.

For instance, these figures don’t include taxes paid at the state, provincial or local level (such as sales and property taxes in the U.S.), nor do they include other national taxes, such as gasoline and cigarette taxes in the U.S. or value-added taxes in dozens of other countries. And they include only the individual portion of social-insurance taxes, not anything paid by employers. (In the U.S., for instance, employers and workers both pay Social Security and Medicare taxes.)

Granted, all this is consoling only if you ignore that, however lightly taxed Americans are by global standards, the system is still widely perceived as unfair and unequal.

None of this is likely to shift Americans’ opinions about the fairness, or lack thereof, of their own tax system. In the Pew Research Center report, for instance, some six-in-ten Americans said they were bothered a lot by the feeling that “some wealthy people” and “some corporations” don’t pay their fair share. And in a prior Fact Tank post we discussed the data behind the U.S.’s progressive income tax system: A small number of high earners pay the most income tax. According to IRS data, taxpayers with $250,000 or more in adjusted gross income (AGI) accounted for 2.4% of all individual tax returns, 25.9% of total AGI, 32.2% of total taxable income, and 48.9% of total individual tax receipts.
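To make the quoted IRS figures concrete, here is a minimal sketch (the percentages come straight from the quote above; the variable names and the ratio itself are my own illustration) of how concentrated income tax receipts are relative to income:

```python
# IRS figures quoted above for filers with $250,000+ in adjusted
# gross income (AGI), expressed as fractions of the national totals.
share_of_returns = 0.024   # 2.4% of all individual tax returns
share_of_agi = 0.259       # 25.9% of total AGI
share_of_receipts = 0.489  # 48.9% of total individual tax receipts

# One rough indicator of progressivity: the group's share of tax paid
# divided by its share of income. A value above 1 means the group pays
# a larger share of the tax than of the income.
concentration = share_of_receipts / share_of_agi
print(f"tax share / income share: {concentration:.2f}")  # about 1.89
```

By this crude measure, the top bracket pays nearly twice its income share in taxes, which is what “progressive” means for the federal income tax taken in isolation; state and local taxes, as noted below, pull the overall system the other way.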

Indeed, this perception is grounded in reality: once one factors in state and local taxes, the system overall is quite regressive (that is, it disproportionately impacts the poor and middle class).

Needless to say, this is hardly comforting in a country facing high inequality, growing poverty, soaring tuition and healthcare costs, and numerous other socioeconomic ills resulting, in part, from a lack of public investment in such services.

How Machines Will Conquer The Economy

From Zeynep Tufekci over at the New York Times:

But computers do not just replace humans in the workplace. They shift the balance of power even more in favor of employers. Our normal response to technological innovation that threatens jobs is to encourage workers to acquire more skills, or to trust that the nuances of the human mind or human attention will always be superior in crucial ways. But when machines of this capacity enter the equation, employers have even more leverage, and our standard response is not sufficient for the looming crisis.

Machines aren’t used because they perform some tasks that much better than humans, but because, in many cases, they do a “good enough” job while also being cheaper, more predictable and easier to control than quirky, pesky humans. Technology in the workplace is as much about power and control as it is about productivity and efficiency.

This is the way technology is being used in many workplaces: to reduce the power of humans, and employers’ dependency on them, whether by replacing, displacing or surveilling them. Many technological developments contribute to this shift in power: advanced diagnostic systems that can do medical or legal analysis; the ability to outsource labor to the lowest-paid workers, measure employee tasks to the minute and “optimize” worker schedules in a way that devastates ordinary lives. Indeed, regardless of whether unemployment has gone up or down, real wages have been stagnant or declining in the United States for decades. Most people no longer have the leverage to bargain.

I can think of no better justification for implementing a guaranteed basic income than this trend. How much longer until we run out of sustainable employment to support our population? Already, in the United States and elsewhere, the fastest-growing sectors are low-paying service jobs like fast food and retail; even the professions that should ostensibly pay well, such as those requiring degrees or experience, increasingly do not.

Most people are already running out of alternatives for liveable, meaningful work — and now mechanization and automation threaten to undermine what comparatively little remains. I think this says a lot more about the social, economic, and moral failings of our society than it does about technology.

Why should everything be hyper-efficient at the expense of workers — who are also consumers and thus drivers of the economy? Why should we have a business culture, or indeed an economic and social structure, whereby those at the top must ruthlessly undercut the leverage and well-being of everyone else, whom they nonetheless depend on? If we want to optimize production and cost-effectiveness, which are of course not bad aims, then why not do so while providing some alternative means of survival for those who get displaced?

How we respond to this trend will speak volumes about our values, priorities, and moral grounding.

A Vivid Visualization of Inequality in America

The rise of wealth and income inequality is a (thankfully) widespread topic in media and public discourse, so by now most readers will no doubt be familiar with the various charts, videos, and graphs that translate it for our viewing pleasure.

But the Washington Post, citing an NPR column, presents an even more dramatic approach to showing the growth of inequality in the United States:

Source: Quoctrung Bui/NPR

Columnist Matt O’Brien breaks down what the data mean and the context of this sobering development:

It compares how much, in inflation-adjusted 2012 dollars, average households in the bottom 90 and top 1 percent have made each year. Now, it’s hard to tell because everyone was making less back then, but inequality really was high during the 1920s. The bottom 90 didn’t make much progress then, while the top 1 rode the, well, roaring stock market to even higher highs. All that was erased, though, during the Great Depression. The top 1 got wiped out when stocks fell almost 90 percent, and the bottom 90 did too when unemployment shot up to 25 percent. It was a bad time to be rich or poor, but mostly poor.

But the New Deal set the stage for a new society. FDR made it easier for workers to unionize, and started taxing the rich at confiscatory levels. It didn’t hurt that first the war and later the baby boom put everyone back to work. The result, as you can see above, was the creation of the American middle class. Between 1940 and 1970, the bottom 90 percent went from making, on average, $12,000 to $33,000. The top 1 percent, meanwhile, were stuck making “only” $300,000 this whole time. It’s what economists call the “Great Compression,” and it was a story about workers having the bargaining power to ask for higher wages and the rich not having much reason to ask for higher wages themselves. That’s because top marginal tax rates were so high—at their peak, 94 percent—that it wasn’t worth it for CEOs to pay themselves that much more. Besides, that was just something executives didn’t do back then. George Romney, for example, turned down a $100,000 bonus in 1960—and those are unadjusted dollars—because he didn’t think anyone needed to make that much more.

This didn’t last. It all started to unravel in the 1970s. Inflation ate up everyone’s pay, so that incomes for the top 1 and bottom 90 percent both stagnated. But it wasn’t just a monetary problem. It was an educational one, too. Starting in the 1930s, America had led the way with universal high school, but by the 1970s this progress had petered out. Making matters worse was that the rest of the world was already catching up—especially Germany and Japan—and forcing our workers to compete against theirs.

Ronald Reagan’s answer to all this was to cut taxes for the rich and deregulate the economy. The idea was to give the top 1 percent the freedom and incentive to work more and invest more, which was supposed to make the economy grow more—and, yes, trickle down to everybody else. It didn’t. Now part of that was because U.S. workers had to compete against even more low-wage workers overseas after the Berlin Wall came down and billions of people joined the global economy. Another was that new technologies like the internet helped the people at the top more than those at the bottom by creating winner-take-all markets. But a big part of it, like we said, was policy. Wall Street, in particular, went from being a relatively sleepy sector to a wheeling-and-dealing one where a couple of good bonuses could make you set for life. Indeed, more than 60 percent of the increasing share of income going to the top 1 percent came from CEOs and financiers who make most of their money in the markets.

It turns out, though, that even if a rising tide lifts all boats, most people can’t afford a boat. The bottom 90 percent, in other words, haven’t done much better the last 30 years, even as the top 1 percent have created a second Gilded Age. The only exception was the late 1990s—highlighted in yellow—when a tight labor market gave workers the bargaining power that unions used to. But other than that, it’s been a tale of two economies. There’s the financial one, where the top 1 percent have tied their fortunes to the booming stock market, and the real one, where everyone else is struggling not to fall behind. Now it’s true that the picture isn’t as bleak if you account for the fact that, as people marry later and have fewer kids, households aren’t as big as they used to be. And it’s also true that government benefits from Social Security to Medicare to food stamps and unemployment insurance help out the bottom 90 percent too. But it’s also true that even with these caveats, a growing economy hasn’t really translated into growing incomes for median households the last 15 years.

The change in fortunes between the bulk of society and a relative handful of families could not be more stark.

U.S. Leads Developed World in Child Poverty

Over the past six years, America’s wealth expanded by over $30 trillion — a growth rate of 60 percent — despite the weak recovery. During the same span of time, another metric grew by a similar percentage: the number of homeless and food-insecure children.

As Raw Story reports, despite its vast and ever-growing wealth, the world’s richest country by a considerable margin lags behind most other developed nations in measurements of child poverty.

America is a ‘Leader’ in Child Poverty

The U.S. has one of the highest relative child poverty rates in the developed world. As UNICEF reports, “[Children’s] material well-being is highest in the Netherlands and in the four Nordic countries and lowest in Latvia, Lithuania, Romania and the United States.”

Over half of public school students are poor enough to qualify for lunch subsidies, and almost half of black children under the age of six are living in poverty.

$5 a Day for Food, but Congress Thought It Was Too Much

Nearly half of all food stamp recipients are children, and they averaged about $5 a day for their meals before the 2014 farm bill cut $8.6 billion (over the next ten years) from the food stamp program.

In 2007 about 12 of every 100 kids were on food stamps. Today it’s 20 of every 100.

For Every 2 Homeless Children in 2006, There Are Now 3

On a typical frigid night in January, 138,000 children, according to the U.S. Department of Housing and Urban Development, were without a place to call home.

That’s about the same number of households that have each increased their wealth by $10 million per year since the recession.

The U.S.: Near the Bottom in Education, and Sinking

The U.S. ranks near the bottom of the developed world in the percentage of 4-year-olds in early childhood education. Early education should be a primary goal for the future, as numerous studies have shown that pre-school helps all children to achieve more and earn more through adulthood, with the most disadvantaged benefiting the most. But we’re going in the opposite direction. Head Start was recently hit with the worst cutbacks in its history.

Children’s Rights? Not in the U.S.

It’s hard to comprehend the thinking of people who cut funding for homeless and hungry children. It may be delusion about trickle-down, it may be indifference to poverty, it may be resentment toward people unable to “make it on their own”.

The indifference and resentment and disdain for society reach around the globe. Only two nations still refuse to ratify the UN Convention on the Rights of the Child: South Sudan and the United States.

Aside from the obvious immorality of allowing so many millions of children to suffer during their most formative years, this abysmal performance in child well-being will leave a lasting legacy of social ills: poor children are increasingly likely to remain poor for the rest of their lives (especially given the declining social mobility for which the U.S. was once famous).

Five Myths About Fast Food Jobs

As one of the fastest-growing industries in the country, food service — along with other low-paying sectors like retail and hospitality — is becoming the new normal of employment.

But as the following list from the Washington Post shows, this is a troubling trend, as many Americans do not realize how little the industry has to offer its burgeoning and increasingly desperate labor force.

1. Fast-food workers are mostly teenagers working for pocket money.

Fast food was indeed an adolescent gig in the 1950s and 1960s, when the paper hat symbolized the classic short-term, entry-level job. But today, despite arguments that these low-wage jobs are largely filled by “suburban teenagers,” as the Heritage Foundation put it, labor data show that about 70 percent of the fast-food workforce is at least 20 years old. The typical burger-flipper is an independent adult of about 29, with a high school diploma. Nearly a third have some college experience, and many are single parents raising families on $9 an hour. In contrast to McDonald’s rather optimistic model budget — which assumes that an employee lives in a two-income household and doesn’t need child care or gas or groceries — a large portion of fast-food workers are forced to borrow from friends to cover basic household expenses, or sometimes fall into homelessness.

According to researchers at the University of California at Berkeley, about half of the families of front-line fast-food workers depend on public programs, compared with 25 percent of the American workforce. About 87 percent of fast-food workers lack employer health benefits, compared with 40 percent of the general workforce. And roughly one-fifth of workers’ families are below the poverty line. That adds up to some $7 billion in welfare payouts each year — essentially enabling fast-food mega-chains to subsidize ultra-low wages with public benefits.

2. Employees can work their way up and eventually even own a franchise.

Burger King’s career Web site proclaims: “You’ll never be short of opportunities to show what you’ve got. And if we like what we see, there’s no limit to how far you could go here.” The New York Restaurant Association boasts that restaurant work “creates an opportunity for people to live the American dream.” Under its franchise “success stories,” McDonald’s features a man who advanced from being a crew member to owning a franchise in just a few years.

The dream of upward mobility, however, eludes most workers. The National Employment Law Project (NELP) points out that about 90 percent of the fast-food workforce is made up of “front-line workers” such as line cooks and cashiers. About 9 percent are lower-level supervisors, who earn about $13 an hour. And just 2.2 percent of fast-food jobs are “managerial, professional, and technical occupations,” compared with 31 percent of jobs in the U.S. economy.

As for the notion of working your way up to ownership, NELP reports that 1 percent of the fast-food workforce owns a franchise — a purchase that could require $750,000 to several million dollars in financial assets. And there’s no indication that many of these franchisees actually did “rise through the ranks” to become owners, which requires an amount of capital that might top the lifetime salary of an average kitchen worker.

3. Fast-food companies can’t control franchise wages or working conditions.

McDonald’s plan to raise wages at least $1 over the local minimum wage was announced this month to much fanfare. But the raise applies only to employees of the 1,500 stores McDonald’s owns directly. The company contends that as a chain franchisor, it merely licenses its brand to individual franchise operators; is not legally liable as an employer; and thus “does not direct or co-determine the hiring, termination, wages, hours” and other working conditions for all who toil under the golden arches.

But critics say these fast-food chains actually exert powerful oversight over their franchisees by closely tracking their spending and operations. Domino’s, one franchisee claims, critiqued how his employees answered the phone; Burger King franchisees sued the chain in 2009, claiming that it was forcing them to sell menu items at a loss for $1. Companies often pressure owner-operators to squeeze down labor costs: According to one employee quoted in the Guardian, “McDonald’s corporate representatives turn up at the restaurant where he works five or six times a year, counting the number of cars using the drive-through service, timing sales, making sure staff are preparing food according to McDonald’s specifications.” More so than most fast-food chains, McDonald’s wields financial control over its franchisees and owns the rental real estate of the restaurants.

Former McDonald’s executive Richard Adams has said: “McDonald’s franchisees are pretty compliant. They don’t really organize, they don’t really protest. And if you do, they tell you you’re not a good member of the McFamily. I don’t want to make this seem too Orwellian, but the average franchisee has about six restaurants, and the franchise agreement is for 20 years. You’re probably going to have a renewal coming up. If you’re not a compliant member of the team, you’re probably not going to get that renewal.”

The issue of whether McDonald’s can be labeled a “joint employer” is being litigated in numerous claims of unfair labor practices that workers have filed with the National Labor Relations Board. The NLRB’s general counsel recently deemed McDonald’s a joint employer, and if it is ultimately penalized as such, workers could see a dramatic expansion in the company’s legal and regulatory obligations.

4. Flipping burgers is an easy job.

Some people chafe at the idea of “unskilled” fast-food workers meriting a wage more suited to a “high-skilled” job. Not only does this ignore the fact that this work requires skills — from managing inventory to training and supervising other employees — it also disregards the day-to-day challenges workers navigate on the job. According to a slew of complaints filed with the Occupational Safety and Health Administration, workers often suffer injuries such as hot-oil burns and are sometimes denied proper medical care. (Some are told to dress wounds with condiments.) Violence is also common at fast-food restaurants; according to a recent survey, roughly one in eight workers reported being assaulted at work in the past year.

Workers have also complained of racial discrimination, sexual harassment and retaliatory punishment by management. More than 40 of the NLRB claims filed against McDonald’s in the past few years alleged illegal firings or penalties because of workers’ engagement in labor activism. Add to all of this the challenge of just getting paid: Subway was found guilty of 17,000 separate wage and hour violations since 2000, and in 2013, Taco Bell was hit with a $2.5 million settlement in a class-action lawsuit over unpaid overtime.

5. Paying workers $15 an hour would make burgers prohibitively costly and hurt the industry.

Some analysts, particularly on the right, have laid out doomsday scenarios of massive economic disruption caused by a sudden doubling of wages in the fast-food industry. The Heritage Foundation argues that raising wages to $15 an hour could lead to a price spike, shrinking job opportunities, and huge drops in sales and profits. In reality, any such wage increase would probably be incremental and could be absorbed in large part by lowering the fees collected by parent companies from franchisees. Fast-food workers already enjoy such higher pay in other countries with strong labor regulation and union representation. A Big Mac in New Zealand costs less than one in the United States — $4.49 vs. $4.79, according to the Economist’s Big Mac index — and it’ll likely be served by a full-time union worker earning about $12 per hour.
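As a quick sanity check on that Big Mac comparison, here is a tiny sketch (the two prices are the figures quoted from the Economist’s index; the percentage is my own arithmetic):

```python
# Big Mac prices quoted above from the Economist's Big Mac index.
price_us = 4.79  # United States
price_nz = 4.49  # New Zealand

# Despite New Zealand's higher wages and union representation,
# the burger there is cheaper, by this margin:
discount = (price_us - price_nz) / price_us * 100
print(f"NZ Big Mac is {discount:.1f}% cheaper than the US one")
```

A roughly 6 percent price gap, in the “wrong” direction for the doomsday scenario.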

Higher wages might also bring business benefits, in the form of lower turnover and good press. The Michigan-based fast-casual restaurant Moo Cluck Moo offers a $15 wage alongside premium grass-fed burgers, turning its reputation as a socially responsible employer into a selling point. The market for super-cheap fast food is apparently declining. Consumers just might be hungry for a more conscientious business model.

The World’s Twenty Largest Economies By 2030

Citing data from the U.S. Department of Agriculture (which apparently conducts macroeconomic studies and projections), Bloomberg Business shows the twenty countries that will have the largest economies in the world in just fifteen years’ time.

An assessment of the data:

The U.S. will just barely remain the global leader, with $24.8 trillion in annual output. The gray bar represents the $16.8 trillion gross domestic product projected for 2015, and the green bar shows how much bigger the economy is expected to be 15 years from now. The country, worth 25 percent of the world economy in 2006 and 23 percent in 2015, will see its share decline to 20 percent.

India, ranked eighth for 2015, will climb past Brazil, the United Kingdom, France, Germany and Japan to take third place in the world ranking. The International Monetary Fund calls India “the bright spot in the global landscape.” The country will have the largest workforce in the world within the next 15 years, the IMF notes, and among the youngest.

Other nations won’t be so lucky, particularly among developed economies. Japan, which was a roaring economy until its asset bubble burst in the early 1990s, has already slogged through decades of stagnation and will likely continue to see very little growth over the next 15 years. That will push Japan down a spot in the rankings by 2030, according to the USDA estimates.

Japan is “an important lesson in how quickly you can downshift your status of what a structure of an economy delivers,” said Bruce Kasman, JPMorgan’s chief economist.

France will slide three spots, while Italy drops two.

In the overall ranking, Jamaica will surrender the most ground, bumping down 13 places to 136. Countries with the biggest advances — like Uganda, which will climb 18 spots to rank 91 — are concentrated in Africa, Asia and the Middle East.

Some caveats:

“There are lots of uncertainties,” said Kasman. “Whether China grows at 4 percent or 6 percent matters an awful lot for where it looks like it’s going to be in the global economy. Whether India grows at 3 percent or 8 percent — these are huge differences when you compound them over long periods of time.”
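Kasman’s point about compounding can be made concrete with a short sketch (the 4 and 6 percent rates are the hypotheticals from his quote, not forecasts):

```python
# Compound two hypothetical annual growth rates over 15 years to see
# how far the endpoints diverge -- the gap Kasman describes.
def compound(rate: float, years: int) -> float:
    """Total growth factor after compounding `rate` for `years` years."""
    return (1 + rate) ** years

low = compound(0.04, 15)   # about 1.80x
high = compound(0.06, 15)  # about 2.40x
print(f"4% path: {low:.2f}x  |  6% path: {high:.2f}x")
```

A two-point difference in the annual rate leaves one economy roughly a third larger than the other after just fifteen years, which is why small differences in assumed growth swamp these long-range rankings.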

The USDA’s is not the only — and hardly the most widely followed — ranking of global economic growth, though it does offer the advantage of a particularly long-term outlook. The International Monetary Fund’s economic outlook projects out only two years. Look out for it later this month.

For a projection that goes another twenty years beyond the USDA’s, here are the biggest economies by 2050, according to the World Bank and Goldman Sachs:

Future Powers

Keep in mind that all these studies are based on Gross Domestic Product (GDP), which has its limitations and is only one of several ways to measure an economy (albeit the most widely utilized).

Finally, here are the twenty fastest-growing countries of 2015, also courtesy of Bloomberg. 

Needless to say, there will be interesting times ahead, as economic power, and with it global influence, diffuses across an increasingly multipolar world.

What are your thoughts?

Why Do The Poor Buy Luxury Goods?

From Tressie McMillan Cottom at TPM:

Why do poor people make stupid, illogical decisions to buy status symbols? For the same reason all but only the most wealthy buy status symbols, I suppose. We want to belong. And, not just for the psychic rewards, but belonging to one group at the right time can mean the difference between unemployment and employment, a good job as opposed to a bad job, housing or a shelter, and so on. Someone mentioned on twitter that poor people can be presentable with affordable options from Kmart. But the issue is not about being presentable. Presentable is the bare minimum of social civility. It means being clean, not smelling, wearing shirts and shoes for service and the like. Presentable as a sufficient condition for gainful, dignified work or successful social interactions is a privilege. It’s the aging white hippie who can cut the ponytail of his youthful rebellion and walk into senior management while aging black panthers can never completely outrun the effects of stigmatization against which they were courting a revolution. Presentable is relative and, like life, it ain’t fair.

In contrast, “acceptable” is about gaining access to a limited set of rewards granted upon group membership. I cannot know exactly how often my presentation of acceptable has helped me but I have enough feedback to know it is not inconsequential. One manager at the apartment complex where I worked while in college told me, repeatedly, that she knew I was “Okay” because my little Nissan was clean. That I had worn a Jones of New York suit to the interview really sealed the deal. She could call the suit by name because she asked me about the label in the interview. Another hiring manager at my first professional job looked me up and down in the waiting room, cataloging my outfit, and later told me that she had decided I was too classy to be on the call center floor. I was hired as a trainer instead. The difference meant no shift work, greater prestige, better pay and a baseline salary for all my future employment.

….

At the heart of these incredulous statements about the poor decisions poor people make is a belief that we would never be like them. We would know better. We would know to save our money, eschew status symbols, cut coupons, practice puritanical sacrifice to amass a million dollars. There is a regular news story of a lunch lady who, unbeknownst to all who knew her, died rich and leaves it all to a cat or a charity or some such. Books about the modest lives of the rich like to tell us how they drive Buicks instead of BMWs. What we forget, if we ever know, is that what we know now about status and wealth creation and sacrifice are predicated on who we are, i.e. not poor. If you change the conditions of your not-poor status, you change everything you know as a result of being a not-poor. You have no idea what you would do if you were poor until you are poor. And not intermittently poor or formerly not-poor, but born poor, expected to be poor and treated by bureaucracies, gatekeepers and well-meaning respectability authorities as inherently poor. Then, and only then, will you understand the relative value of a ridiculous status symbol to someone who intuits that they cannot afford to not have it.