The World’s Billionaire Cities

Right on the heels of my last post about the world’s poorest denizens comes a sobering article from PolicyMic that highlights the stark reality of global wealth inequality. It identifies the world’s most popular cities for billionaires, based on a recent report from Forbes.

Moscow remains the billionaire capital of the world, with 84 of the world’s richest people, together worth a total of over $366 billion. Of the other major cities on the list (some of which tied), five are in Asia (Istanbul, Mumbai, Seoul, Hong Kong and Beijing), two are in Europe (London and Paris), two are in the U.S. (New York City and Dallas), and one is in Latin America (Sao Paulo, Brazil).

According to the 2013 Wealth-X and UBS Billionaire Census, the first comprehensive study of the world’s billionaire population, the average billionaire holds $78 million in real estate, owns four homes (each worth nearly $20 million) and possesses numerous luxury items, the most common being yachts, private jets and works of art.

Besides boasting many uber-rich residents, these cities account for a disproportionate share of overall economic growth and rising income inequality, and many of them also host a large proportion of poor residents. According to a report by Oxfam, 85 of the richest people in the world (most of whom live in these cities) control as much wealth as the poorest half of the world (3.5 billion people).

Portraits of People Living on a Dollar a Day

As a lifelong resident of a well-off part of a wealthy country (the U.S.), I’m doubly insulated from the miserable circumstances that are the norm for most of my fellow humans. Around 17 percent of the world’s population — that’s one out of six people — live on a dollar or less a day, lacking stable access to food, medical care, housing, and other basic needs.

Not only do more than a billion people lack material goods and comforts, but they live a precarious existence in which they’re never certain when or if the next meal will come; in which they’re just one injury or illness away from deeper poverty or even death; in which housing is barely livable, if it exists at all. And all this transpires practically invisibly, with few people truly understanding, much less addressing, this extreme level of poverty.

But not if people like Thomas A. Nazario, founder of the nonprofit The Forgotten International, can help it. He’s written a new book with Pulitzer Prize winner Renée C. Byer called Living on a Dollar a Day: The Lives and Faces of the World’s Poor, which offers a much-needed window into these people’s everyday lives and ultimately calls for action on their behalf.

Mother Jones interviewed Nazario about his motivations for this book, as well as about bigger topics like global inequality and the pervasive savior complex of well-meaning humanitarians. The interview is pretty insightful, and the article is full of excellent photos shared from the book (which I’m interested in reading and perhaps reviewing here at a later date). I highly recommend you read the rest of it, but here’s the part that most stood out for me.

Which stories affected you the most?

 There are three. One was the kids who live on an e-waste dump in Ghana. That was quite compelling for a variety of reasons, but I think if you look at the book and see those photographs and read that piece, it’ll hit you pretty hard.

Another piece was a family in Peru that lives on recycling. That, in and of itself, is not a big deal. Recycling is probably the second-largest occupation of the poor. But [the mother's] personal story, about how she had been abused by two different husbands, how her boys were taken away because they were needed to farm, and she was given all the girls—and how her kids will probably not ever go to school. She gets constantly evicted from one place or another because she can’t find enough recycling to pay the rent. When we left her—we gave everybody a gift of at least some kind for giving us their time and telling us their story—we gave her $80, which is about as much money as she makes in two months. She fell to her knees and started crying. Not only did I learn that 25 percent of garbage produced in developing countries is picked up by individuals like her, but that one of the biggest drivers of global poverty is domestic violence, and how women and children are thrown into poverty largely for that reason.

Of course, even those of us who hear anecdotes like this or see vivid photos of unspeakable squalor do far less than we can to help. While certain psychological factors play a role in our collective apathy, there’s no denying the exploitative and inefficient character of the current global economic system, in which tremendous wealth continues to be allocated to a small minority of people who are largely disconnected from, and unconcerned with, the horrific reality of most of their fellow humans.

But that’s a conversation for a different day.

Study Finds Government Influenced Mostly By Wealthy Interests

Think Progress reports on new research that won’t surprise anyone but helps confirm a troubling trend: the policies and actions of the U.S. government overwhelmingly align with the preferences of wealthy citizens and well-moneyed interest groups.

“That’s according to a forthcoming article in Perspectives on Politics by Martin Gilens of Princeton University and Benjamin I. Page of Northwestern University. The two looked at a data set of 1,779 policy issues between 1981 and 2002 and matched them up against surveys of public opinion broken down by income as well as support from interest groups.

They estimate that the impact of what an average citizen prefers put up against what the elites and interest groups want is next to nothing, or “a non-significant, near-zero level.” They note that their findings show “ordinary citizens…have little or no independent influence on policy at all.” The affluent, on the other hand, have “a quite substantial, highly significant, independent impact on policy,” they find, “more so than any other set of actors” that they studied. Organized interest groups similarly fare well, with “a large, positive, highly significant impact on public policy.”

When they hold constant the preferences of interest groups and the rich, “it makes very little difference what the general public thinks,” they note. The probability that policy change occurs is basically the same whether a small group or a large majority of average citizens are in favor. On the other hand, all else being the same, opposition from the wealthy means that a particular policy is only adopted about 18 percent of the time, but when they support it it gets adopted 45 percent of the time. Similar patterns are true for interest groups.”
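
To make those percentages concrete, here is a minimal, hypothetical sketch of the kind of comparison the authors describe. It is not their actual statistical model, and every record below is invented for illustration:

```python
# A hypothetical sketch of the comparison described above; NOT Gilens and
# Page's actual model. Each record notes whether a proposed policy change
# was adopted and whether affluent citizens ("rich_favor") and average
# citizens ("public_favor") supported it. All records are invented.

policies = [
    {"adopted": True,  "rich_favor": True,  "public_favor": False},
    {"adopted": False, "rich_favor": False, "public_favor": True},
    {"adopted": True,  "rich_favor": True,  "public_favor": True},
    {"adopted": False, "rich_favor": False, "public_favor": False},
    # ...the real study covered 1,779 policy issues from 1981 to 2002
]

def adoption_rate(records, key, value):
    """Share of policies adopted among records where `key` equals `value`."""
    subset = [r for r in records if r[key] == value]
    return sum(r["adopted"] for r in subset) / len(subset) if subset else float("nan")

# The finding, restated in these terms: adoption ran ~18% when the wealthy
# were opposed vs. ~45% when they were in favor, while varying public_favor
# barely moved the rate once elite and interest-group preferences were fixed.
print("Adopted when rich opposed: ", adoption_rate(policies, "rich_favor", False))
print("Adopted when rich in favor:", adoption_rate(policies, "rich_favor", True))
```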

While it remains to be seen whether their research will stand up to scrutiny, I think it’s safe to say that, judging from how powerless or uninterested our politicians are when it comes to broadly serving the public good, our government is hardly a model of true democracy. But what are your thoughts?

The Bootstraps Myth

From Melissa McEwan of the blog Shakesville:

The Myth of Bootstraps goes something like this: I never got any help from anyone. I achieved my American Dream all on my own, through hard work. I got an education, I saved my money, I worked hard, I took risks, and I never complained or blamed anyone else when I failed, and every time I fell, I picked myself up by my bootstraps and just worked even harder. No one helped me.

This is almost always a lie.

There are vanishingly few people who have never had help from anyone—who never had family members who helped them, or friends, or colleagues, or teachers. 

Who never benefited from government programs that made sure they had electricity, or mail, or passable roads, or clean drinking water, or food, or shelter, or healthcare, or a loan. 

Who never had any kind of privilege from which they benefited, even if they didn’t actively try to trade on it. 

Who never had an opportunity they saw as luck which was really someone, somewhere, making a decision that benefited them. 

Who never had friends to help them move, so they didn’t have to pay for movers. Who never inherited a couch, so they didn’t have to pay for a couch. Who never got hand-me-down clothes from a cousin, so their parents could afford piano lessons. Who never had shoes that fit and weren’t leaky, when the kid down the street didn’t.

Most, maybe all, of the people who say they never got any help from anyone are taking a lot of help for granted.

They imagine that everyone has the same basic foundations that they had—and, if you point out to them that these kids over here live in an area rife with environmental pollutants that have been shown to affect growth or brain function or breathing capacity, they will simply sniff with indifference and declare that those things don’t matter. That government regulations which protect some living spaces and abandon others to poisons isn’t help. 

The government giving you money to eat is a hand-out. The government giving you regulations that protect the air you breathe is, at best, nothing of value—and, at worst, a job-killing regulation that impedes the success of people who want to get rich dumping toxins into the ground where people getting hand-outs live.

What are your thoughts?

Low-Wage Work Becomes The New Normal

It’s been well documented that the recession eliminated most of the already-declining number of well-paying jobs, with most of the (still-anemic) growth in employment occurring overwhelmingly in low-paying sectors. Now the latest report from the Bureau of Labor Statistics (BLS), courtesy of Business Insider, further underscores this troubling trend, revealing that nearly all of the top ten most common jobs are low-paying.

So retail and food service (where most cashiers work) represent the lion’s share of new job growth.

To emphasize just how low-paying most of these vocations are, consider how their annual mean pay compares with the overall national mean wage (i.e., that of all U.S. occupations combined).

Registered nurses are the only folks doing fairly well, on average. Most of the other common jobs fall well short of the annual mean wage, with the three most common paying around half of it or less. Needless to say, this is a troubling development. While how far a low wage stretches varies by state, county, and city, in many parts of the country you can’t get by for long on these positions alone (which, by the way, typically lack benefits, paid sick leave, and full-time hours).
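
As a rough, back-of-the-envelope illustration of that gap, here’s a quick sketch using approximate May 2013 BLS mean annual wages; treat the figures as ballpark placeholders rather than exact values:

```python
# A rough illustration of the wage gap, using approximate May 2013 BLS mean
# annual wages. Figures are ballpark values for illustration only; check
# the BLS Occupational Employment Statistics tables for exact numbers.

national_mean = 46_440  # approx. mean annual wage across all U.S. occupations

occupations = {
    "Retail salespersons": 25_370,
    "Cashiers": 20_420,
    "Food prep and serving workers": 18_880,
    "Registered nurses": 68_910,
}

for job, wage in occupations.items():
    share = wage / national_mean
    print(f"{job}: ${wage:,} ({share:.0%} of the national mean)")
```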

It’s also worth pointing out that, given the decline in the minimum wage’s value — adjusted for inflation, it’s actually worth less than it was in 1968 — a lot of these menial, currently low-paying positions would once have offered a decent standard of living (at least relative to what they offer now). With the application of new technology and various administrative changes within businesses (outsourcing, streamlined management, etc.), the economy seems to have reached a point where there just isn’t enough need for anything but food, consumer goods, medical care, and the like; even then, we need a lot more cashiers and cooks than we do doctors, managers, and lawyers — hence all the growth in the former jobs as compared to the latter.
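
For concreteness, here’s a minimal sketch of the inflation arithmetic behind that 1968 comparison; the CPI values are approximate annual averages, so verify them against BLS data before relying on them:

```python
# A minimal sketch of the inflation adjustment behind the 1968 comparison.
# The CPI-U annual averages below are approximate; verify against BLS data.

CPI_1968 = 34.8    # approximate CPI-U annual average, 1968
CPI_2014 = 236.7   # approximate CPI-U annual average, 2014

min_wage_1968 = 1.60  # nominal federal minimum wage, 1968
min_wage_2014 = 7.25  # federal minimum wage at the time of writing

# Express the 1968 wage in 2014 dollars.
real_1968_wage = min_wage_1968 * (CPI_2014 / CPI_1968)

print(f"1968 minimum wage in 2014 dollars: ${real_1968_wage:.2f}")  # roughly $10.88
print(f"Actual 2014 federal minimum wage:  ${min_wage_2014:.2f}")
```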

In short, I think we need to re-think the way we pay people and scale back the notion that only “skilled” or technical work deserves a decent, living wage. The fact is, most well-paying professions have been replaced, rendered redundant, or simply aren’t in high enough demand relative to the number of people who need steady work. If companies have the resources to pay people better — and indeed most of the lowest-paying employers are very profitable — they should pay well enough to ensure a decent standard of living for their employees. After all, we can’t sustain an economy without a large market of consumers, who in turn can’t consume if they’re increasingly taking on low-paying work.

I’ll close by noting that this chart comes with plenty of caveats, as many of the commenters below have pointed out. There are claims that the numbers attributed to certain professions don’t square with other data, that the mean annual wage for some jobs has been miscalculated, and so on. I honestly don’t have the time to assess the veracity of those criticisms, so I leave it to your best judgment to settle the matter. Of course, always feel free to share your thoughts.

Hat tip to my friend Michael for first sharing this piece.

The International Arms Market

The Economist’s #Dailychart from yesterday revealed the countries that buy and sell the most weapons. The United States, Russia, Germany, China, and France accounted for three-quarters of international arms exports over the past five years, with the first two taking the lion’s share of the export market (largely a legacy of the Cold War, which led both nations to build up massive and still-influential indigenous arms industries).

Other major arms dealers include the U.K., Spain, Ukraine, Italy, and Israel. Only 10 other countries, mostly in the developed world, have some sort of presence in the global arms market.

Notably, China — which was once a net importer of weapons, mostly from the U.S.S.R. — has tripled its share of exports in that time, overtaking France and looking set to surpass Germany as the third-largest arms dealer (it still imports almost as many weapons as it exports, however). Germany’s significant role in arms dealing is interesting given the country’s otherwise pacifistic and low-key foreign policy, characterized by a reluctance to intervene in international affairs.

Some of the bigger importers include rising powers like India, China, and, to lesser degrees, Pakistan and South Korea. The Persian Gulf nations of the U.A.E. and Saudi Arabia also top the list, as does the tiny but influential city-state of Singapore (which is said to have one of the most advanced and well-trained armed forces in the world). Australia’s fairly high import rate likely reflects its growing influence in the Asia-Pacific region and its desire to play a bigger role therein.

Needless to say, this is revealing stuff. Read more about it here.

Lesser-Known Fun Facts About Each U.S. President

Unfortunately, I’m working this Presidents Day — which honors the birthdays of both George Washington and Abraham Lincoln — so I’ve decided to just share this interesting article from HuffPost that offers at least one quirky fact about each president (Cleveland gets two, since he is the only president to have served two non-consecutive terms — there’s a fun fact!). Here are some of my favorites:

  • Andrew Jackson had a pet parrot that he taught how to swear.
  • Supposedly, President Van Buren popularized one of the most commonly used expressions in English: “OK”, or “Okay”. Van Buren hailed from Kinderhook, NY, and was nicknamed “Old Kinderhook”; his support groups came to be known as “O.K. Clubs”, and the term OK came to mean “all right”.
  • When Abe Lincoln moved to New Salem, Illinois in 1831, he ran into a local bully named Jack Armstrong. Armstrong challenged Lincoln to a wrestling match outside of Denton Offutt’s store, where Lincoln was a clerk, and townspeople gathered to watch and wager on it. Lincoln won.
  • Andrew Johnson was drunk during his inauguration (go figure, he’s considered one of the worst presidents in U.S. history).
  • After leaving office, William Taft became the only ex-president to serve as Chief Justice of the Supreme Court, effectively becoming the only person to serve as the head of two branches of government. In doing so, he swore in both Calvin Coolidge and Herbert Hoover to the presidency. (On an unrelated note, he also lost 150 pounds after leaving office.)
  • To date, Woodrow Wilson is the only president to have held a doctorate, making him the most highly educated president in the history of the United States. He was awarded the degree in political science and history from Johns Hopkins University. He also passed the Georgia bar exam despite not finishing law school.

Enjoy and have a safe and happy Presidents Day!

The Problem With ‘White History Month’

As Americans enter February, which is Black History Month, many of us will inevitably hear (or consider for ourselves) why there’s a month dedicated to blacks (and for that matter women and Hispanics) but not to whites. Setting aside the fact that minority views are often underrepresented or marginalized in mainstream history and culture — hence the effort to highlight these perspectives with their own dedicated events and institutions — Mary-Alice Daniel of Salon offers another good reason, one which explores the U.S.’s unusual, complex, and largely unknown history of racial identity.

The very notion of whiteness is relatively recent in our human history, linked to the rise of European colonialism and the Atlantic slave trade in the 17th century as a way to distinguish the master from the slave. From its inception, “white” was not simply a separate race, but the superior race. “White people,” in opposition to non-whites or “colored” people, have constituted a meaningful social category for only a few hundred years, and the conception of who is included in that category has changed repeatedly. If you went back to even just the beginning of the last century, you’d witness a completely different racial configuration of whites and non-whites. The original white Americans — those from England, certain areas of Western Europe, and the Nordic States — excluded other European immigrants from that category to deny them jobs, social standing, and legal privileges. It’s not widely known in the U.S. that several ethnic groups, such as Germans, Italians, Russians and the Irish, were excluded from whiteness and considered non-white as recently as the early 20th century.

Members of these groups sometimes sued the state in order to be legally recognized as white, so they could access a variety of rights available only to whites — specifically American citizenship, which was then limited, by the U.S. Naturalization Law of 1790, to “free white persons” of “good character.” Attorney John Tehranian writes in the Yale Law Journal that petitioners could present a case based not on skin color, but on “religious practices, culture, education, intermarriage and [their] community’s role,” to try to secure their admission to this elite social group and its accompanying advantages.

More than color, it was class that defined race. For whiteness to maintain its superiority, membership had to be strictly controlled. The “gift” of whiteness was bestowed on those who could afford it, or when it was politically expedient. In his book “How the Irish Became White,” Noel Ignatiev argues that Irish immigrants were incorporated into whiteness in order to suppress the economic competitiveness of free black workers and undermine efforts to unite low-wage black and Irish Americans into an economic bloc bent on unionizing labor. The aspiration to whiteness was exploited to politically and socially divide groups that had more similarities than differences. It was an apple dangled in front of working-class immigrant groups, often as a reward for subjugating other groups.

A lack of awareness of these facts has lent credence to the erroneous belief that whiteness is inherent and has always existed, either as an actual biological difference or as a cohesive social grouping. Some still claim it is natural for whites to gravitate to their own and that humans are tribal and predisposed to congregate with their kind. It’s easy, simple and natural: White people have always been white people. Thinking about racial identity is for those other people.

Those who identify as white should start thinking about their inheritance of this identity and understand its implications. When what counts as your “own kind” changes so frequently and is so susceptible to contemporaneous political schemes, it becomes impossible to argue an innate explanation for white exclusion. Whiteness was never about skin color or a natural inclination to stand with one’s own; it was designed to racialize power and conveniently dehumanize outsiders and the enslaved. It has always been a calculated game with very real economic motivations and benefits.

This revelation should not function as an excuse for those in groups recently accepted as white to claim to understand racism, to absolve themselves of white privilege or to deny that their forefathers, while not considered white, were still, in the hierarchy created by whites, responsible in turn for oppressing those “lower” on the racial scale. During the Civil War, Irish immigrants were responsible for some of the most violent attacks against freedmen in the North, such as the wave of lynchings during the 1863 Draft Riots, in which “the majority of participants were Irish,” according to Eric Foner’s book “Reconstruction: America’s Unfinished Revolution, 1863-1877” and various other sources. According to historian Dominic Pacyga, Polish American groups in Chicago and Detroit “worked to prevent the integration of blacks into their communities by implementing rigid housing segregation” out of a fear that black people would “leap over them into a higher social status position.”

Behind every racial conversation is a complex history that extends to present-day interactions and policies, and we get nowhere fast if large swaths of our population have a limited frame of reference. An understanding of whiteness might have prevented the utter incapability of some Americans to realize that “Hispanic” is not a race — that white Hispanics do exist, George Zimmerman among them. This knowledge might have lessened the cries that Trayvon Martin’s murder could not have been racially motivated and might have led to, if not a just verdict, a less painfully ignorant response from many white Americans.

As for how all this ties into why a white history month would be wrongheaded and beside the point:

If students are taught that whiteness is based on a history of exclusion, they might easily see that there is nothing in the designation as “white” to be proud of. Being proud of being white doesn’t mean finding your pale skin pretty or your Swedish history fascinating. It means being proud of the violent disenfranchisement of those barred from this category. Being proud of being black means being proud of surviving this ostracism. Be proud to be Scottish, Norwegian or French, but not white.

Above all, such an education might help answer the question of whose problem modern racism really is. The current divide is a white construction, and it is up to white people to do the necessary work to dismantle the system borne from the slave trade, instead of ignoring it or telling people of color to “get over” its extant legacy. Critics of white studies have claimed that this kind of inquiry leads only to self-hatred and guilt. Leaving aside that avoiding self-reflection out of fear of bad feelings is the direct enemy of personal and intellectual growth, I agree that such an outcome should be resisted, because guilt is an unproductive emotion, and merely feeling guilty is satisfying enough for some. My hope in writing this is that white Americans will discover how it is they came to be set apart from non-whites and decide what they plan to do about it.

What do you think?

Map: U.S. Life Expectancy By State

Although the average American lives an impressive 30 years longer than a century ago — about 79.8 years — the U.S. remains middle-of-the-road by global standards despite its great wealth: we typically rank in the mid-thirties, at about the same level as Cuba, Chile, or Costa Rica. Furthermore, life expectancy varies wildly from state to state, as the following map from The Atlantic clearly shows:

Life expectancy by state compared to closest matching country.

There’s profound variation by state, from a low of 75 years in Mississippi to a high of 81.3 in Hawaii. Mostly, we resemble tiny, equatorial hamlets like Kuwait and Barbados. At our worst, we look more like Malaysia or Oman, and at our best, like the United Kingdom. No state approaches the life expectancies of most European countries or some Asian ones. Icelandic people can expect to live a long 83.3 years, and that’s nothing compared to the Japanese, who live well beyond 84.

Life expectancy can be causal: a function of diet, environment, medical care, and education. But it can also be recursive: people who are chronically sick are less likely to become wealthy, and thus less likely to live in affluent areas and have access to the great doctors and Whole-Foods kale that would have helped them live longer.

It’s worth noting that the life expectancy for certain groups within the U.S. can be much higher—or lower—than the norm. The life expectancy for African Americans is, on average, 3.8 years shorter than that of whites. Detroit has a life expectancy of just 77.6 years, but that city’s Asian Americans can expect to live 89.3 years.

But overall, the map reflects what we’d expect: People in southern states, which generally have lower incomes and higher obesity rates, tend to die sooner, and healthier, richer states tend to foster longevity.

It’s also worth adding that, overall, the U.S. is far less healthy and long-lived than it should be, even when you adjust for wealth, race, and other factors (e.g., young Americans are less healthy than young people in other developed countries, and rich Americans are typically less healthy than their wealthy counterparts abroad).