Celebrating Fourth of July? You’re Either Two Days Late or One Month Early

Fourth of July factoid: the legal separation of the Thirteen Colonies from Great Britain technically occurred on July 2, when the Second Continental Congress voted to approve a formal resolution of independence, which had first been suggested a month earlier.

The Declaration of Independence, which explained this decision, was adopted two days later, on July 4. Since the July 2 vote took place behind closed doors, the American people came to see the day of the public announcement as the true day of independence, although John Adams preferred July 2 as the date. As he wrote on July 3:

The second day of July, 1776, will be the most memorable epoch in the history of America. I am apt to believe that it will be celebrated by succeeding generations as the great anniversary festival. It ought to be commemorated as the day of deliverance, by solemn acts of devotion to God Almighty. It ought to be solemnized with pomp and parade, with shows, games, sports, guns, bells, bonfires, and illuminations, from one end of this continent to the other, from this time forward forever more.

It gets more interesting: despite the later claims of some Founding Fathers, many historians believe that the Declaration was actually signed nearly a month after its adoption, on August 2, 1776. Coincidentally, John Adams and Thomas Jefferson, the only signers of the Declaration who would later serve as president, both died on the same day: July 4, 1826, the 50th anniversary of the Declaration (and although he was not a signer of the document, James Monroe also died on July 4, in 1831).

Anyway, have a safe and happy Fourth of July.

Source: Wikipedia, Quartz

Don’t Call The Iraq War A Mistake

Calling the invasion and slaughter that followed a mistake papers over the lies that took us to Iraq. This assessment of the war as mistake is coming mostly from well-intentioned people, some of whom spoke out against the war before it began and every year it dragged on. It may seem like a proper retort to critics of Obama (who inherited that war rather than started it). But it feeds a dangerous myth.

A mistake is not putting enough garlic in the minestrone, taking the wrong exit, typing the wrong key, falling prey to an accident.

Invading Iraq was not a friggin’ mistake. Not an accident. Not some foreign policy mishap.

The guys in charge carried out a coldly though ineptly calculated act. An act made with the intention of privatizing Iraq and using that country as a springboard to other Middle Eastern targets, most especially Iran. They led a murderous, perfidious end run around international law founded on a dubious “preventive” military doctrine piggybacked on the nation’s rage over the 9/11 attacks. An imperial, morally corrupt war. They ramrodded it past the objections of those in and out of Congress who challenged the fabricated claims of administration advisers who had been looking for an excuse to take out Saddam Hussein years before the U.S. Supreme Court plunked George W. Bush into the Oval Office.

The traditional media did not make a mistake either. They misled their audiences through sloppiness and laziness because it was easier and better for ratings than for them actually to do their jobs. For the worst of them, the misleading was deliberate. They fed us disinformation. Lapdogs instead of watchdogs.

Meteor Blades, “Stop pretending the invasion of Iraq was a ‘mistake.’ It lets the liars who launched it off the hook”, Daily Kos.

Read the linked article above and decide for yourself. Personally, I think it makes a compelling case, although even if it were genuine ineptitude, there’d be just as much culpability given the horrific scale of the consequences.


The Top Cities for College Graduates

While major metropolises like Los Angeles and New York City have long served as Meccas for young talent, new research by CityLab and the Martin Prosperity Institute reveals several medium-sized but fast-growing cities that are overtaking these traditional destinations. Here is the data in question, which is based on the net domestic migration of workers from one city to another between 2011 and 2012.

Note that the map doesn’t necessarily reflect larger economic trends, but rather which cities are seeing growth or decline in their skilled workforces; nevertheless, it gives us a pretty good idea of which cities will likely do better in the long run as their talent pools grow. Here’s an analysis courtesy of PolicyMic (my source for this data).

[The] cities that are attracting a coveted educated workforce are “knowledge and tech hubs like San Francisco, Austin, Seattle and Denver, and also Sun Belt metros like Phoenix, Charlotte and Miami.” In particular, “Seattle, San Francisco, D.C., Denver, San Jose, Austin and Portland, as well as the banking hub of Charlotte” are attracting Americans with professional and graduate degrees.

Overall, large metros (especially ones that cost a lot to live and work in) have seen their share of educated workers grow, even as lower-class workers get priced out. San Francisco; Los Angeles; Washington, D.C.; and Miami, for example, all saw a net reduction in less-educated workers. Meanwhile, the cities that saw the biggest influx of workers with just a high school diploma were all in Sun Belt states, mainly with thriving tourist and/or service economies.

The analysis also determined that the cities most appealing to graduates and educated professionals tended to have the following characteristics: high concentrations of high-tech and venture capital firms, which tend to invest in the sort of start-ups younger people are more likely to launch; a strong cultural scene with a large creative class; and a high level of diversity and tolerance, particularly with respect to LGBTQ people (indeed, there was a correlation between a large LGBTQ population and an influx of young talent).

None of this is too surprising, given that younger people typically favor more tolerance, culture, and social progressiveness. Such values create an atmosphere more conducive to creativity, innovation, and the exchange of ideas — which in turn are vital for a knowledge-based economy (I am also speaking from experience as a lifelong Miami resident currently working for a young marketing start-up that in turn works with other young start-ups).

You might be wondering why LA and NYC don’t perform better in this regard, given that they certainly fit the profile that has benefited other cities. Unfortunately, there’s a clear reason: high rents and rapid gentrification are pricing younger grads (among others) out of these cities; furthermore, their sheer size and pace of growth make them a bit too crowded for younger people, who increasingly prefer medium-sized cities that offer something of a balance.

Granted, there is a lot of inequality, gentrification, and poverty in the majority of the medium-sized cities that have become popular (especially my own hometown of Miami, which ranks as the second-most unequal census area in the country). Will the recent influx of professionals and talent help counteract these trends and bring prosperity? Or is this growth a reflection of widening inequality, such that whole sections of these metros thrive and grow while others languish and decline?

Again, speaking for Miami, I can say that the latter seems to be the case: a drive around the city will take you past affluent, vibrant communities as well as blighted slums and ghettos, often right across the street from each other. The growth in talent is important, but how and where it’s harnessed matters just as much. I welcome this trend, of course, but I hope it leads to broader change rather than ever more yawning inequality; otherwise, these now-attractive cities may fall by the wayside just as previous metros have.

Your thoughts?

 

An Amazing and Heartwarming Way to Learn a Language

ADWEEK recently featured a simple but innovative way to address two seemingly unrelated issues at once: teaching young people English while giving lonely elderly people someone to talk to.

FCB Brazil did just that with its “Speaking Exchange” project for CNA language schools. As seen in the touching case study below, the young Brazilians and older Americans connect via Web chats, and they not only begin to share a language—they develop relationships that enrich both sides culturally and emotionally.

The differences in age and background combine to make the interactions remarkable to watch. And the participants clearly grow close to one another, to the point where they end up speaking from the heart in a more universal language than English.

The pilot project was implemented at a CNA school in Liberdade, Brazil, and the Windsor Park Retirement Community in Chicago. The conversations are recorded and uploaded as private YouTube videos for the teachers to evaluate the students’ development.

“The idea is simple and it’s a win-win proposition for both the students and the American senior citizens. It’s exciting to see their reactions and contentment. It truly benefits both sides,” says Joanna Monteiro, executive creative director at FCB Brazil.

Says Max Geraldo, FCB Brazil’s executive director: “The beauty of this project is in CNA’s belief that we develop better students when we develop better people.”

Needless to say, this is pretty touching and inspiring stuff. I’d love to see more programs like this take off between other countries. Come to think of it, I wouldn’t mind participating in one myself.

Check out the heartwarming introductory video below. What do you think?

America’s Surprising Linguistic Diversity

Most of you probably know that Spanish is the second-most common language in the United States after English. But did you know that Chinese is in third place, followed by Tagalog, a Philippine language? It gets even more interesting when you crunch the numbers by state, as Ben Blatt of Slate did using data from the 2009 American Community Survey conducted by the U.S. Census Bureau.

The following numbers are based on how many people over the age of five speak a particular language at home. Thus, the results don’t include people who learned a second language, say at school, but don’t use it as their primary language at home.

 

Again, Spanish’s dominance isn’t too surprising: it’s spoken by around 10 percent of the population, roughly 35 million people (and that figure is from almost five years ago).

French is dominant in areas that were former French settlements (Louisiana and parts of Maine) or that were (and still are) in close proximity to French communities (Maine, Vermont, and New Hampshire).

Yupik, an Alaska Native language closely related to Inuit, is the second-most spoken language in Alaska, which also isn’t surprising given that around 15 percent of the state’s population is indigenous, the highest proportion of any U.S. state.

Tagalog’s popularity in Hawaii reflects the state’s large Filipino population, the single largest ethnic group there. In fact, only around a quarter of Hawaii’s residents are non-Hispanic whites, while over a third are Asian (and although Native Hawaiians comprise around six percent of the state’s population, only around 0.1 percent of residents speak the Hawaiian language).

Now here’s how the U.S. looks once you reveal the third-most common languages per state.

 

 

Pretty fascinating stuff, yes? Who would have thought that Vietnamese is the third-most common language in Nebraska, Oklahoma, and Texas? Or that Portuguese is prominent in Massachusetts and Rhode Island? Given that German Americans are the largest ancestry group in the U.S., German’s prevalence isn’t too surprising; in fact, the language would arguably be much more common were it not for the World Wars discouraging its use.

My own home state, Florida, is a major gateway for people from Latin America and the Caribbean, which is why French Creole (namely the Haitian variety) follows Spanish as the most common non-English language. The fact that a Native American language features prominently in Arizona and New Mexico (as well as in South Dakota and Alaska) reflects the pattern of settlement of the U.S.: most Native Americans live in the West, which was settled by Europeans much later, resulting in comparatively less cultural and demographic destruction.

In any case, the prevalence of a particular language in a state offers a lot of interesting insight into recent and historical developments in the area. If you’re curious, here are the most widely spoken languages in the U.S. overall according to the survey (courtesy of Wikipedia).

  1. English only – 228,699,523
  2. Spanish – 35,468,501
  3. Chinese (mostly Yue dialects like Cantonese, with a growing number of Mandarin speakers) – 2,600,150
  4. Tagalog – 1,513,734
  5. French – 1,305,503
  6. Vietnamese – 1,251,468
  7. German – 1,109,216
  8. Korean – 1,039,021
  9. Russian – 881,723
  10. Arabic – 845,396
  11. Italian – 753,992
  12. Portuguese – 731,282
  13. Other Indian languages – 668,596
  14. French Creole – 659,053
  15. Polish – 593,598
  16. Hindi – 560,983
  17. Armenian – 498,700
  18. Japanese – 445,471
  19. Persian – 396,769
  20. Urdu – 355,964
  21. Greek – 325,747
  22. Hebrew – 221,593
  23. Hmong – 260,073
  24. Mon–Khmer, Cambodian – 202,033
  25. Hmong – 193,179
  26. Navajo – 169,009
  27. Thai – 152,679
  28. Yiddish – 148,155
  29. Laotian – 146,297

Hat tip to my friend Alexander for sharing this with me. 

Source: Gizmodo

How Americans Die

Bloomberg.com has posted a fascinating compilation of visual data that explores the changes in health, lifestyle, and mortality for Americans between 1970 and 2010. Aside from satiating my morbid curiosity, these data also make it possible to learn a lot about our society and how it’s changed based on how we die.

For example, Americans now die mostly from age-related and other natural causes rather than from infectious diseases, while deaths from suicide and drug abuse are on the rise; this shows that we’re living long enough to be claimed by age-related ailments, and it raises questions about the impact of modern living on mental health.

Since all the charts and graphs are interactive, I can’t embed them here, but I highly recommend you give them a look by clicking here.

 

 

Lesser-Known Fun Facts About Each U.S. President

Unfortunately, I’m working this Presidents Day (which commemorates the birthdays of both George Washington and Abraham Lincoln), so I’ve decided to just share this interesting article from HuffPost that offers at least one quirky fact about each president (Grover Cleveland gets two, since he is the only president to have served two non-consecutive terms; there’s a fun fact right there!). Here are some of my favorites:

  • Andrew Jackson had a pet parrot that he taught how to swear.
  • Supposedly, President Van Buren popularized one of the most commonly used expressions in the English language: “OK”, or “okay”. Van Buren was from Kinderhook, NY, and was nicknamed “Old Kinderhook”; his support groups came to be known as “O.K. Clubs”, and the term OK came to mean “all right”.
  • When Abe Lincoln moved to New Salem, Illinois in 1831, he ran into a local bully named Jack Armstrong. Armstrong challenged Lincoln to a wrestling match outside of Denton Offutt’s store, where Lincoln was a clerk, and townspeople gathered to watch and wager on it. Lincoln won.
  • Andrew Johnson was drunk during his inauguration (go figure, he’s considered one of the worst presidents in U.S. history).
  • After leaving office, William Taft became the only ex-president to serve as Chief Justice of the Supreme Court, effectively becoming the only person to serve as the head of two branches of government. In doing so, he swore in both Calvin Coolidge and Herbert Hoover to the presidency. (On an unrelated note, he also lost 150 pounds after leaving office.)
  • To date, Woodrow Wilson is the only president to have held a doctorate, making him the most highly educated president in the history of the United States. He was awarded the degree in political science and history from Johns Hopkins University. He also passed the Georgia bar exam despite not finishing law school.

Enjoy and have a safe and happy Presidents Day!

The Problem With ‘White History Month’

As Americans enter February, which is Black History Month, many of us will inevitably hear (or ask ourselves) why there’s a month dedicated to blacks (and, for that matter, to women and Hispanics) but not to whites. Setting aside the fact that minority views are often underrepresented or marginalized in mainstream history and culture (hence the effort to highlight these perspectives with their own dedicated events and institutions), Mary-Alice Daniel of Salon offers another good reason, one which explores the U.S.’s unusual, complex, and largely unknown history of racial identity.

The very notion of whiteness is relatively recent in our human history, linked to the rise of European colonialism and the Atlantic slave trade in the 17th century as a way to distinguish the master from the slave. From its inception, “white” was not simply a separate race, but the superior race. “White people,” in opposition to non-whites or “colored” people, have constituted a meaningful social category for only a few hundred years, and the conception of who is included in that category has changed repeatedly. If you went back to even just the beginning of the last century, you’d witness a completely different racial configuration of whites and non-whites. The original white Americans — those from England, certain areas of Western Europe, and the Nordic States — excluded other European immigrants from that category to deny them jobs, social standing, and legal privileges. It’s not widely known in the U.S. that several ethnic groups, such as Germans, Italians, Russians and the Irish, were excluded from whiteness and considered non-white as recently as the early 20th century.

Members of these groups sometimes sued the state in order to be legally recognized as white, so they could access a variety of rights available only to whites — specifically American citizenship, which was then limited, by the U.S. Naturalization Law of 1790, to “free white persons” of “good character.” Attorney John Tehranian writes in the Yale Law Journal that petitioners could present a case based not on skin color, but on “religious practices, culture, education, intermarriage and [their] community’s role,” to try to secure their admission to this elite social group and its accompanying advantages.

More than color, it was class that defined race. For whiteness to maintain its superiority, membership had to be strictly controlled. The “gift” of whiteness was bestowed on those who could afford it, or when it was politically expedient. In his book “How the Irish Became White,” Noel Ignatiev argues that Irish immigrants were incorporated into whiteness in order to suppress the economic competitiveness of free black workers and undermine efforts to unite low-wage black and Irish Americans into an economic bloc bent on unionizing labor. The aspiration to whiteness was exploited to politically and socially divide groups that had more similarities than differences. It was an apple dangled in front of working-class immigrant groups, often as a reward for subjugating other groups.

A lack of awareness of these facts has lent credence to the erroneous belief that whiteness is inherent and has always existed, either as an actual biological difference or as a cohesive social grouping. Some still claim it is natural for whites to gravitate to their own and that humans are tribal and predisposed to congregate with their kind. It’s easy, simple and natural: White people have always been white people. Thinking about racial identity is for those other people.

Those who identify as white should start thinking about their inheritance of this identity and understand its implications. When what counts as your “own kind” changes so frequently and is so susceptible to contemporaneous political schemes, it becomes impossible to argue an innate explanation for white exclusion. Whiteness was never about skin color or a natural inclination to stand with one’s own; it was designed to racialize power and conveniently dehumanize outsiders and the enslaved. It has always been a calculated game with very real economic motivations and benefits.

This revelation should not function as an excuse for those in groups recently accepted as white to claim to understand racism, to absolve themselves of white privilege or to deny that their forefathers, while not considered white, were still, in the hierarchy created by whites, responsible in turn for oppressing those “lower” on the racial scale. During the Civil War, Irish immigrants were responsible for some of the most violent attacks against freedmen in the North, such as the wave of lynchings during the 1863 Draft Riots, in which “the majority of participants were Irish,” according to Eric Foner’s book “Reconstruction: America’s Unfinished Revolution, 1863-1877” and various other sources. According to historian Dominic Pacyga, Polish American groups in Chicago and Detroit “worked to prevent the integration of blacks into their communities by implementing rigid housing segregation” out of a fear that black people would “leap over them into a higher social status position.”

Behind every racial conversation is a complex history that extends to present-day interactions and policies, and we get nowhere fast if large swaths of our population have a limited frame of reference. An understanding of whiteness might have prevented the utter incapability of some Americans to realize that “Hispanic” is not a race — that white Hispanics do exist, George Zimmerman among them. This knowledge might have lessened the cries that Trayvon Martin’s murder could not have been racially motivated and might have led to, if not a just verdict, a less painfully ignorant response from many white Americans.

As for how all this ties into why a white history month would be wrongheaded and beside the point:

If students are taught that whiteness is based on a history of exclusion, they might easily see that there is nothing in the designation as “white” to be proud of. Being proud of being white doesn’t mean finding your pale skin pretty or your Swedish history fascinating. It means being proud of the violent disenfranchisement of those barred from this category. Being proud of being black means being proud of surviving this ostracism. Be proud to be Scottish, Norwegian or French, but not white.

Above all, such an education might help answer the question of whose problem modern racism really is. The current divide is a white construction, and it is up to white people to do the necessary work to dismantle the system borne from the slave trade, instead of ignoring it or telling people of color to “get over” its extant legacy. Critics of white studies have claimed that this kind of inquiry leads only to self-hatred and guilt. Leaving aside that avoiding self-reflection out of fear of bad feelings is the direct enemy of personal and intellectual growth, I agree that such an outcome should be resisted, because guilt is an unproductive emotion, and merely feeling guilty is satisfying enough for some. My hope in writing this is that white Americans will discover how it is they came to be set apart from non-whites and decide what they plan to do about it.

What do you think?

Map: U.S. Life Expectancy By State

Although the average American now lives an impressive 30 years longer than a century ago (about 79.8 years), the U.S. still remains middle-of-the-road by global standards despite its great wealth; we typically rank in the mid-thirties, usually on the same level as Cuba, Chile, or Costa Rica. Furthermore, life expectancy varies wildly from state to state, as the following map from The Atlantic clearly shows:

Life expectancy by state compared to closest matching country.

There’s profound variation by state, from a low of 75 years in Mississippi to a high of 81.3 in Hawaii. Mostly, we resemble tiny, equatorial hamlets like Kuwait and Barbados. At our worst, we look more like Malaysia or Oman, and at our best, like the United Kingdom. No state approaches the life expectancies of most European countries or some Asian ones. Icelandic people can expect to live a long 83.3 years, and that’s nothing compared to the Japanese, who live well beyond 84.

Life expectancy can be causal, a factor of diet, environment, medical care, and education. But it can also be recursive: People who are chronically sick are less likely to become wealthy, and thus less likely to live in affluent areas and have access to the great doctors and Whole-Foods kale that would have helped them live longer.

It’s worth noting that the life expectancy for certain groups within the U.S. can be much higher—or lower—than the norm. The life expectancy for African Americans is, on average, 3.8 years shorter than that of whites. Detroit has a life expectancy of just 77.6 years, but that city’s Asian Americans can expect to live 89.3 years.

But overall, the map reflects what we’d expect: People in southern states, which generally have lower incomes and higher obesity rates, tend to die sooner, and healthier, richer states tend to foster longevity.

It’s also worth adding that, overall, the U.S. is far less healthy and long-lived than it should be, even when you adjust for wealth, race, and other factors (e.g., young Americans are less healthy than young people in other developed countries, and rich Americans are typically less healthy than their wealthy counterparts abroad).

Inequality Isn’t Just a Moral Problem, But a Practical One

Whenever the topic of socioeconomic inequality is brought up, the emphasis is usually placed upon its moral and humanitarian consequences. While this is certainly a valid approach — after all, there’s a lot of human suffering involved — there is another factor to consider: inequality is bad for business, the economy, and the nation’s long-term political stability (one would hope that this would also win over those for whom poverty isn’t a moral problem, but many such individuals typically either don’t recognize inequality as a growing problem, or reject that it has such consequences).

An article by Daniel Altman of Foreign Policy lays out the argument fairly succinctly, noting that the practical costs of inequality could actually be just as bad as, if not worse than, the social ones:

When any market has a shortage, not everyone gets the things they want. But who does get them also matters, because it’s not always the people who value those things the most.

Economists Edward Glaeser and Erzo Luttmer made this point in a 2003 paper about rent control. “The standard analysis of price controls assumes that goods are efficiently allocated, even when there are shortages,” they wrote. “But if shortages mean that goods are randomly allocated across the consumers that want them, the welfare costs from misallocation may be greater than the undersupply costs.” In other words, letting the wrong people buy the scarce goods can be even worse for society than the scarcity itself.

This problem, which economists call inefficient allocation, is present in the market for opportunities as well. It’s best for the economy when the person best able to exploit an opportunity is the one who gets it. By giving opportunities to these people, we make the economic pie as big as possible. But sometimes, the allocation of opportunity is not determined solely by effort or ability.

For example, consider all the would-be innovators, thinkers, and other contributors to society who are precluded from realizing their potential, and from benefiting the rest of us, by a lack of resources. Conversely, what happens when unqualified or immoral people are allowed to amass a disproportionate amount of wealth and resources, and thus gain all the outsized political and economic influence that goes with it? As the article goes on to note, unless you can meet the ever-higher financial demands of accessing the avenues of influence and personal development, you won’t get far in America:

To a great degree, access to opportunity in the United States depends on wealth. Discrimination based on race, religion, gender, and sexual orientation may be on the wane in many countries, but discrimination based on wealth is still a powerful force. It opens doors, especially for people who may not boast the strongest talents or work ethic.

Country club memberships, charity dinners, and other platforms for economic networking come with high price tags decided by existing elites. Their exclusion of a whole swath of society because of something other than human potential automatically creates scope for inefficient allocation. But it’s not always people who do the discriminating; sometimes it’s just the system.

For instance, consider elected office. It’s a tremendous opportunity, both for the implementation of a person’s ideas and, sad to say, for financial enrichment as well. Yet running for office takes money — lots of it — and there are no restrictions on how much a candidate may spend. As a result, the people who win have tended to be very wealthy.

Of course, political life isn’t the only economic opportunity with a limited number of spots. In the United States, places at top universities are so scant that many accept fewer than 10 percent of applicants. Even with need-blind admissions, kids from wealthy backgrounds have huge advantages; they apply having received better schooling, tutoring if they needed it, enrichment through travel, and good nutrition and healthcare throughout their youth.

The fact that money affects access to these opportunities, even in part, implies some seats in Congress and Ivy League lecture halls would have been used more productively by poorer people of greater gifts. These two cases are particularly important, because they imply that fighting poverty alone is not enough to correct inefficient allocations. With a limited number of places at stake, what matters is relative wealth, or who can outspend whom. And when inequality rises, this gap grows.

I would hope it goes without saying that it’s problematic when a majority of society’s policymakers, public officials, academics, corporate executives, and other influential classes come from the same small (and narrowing) economic class. Diversity of experience and background is valuable in informing how society should be run. If all kinds of groups are locked out of the avenues of power because they don’t fit some arbitrary requirement (in this case, money and the connections it brings), then it bodes ill for our ability to solve pressing problems.

So here comes the tricky and controversial part: how do we solve this problem?

If you believe that poor people are poor because they are stupid or lazy — and that their children probably will be as well — then the issue of inefficient allocation disappears. But if you think that a smart and hardworking child could be born into a poor household, then inefficient allocation is a serious problem. Solving it would enhance economic growth and boost the value of American assets.

There are two options. The first is to remove wealth from every process that doles out economic opportunities: take money out of politics, give all children equal schooling and college prep, base country club admissions on anonymous interviews, etc. This will be difficult, especially since our society is moving rapidly in the other direction. Election campaigns are flooded with more money than ever, and the net price of a college education — after loans and grants — has jumped even as increases in list prices have finally slowed. Poor kids who do make it to college will have to spend more time scrubbing toilets and dinner trays and less time studying.

The other option is to reduce inequality of wealth. Giving poor children a leg up through early childhood education or other interventions might help, but it would also take decades. So would deepening the progressivity of the income tax, which only affects flows of new wealth and not existing stocks. In the meantime, a huge amount of economic activity might be lost to inefficient allocation.

The easiest way to redistribute wealth continues to be the estate tax, yet it is politically unpopular and applies to only about 10,000 households a year. All of this might change, however, as more research estimates the harm caused by inequality through the inefficient allocation of opportunities.

This kind of research is not always straightforward, since it measures things that didn’t happen as well as those that did. Nevertheless, some economists have already shown how value can be destroyed through inheritance and cronyism among the wealthy. Scaled up to the entire economy, the numbers are on the order of billions of dollars.

These costs are not unique to the United States. Even as globalization has reduced inequality between countries, it has often increased inequality within them; the rich are better able to capitalize on its opportunities. Where nepotism and privilege are prevalent, the costs are amplified.

Needless to say, this is a complicated issue whose resolution will take a lot more than a few changes to regulatory and tax policy. What do you think of the problem, or of the solutions posited?