What someone is paid has little or no relationship to what their work is worth to society.
Does anyone seriously believe hedge-fund mogul Steven A. Cohen is worth the $2.3 billion he raked in last year, despite being slapped with a $1.8 billion fine after his firm pleaded guilty to insider trading?
On the other hand, what’s the worth to society of social workers who put in long and difficult hours dealing with patients suffering from mental illness or substance abuse? Probably higher than their average pay of $18.14 an hour, which translates into less than $38,000 a year.
How much does society gain from personal-care aides who assist the elderly, convalescents, and persons with disabilities? Likely more than their average pay of $9.67 an hour, or just over $20,000 a year.
What’s the social worth of hospital orderlies who feed, bathe, dress, and move patients, and empty their bedpans? Surely higher than their median wage of $11.63 an hour, or $24,190 a year.
Yet what would the rest of us do without these dedicated people?
Or consider kindergarten teachers, who make an average of $53,590 a year.
Before you conclude that’s generous, consider that a good kindergarten teacher is almost worth his or her weight in gold.
One study found that children with outstanding kindergarten teachers are more likely to go to college and less likely to become single parents than a random set of children similar to them in every way other than being assigned a superb teacher.
And what of writers, actors, painters, and poets? Only a tiny fraction ever become rich and famous. Most barely make enough to live on (many don’t, and are forced to take paying jobs to pursue their art). But society is surely all the richer for their efforts.
At the other extreme are hedge-fund and private-equity managers, investment bankers, corporate lawyers, management consultants, high-frequency traders, and top Washington lobbyists.
They’re getting paid vast sums for their labors. Yet it seems doubtful that society is really that much better off because of what they do.
I don’t mean to sound unduly harsh, but I’ve never heard of a hedge-fund manager whose job entails attending to basic human needs (unless you consider having more money a basic human need) or enriching our culture (except through the myriad novels, exposés, and movies made about greedy hedge-fund managers and investment bankers).
They don’t even build the economy.
Most financiers, corporate lawyers, lobbyists, and management consultants are competing with other financiers, lawyers, lobbyists, and management consultants in zero-sum games that take money out of one set of pockets and put it into another.
They’re paid gigantic amounts because winning these games can generate far bigger sums, while losing them can be extremely costly.
It’s said that by moving money to where it can make more money, these games make the economy more efficient.
In fact, the games amount to a mammoth waste of societal resources.
They demand ever more cunning innovations but they create no social value. High-frequency traders who win by a thousandth of a second can reap a fortune, but society as a whole is no better off.
Meanwhile, the games consume the energies of loads of talented people who might otherwise be making real contributions to society — if not by tending to human needs or enriching our culture then by curing diseases or devising new technological breakthroughs, or helping solve some of our most intractable social problems.
Graduates of Ivy League universities are more likely to enter finance and consulting than any other career.
For example, in 2010 (the most recent date for which we have data) close to 36 percent of Princeton graduates went into finance (down from the pre-financial crisis high of 46 percent in 2006). Add in management consulting, and it was close to 60 percent.
The hefty endowments of such elite institutions are swollen with tax-subsidized donations from wealthy alumni, many of whom are seeking to guarantee their own kids’ admissions so they too can become enormously rich financiers and management consultants.
But I can think of a better way for taxpayers to subsidize occupations with more social merit: Forgive the student debts of graduates who choose social work, child care, elder care, nursing, and teaching.
In 2007, humanity reached a major, though largely overlooked, milestone: for the first time in history, over half of all humans lived in cities. Only a century before, a mere 15 percent of the world’s population lived in urban areas. The United Nations estimates that by 2050, around 64 percent of the developing world, and 85 percent of the developed world, will be urbanized.
Needless to say, the world’s future lies in its cities, which are increasingly the main drivers of everything from economic growth to cultural development. The science of cities has never been more vital: not only must we create urban areas that better promote human flourishing, but we must also take into account the impact on the environment, which is in an increasingly fragile state.
The Atlantic’s CityLab column reports on findings from researchers in the growing field of “urban theory,” who over the years have gleaned the following key observations and approaches:
Cities generate economic growth through networks of proximity, casual encounters and “economic spillovers.” The phenomenal creativity and prosperity of cities like New York is now understood as a dynamic interaction between web-like networks of individuals who exchange knowledge and information about creative ideas and opportunities. Many of these interactions are casual, and occur in networks of public and semi-public spaces—the urban web of sidewalks, plazas, and cafes. More formal and electronic connections supplement, but do not replace, this primary network of spatial exchange.
Through a similar dynamic, cities generate a remarkably large “green dividend.” It has long been known that cities have dramatically lower energy and resource consumption as well as greenhouse gas emissions per capita, relative to other kinds of settlements. Only some of this efficiency can be explained by more efficient transportation. It now appears that a similar network dynamic provides a synergistic effect for resource use and emissions—what have been called “resource spillovers.” Research is continuing in this promising field.
Cities perform best economically and environmentally when they feature pervasive human-scale connectivity. Like any network, cities benefit geometrically from their number of functional interconnections. To the extent that some urban populations are excluded or isolated, a city will under-perform economically and environmentally. Similarly, to the extent that the city’s urban fabric is fragmented, car-dependent, or otherwise restrictive of casual encounters and spillovers, that city will under-perform — or require an unsustainable injection of resources to compensate. As Jane Jacobs observed, seemingly lowly encounters on sidewalks and in other public spaces are the “small change” by which the wealth of a city grows.
Cities perform best when they adapt to human psychological dynamics and patterns of activity. Urban residents have a basic need to make sense of their environments, and to find meaning and value in them. But this issue is not as straightforward as it may appear. Research in environmental psychology, public health and other fields suggests that some common attributes promote the capacity to meet these human requirements—among them green vegetation, layering, and coherent grouping. Wayfinding and identity are also promoted by iconic structures, and meaning is enriched by art. But for most people most of the time, evolutionary psychology is a more immediate factor to be accommodated. As Jacobs cautioned, a city is not primarily a work of art. That way of thinking is bad for cities—and probably bad for art too.
Cities perform best when they offer some control of spatial structure to residents. We all need varying degrees of public and private space, and we need to control those variations at different times of the day, and over the span of our lives. In the shortest time frames, we can open or close windows and doors, draw blinds, come out onto porches and informally colonize public spaces, or retreat inside the privacy of our homes. Over longer time frames, we can remodel our spaces, open businesses, build buildings, and make other alterations that gradually form the complex dynamic growth of cities.
Interesting stuff. What do you think?
Which cities are the best places to live? The Economist Intelligence Unit (EIU) has set out to answer this question with its livability survey, which assesses 140 cities based on such factors as overall stability (25% of total score), health care (20%), education (10%), infrastructure (20%), and culture and environment (25%) — the sorts of things most people agree are fundamental to individual and collective quality of life.
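Since those category weights sum to 100 percent, a city’s overall score is just a weighted average of its category scores. Here is a minimal sketch of that calculation; the category scores below are invented for illustration, not any city’s actual EIU figures:

```python
# Hypothetical illustration of the EIU's weighted-average scoring.
# Weights are taken from the survey description above; the category
# scores are made up for demonstration purposes.
weights = {
    "stability": 0.25,
    "health_care": 0.20,
    "education": 0.10,
    "infrastructure": 0.20,
    "culture_environment": 0.25,
}

scores = {  # a hypothetical city, each category rated out of 100
    "stability": 95.0,
    "health_care": 100.0,
    "education": 100.0,
    "infrastructure": 100.0,
    "culture_environment": 95.0,
}

# Overall score = sum of (weight x category score)
overall = sum(weights[k] * scores[k] for k in weights)
print(round(overall, 1))  # 97.5
```

Note that a city can fall short of perfection in the heavily weighted categories and still post a very high total, which is why the leaders cluster so tightly near the top.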
Here are the results for 2014, courtesy of Mic.com:
For the fourth year in a row, Melbourne took the top spot with a total score of 97.5 out of 100. The impressive score can be partially attributed to its perfect scores in the health care, infrastructure, and education categories. Several of Melbourne’s fellow Australian cities filled out much of the top 10, along with a handful from the Great White North. Combined, Australia and Canada claimed seven of the top ten cities.
The remaining three cities were Vienna, Austria (2nd place), Helsinki, Finland (8th), and Auckland, New Zealand (10th).
As the article notes, while these top ten performed well in all the indicators measured, health care had a particularly strong impact:
A common factor of these livable cities was a high score in the health care category. The top nine spots all garnered scores of 100 in that category. To determine health care, the EIU looked at the availability and quality of private health care, availability and quality of public health care, availability of over-the-counter drugs, and general health care indicators.
Canada, Australia, and New Zealand offer a variety of very livable cities, thanks in large part to their strong health care, education, culture, and environment, which afford these countries general stability. Plus, as English-speaking countries, they’re especially attractive destinations for any Americans considering a move.
Not only does being healthy have the obvious benefit of improving an individual’s mood, comfort, and longevity — all vital to life satisfaction — but in the aggregate, it improves entire communities. Healthy individuals are likelier to be economically and socially productive, helping businesses and societies at large. They place less of a burden on expensive emergency services and have more disposable income on hand, since pooling the costs of health care through socialized insurance is less costly than spending heavily per person on expensive treatments.
But this study also highlights that there is more to quality of life than the bare necessities. Each of these cities offers an abundance of recreational and leisure options — well-kept green spaces, cultural centers, community events and facilities — that enliven individual lives and cultivate a sense of shared community. Good infrastructure provides access to these areas and events while helping to create more cohesion and interaction between various neighborhoods and enclaves. It is also telling that all the top cities are medium-sized, which suggests that being too big could present challenges to accommodating residents optimally.
All of this should be pretty obvious. But unfortunately, not enough municipal governments in the world, including in the U.S., have the vision and/or finances to make it happen, and too many city residents are apathetic, disenfranchised, or lack the community spirit to come together. Sub-national and national governments could be doing more to help local communities as well, especially as most countries, and the world at large, are either highly urbanized or rapidly becoming so. As cities begin to house more of the world’s population, and become the main drivers of economic, social, and cultural life, we need to work on making them as ideal for the human condition as possible. We have much to learn from the likes of Melbourne, Vancouver, and other successful polities.
To add insult to the injury of a stagnating economy, a report by economists Dan Hamermesh and Elena Stancanelli found that Americans are not only working longer than before (partly because they are making less per hour), but are increasingly more likely to toil outside of work hours, particularly at night and on weekends. As The Atlantic reported:
They found that on a typical weeknight, a quarter of American workers did some kind of work between 10 p.m. and 6 a.m. That’s a lot, compared with about seven percent in France and the Netherlands. The U.K. is closest to the U.S. on this measure, where 19 percent work during night hours. On the weekends, one in three workers in the U.S. were on the job, compared to one in five in France, Germany, and the Netherlands.
All of this adds up: According to the OECD, the U.S. leads the way in average annual work hours at 1,790—200 more hours than France, the Netherlands, and Denmark. That works out to about 35 hours a week, but a recent Gallup poll found the average to be much higher than that—at 47 hours weekly. And perhaps that’s not surprising, when 55 percent of college grads report that they get their sense of identity from their work.
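The figures in that excerpt are easy to sanity-check. The weekly reconciliation below, and the suggestion that the OECD and Gallup numbers count different populations, are my own back-of-the-envelope reasoning, not claims from the article:

```python
# Back-of-the-envelope check of the work-hours figures quoted above.
annual_hours_us = 1790          # OECD average annual hours, U.S.
annual_hours_eu = 1790 - 200    # France/Netherlands/Denmark, per the article

weekly_us = annual_hours_us / 52
print(round(weekly_us, 1))  # 34.4 -- i.e., "about 35 hours a week"

# The Gallup self-reported figure of 47 hours/week would instead imply:
implied_annual = 47 * 52
print(implied_annual)  # 2444 hours, far above the OECD average

# The gap suggests (my speculation) that the OECD average dilutes
# full-time workers with part-timers and vacation weeks, while the
# Gallup poll captures what full-time workers say they actually work.
```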
As usual, technology serves as a double-edged sword: in many respects, it has made work a lot easier, not to mention enabled all sorts of new leisure activities (video games, Netflix, game apps, etc.). But technology also makes work more accessible from home or even while on vacation, making it harder for employees to ignore emails, calls, and assignments — and easier for employers to expect, if not demand, such extra labor.
The consequences of such a work-centered culture are dire: strained social life, reduced sleep, frayed romantic and sexual activity, increased stress and, with all that, worsening mental and physical health. With the boundaries between work and leisure increasingly blurring, will jobs come to dominate our lives in the same way they once did during the early Industrial Era (when child labor, 12-hour workdays, and other such practices were the norm)?
If that is the case, then the solution is more or less the same now as it was then: more solidarity and activism among workers in all the relevant spheres — economic, public, and political. There is no sense in making people work more for less, especially when employers themselves stand to lose a lot in terms of reduced productivity, morale, and health among their employees.
We also need a serious assessment of how our business culture — and culture at large — operates counter-productively for human flourishing. It is becoming accepted practice, once again, for companies to squeeze more and more out of their beleaguered workers while offering little to nothing in recompense (on the contrary, the trend is toward ever-meager benefits, raises, and upward mobility).
More distressingly, it seems that far too many Americans consider this arrangement to be, at the very least, tolerable, if not acceptable. Ours is a work-obsessed culture that celebrates sacrificing leisure and even health for the sake of being productive at some task, even if it is for a company we hate and for benefits that do not make up for it. I could devote a whole other blog post to why the U.S. seems especially enthusiastic about toiling at our own expense, but for now I ask that we at least question what we value in terms of quality of life; separating work from leisure is the very least we can do to that end.
Although not a new idea, the concept of a guaranteed basic income — also known as a guaranteed minimum income or universal basic income — seems to be gaining a lot more traction lately. Amid concerns about rising poverty and inequality, as well as greater scrutiny of the failings and inefficiencies of current welfare programs, the allure of a more streamlined and equitable income for all seems obvious; hence thinkers and activists across the political spectrum — from Martin Luther King, Jr. to Milton Friedman — have advocated one form of it or another.
If you would like a great breakdown on what this idea entails and how it would be implemented, check out this article on Vox.com. It does a pretty good job of introducing the subject in a balanced and holistic way, including analyzing the various arguments for and against a basic income by conservatives, liberals, and libertarians. What do you think?
In 1988, The Economist compiled a ranking of 50 countries according to which would be the best place to be born (or put another way, which would be the best to settle and start a family). This was determined on the basis of 11 weighted sociopolitical and economic criteria, ranging from the quantifiable (such as GDP growth) to the subjective (cultural richness). The results can be seen below.
The United States tops the list, followed by France, West Germany, Italy, Canada, and Japan. The Soviet Union managed a respectable 21st place, with communist Poland and Hungary not that far behind. The Philippines, India, and Mexico also ranked relatively higher than one would expect from developing countries. Saudi Arabia, Nigeria, Iran, Iraq, and Zimbabwe rank the lowest.
Anyway, in 2013, The Economist revisited this “where-to-be-born” index, which basically measures overall quality-of-life both presently and in the foreseeable future. As before, there are 11 indicators involved, including the results of life-satisfaction surveys, public trust, crime, and even geography (environment and natural beauty can go a long way towards leisure and comfort).
So, 25 years later, here are the world’s best places to be born:
The United States is now in 16th place along with former third-place winner Germany; France falls to 26th, Italy to 21st, Japan to 25th, and Canada to a still-respectable 9th. In their place are mostly small, northern European countries, as well as Australia, Singapore, and New Zealand. Notice how Saudi Arabia and Iran have improved, while poor Nigeria remains among the bottom five (indeed, it is dead last, although Iraq and Zimbabwe, whose fortunes have each only gotten worse over the years, were not measured this time).
Granted, a direct comparison between these two charts can’t say much, since The Economist measured far more countries, and claims to have been much more rigorous in its metrics, the second time around. Moreover, the inclusion of several subjective factors leaves much in dispute; for example, even people in otherwise prosperous places (e.g., the French) often report low life satisfaction regardless. Needless to say, individuals will weigh certain factors differently depending on their personal or cultural preferences: environment may not matter as much to some as, say, public trust, and vice versa.
Of course, any effort to determine the best place to live or start a family is going to be debatable. It touches on a macro version of the question of what makes for a good life. Clearly, freedom from violence, starvation, poverty, and the like is nearly universally agreed upon. But what do you think of these results? Where would you consider to be the best place in the world to live?
Much has been made of the rise of Asia and the subsequent arrival of an “Asian Century,” whereby the continent will become the dominant economic, cultural, and political force in the 21st-century world. Setting aside the sheer diversity of this massive landmass — in terms of both culture and fortune — most Asian nations still face tremendous challenges, namely in the area of poverty reduction. Consider the following chart:
As The Economist goes on to note:
Asia’s rapid economic growth has put it on track to eradicate “extreme” poverty, defined by the World Bank as daily consumption of less than $1.25 per person, by 2030. However, the Asian Development Bank reckons this is too low given that nowadays, things like mobile phones are seen as necessities; so it has calculated a more suitable daily minimum of $1.51.
This lifts Asia’s 2010 poverty rate to nearly one-third of the population, adding 343m people to the ranks of the poor. The ADB believes food insecurity, and the risks of natural disasters, global economic shocks and the like, should also be taken into account when measuring poverty. This would further raise Asia’s 2010 poverty rate, to nearly 50 percent.
As with so many other parts of the world, Asia holds tremendous promise but faces daunting challenges. If the continent grows richer and more powerful while millions are left behind in squalor, it may be wracked by the same strife and instability that have historically bedeviled most unequal societies.
Throughout the recession and subsequent recovery, one of the few job sectors that has remained largely unaffected, if not growing, has been food service. From eateries to fast-food chains, this broad industry has gained an impressive 30 percent in employment since 1990, and now accounts for nearly one out of ten private-sector jobs in the U.S.
Unfortunately, a recent report by the Economic Policy Institute exposes some very disquieting things about one of America’s fastest-growing employers. Here are some of the highlights courtesy of Mother Jones:
The industry’s wages have stagnated at an extremely low level. Restaurant workers’ median wage stands at $10 per hour, tips included—and hasn’t budged, in inflation-adjusted terms, since 2000. For nonrestaurant US workers, the median hourly wage is $18. That means the median restaurant worker makes 44 percent less than other workers. Benefits are also rare—just 14.4 percent of restaurant workers have employer-sponsored health insurance and 8.4 percent have pensions, vs. 48.7 percent and 41.8 percent, respectively, for other workers.
Unionization rates are minuscule. Presumably, it would be more difficult to keep wages throttled at such a low level if restaurant workers could bargain collectively. But just 1.8 percent of restaurant workers belong to unions, about one-seventh of the rate for nonrestaurant workers. Restaurant workers who do belong to unions are much more likely to have benefits than their nonunion peers.
As a result, the people who prepare and serve you food are pretty likely to live in poverty. The overall poverty rate stands at 6.3 percent. For restaurant workers, the rate is 16.7 percent. For families, researchers often look at twice the poverty threshold as a proxy for what it takes to make ends meet, EPI reports. More than 40 percent of restaurant workers live below twice the poverty line—that’s double the rate of non-restaurant workers.
Opportunity for advancement is pretty limited. I was surprised to learn that for every single occupation within restaurants—from dishwashers to chefs to managers—the median hourly wage is much less than the national average of $18. The highest paid occupation is manager, with a median hourly wage of $15.42. The lowest is “cashiers and counter attendants” (median wage: $8.23), while the most prevalent of restaurant workers, waiters and waitresses, who make up nearly a quarter of the industry’s workforce, make a median wage of just $10.15. The one that has gained the most glory in recent years, “chefs and head cooks,” offers a median wage of just $12.34.
Industry occupations are highly skewed along gender and race lines. Higher-paid occupations are more likely to be held by men—chefs, cooks, and managers, for example, are 86 percent, 73 percent, and 53 percent male, respectively. Lower-paid positions tend to be dominated by women: for example, host and hostess (84.9 percent female), cashiers and counter attendants (75.1 percent), and waiters and waitresses (70.8 percent). I took up this topic in a piece on the vexed gender politics of culinary prestige last year. Meanwhile, “blacks are disproportionately likely to be cashiers/counter attendants, the lowest-paid occupation in the industry,” while “Hispanics are disproportionately likely to be dishwashers, dining room attendants, or cooks, also relatively low-paid occupations,” the report found.
Restaurants lean heavily on the most disempowered workers of all—undocumented immigrants. Overall, 15.7 percent of US restaurant workers are undocumented, nearly twice the rate for non-restaurant sectors. Fully a third of dishwashers, nearly 30 percent of non-chef cooks, and more than a quarter of bussers are undocumented, the report found. So a huge swath of the people who feed you pay payroll taxes and sales taxes yet don’t receive the rights of citizenship.
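The wage comparison quoted in the first of those points checks out arithmetically; the gap is measured relative to the non-restaurant median:

```python
# Verifying the cited wage gap: a $10 median hourly wage for
# restaurant workers vs. an $18 median for other U.S. workers.
restaurant_median = 10.0   # dollars/hour, tips included
other_median = 18.0        # dollars/hour, non-restaurant workers

# Gap expressed as a share of the non-restaurant median
gap = (other_median - restaurant_median) / other_median
print(round(gap * 100))  # 44 -- restaurant workers earn ~44% less
```

Note the baseline matters: the same $8 difference is 44 percent of the $18 non-restaurant median, but would read as an 80 percent premium if framed the other way around.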
All of this reflects a rather disturbing overall trend in the U.S. economy: the loss of stable, well-paying jobs to less secure, low-wage ones. Not only has job growth not kept pace with the needs of the labor force, but the relatively few options that remain share largely the same characteristics: meager pay, little to no benefits, no paid sick leave, poor upward mobility, and so on. And since this has become standard across the industry — barring only a few exceptions — most companies have little incentive to offer anything better to their workers. In essence, it is a race to the bottom, one that desperate workers of all ages have no choice but to join.
Needless to say, this is not a sustainable model for prosperity. Not only do individual employees suffer, but so do their families and communities (the poorest of which often have few options beyond food service and equally low-paying retail). The national economy as a whole cannot thrive when such a large chunk of its consumer base is too poor to afford goods and services, or too unhealthy and demoralized to work at optimal productivity. These highly profitable employers have as much of an interest in investing more in their labor force as the workers themselves.
For its part, the EPI report suggests legislative solutions, including a higher minimum wage, mandated paid sick leave, and a path to legal status for undocumented workers. I would add unionization, or some sort of labor collective, as a big step too. Meanwhile, MoJo recommends that those wishing to learn more about working conditions in America’s food industry read the 2013 book Behind the Kitchen Door by Saru Jayaraman.
As fast-food, retail, and other service work continues to take the place of increasingly obsolete but better-paying positions, we need to start adjusting the way we value such labor; otherwise, unpleasant, beggaring jobs will be the new normal, and that cannot last.
I know reports like these are a dime a dozen, especially in post-recession America, but it bears repeating, if only because a fair number of Americans still seem to think that our system is vastly superior to any existing or hypothetical alternative — even though the social and economic costs are vast and growing.
Let us start with this chart, courtesy of io9, which comes from a new report by the Commonwealth Fund, a private U.S.-based foundation that promotes a more efficient health care system. It compares the results of an extensive survey of patients and physicians across ten developed countries, looking at several relevant metrics.
Notice that by all measures, the United States is either middle-of-the-road or dead last, despite spending the most per person by far — $8,508, compared with runner-up Norway at $5,669 (which, incidentally, also does not perform all that well). Canada, while comparatively more efficient at nearly half the cost, does not perform all that impressively either.
By contrast, the highest ranked country on average, the United Kingdom, spends just $3,405 per person on health care. Taken as a whole, it appears that per capita spending has little bearing on the overall quality and effectiveness of the healthcare system (something that has been noted in similar international studies). Another chart from the report confirms this:
Despite such astronomical spending, in both proportional and absolute terms, the report sums up America’s performance thusly: “[the country] fails to achieve better health outcomes than the other countries, and as shown in the earlier editions, the U.S. is last or near last on dimensions of access, efficiency, and equity.”
The culprit for such inefficiency? The very fact that many Americans lack access to reliable health care (including those who are technically insured).
Not surprisingly—given the absence of universal coverage—people in the U.S. go without needed health care because of cost more often than people do in the other countries. Americans were the most likely to say they had access problems related to cost. Patients in the U.S. have rapid access to specialized health care services; however, they are less likely to report rapid access to primary care than people in leading countries in the study. In other countries, like Canada, patients have little to no financial burden, but experience wait times for such specialized services. There is a frequent misperception that trade-offs between universal coverage and timely access to specialized services are inevitable; however, the Netherlands, U.K., and Germany provide universal coverage with low out-of-pocket costs while maintaining quick access to specialty services.
However, as io9 notes, the study’s conclusion points to more than just broadening access:
The authors believe that the problems inherent in the U.S. healthcare system are so pervasive that it will take more than better access and equity to solve them. According to Karen Davis, lead author of the study, overall improvement “is a matter of accountability, having information on your performance relative to your peers and being held accountable to achieving a kind of care that patients should expect to get.”
But it’s not an intractable problem. The U.K.’s excellent result can be attributed to a number of reforms, including the hiring of more specialists, allocating bonuses to family physicians who meet quality targets, and adopting health system information that allows physicians to easily share information about their patients. Moreover, every citizen (apparently) has a doctor.
If there is any silver lining, it is that the U.S. is moving in the right direction, if ever so slowly. In addition to the flawed but still impactful Affordable Care Act:
The U.S. has significantly accelerated the adoption of health information technology following the enactment of the American Recovery and Reinvestment Act, and is beginning to close the gap with other countries that have led on adoption of health information technology. Significant incentives now encourage U.S. providers to utilize integrated medical records and information systems that are accessible to providers and patients. Those efforts will likely help clinicians deliver more effective and efficient care.
Indeed, all of this attention to the inefficiency of our health care system is leading to changes in both the political and private spheres. However, it will take a lot more than this piecemeal and hodgepodge approach to rectify what is very clearly a failing system. The solutions, while often difficult to implement, are clear, and both the necessary capital and public will are available. When will that be enough to spur necessary change?
When it comes to wealth and income inequality — a subject I have discussed at length here — the news is rarely positive. As the following graph makes succinctly clear, the issue has worsened dramatically over the last few decades.
While the most recent data in these sorts of graphs are around seven years old, newer evidence suggests the problem is still prevalent, if not worsening — at least in the United States.
According to an interesting new paper on global income distribution by economists Branko Milanovic and Christoph Lakner, the global picture regarding income inequality is far more nuanced, if not positive. As NPR reports, the study found that globalization — the same mechanism that plays a large, though hardly solitary, role in rising inequality within countries — has had the opposite effect globally, broadly speaking.
Essentially, they look at inequality on a global scale, accounting for the world’s population as a whole rather than breaking it down country by country (as is usually the case). So what happens if you look at the change in income over the past few decades for everyone on Earth? Here’s what the graph of the data shows:
So what does this mean? Basically, people in the middle of the global income distribution — mostly concentrated in China and India, as well as a few other developing Asian countries — have had the biggest gains in income by percentage. In fact, the average American, like most others in the developed world, would fall at the far right of this graph, at the top of the global income distribution.
So in a global context, the typical developed-world individual is still capturing the lion’s share of income growth. Assuming this is truly the case (I await more research and scrutiny to be certain one way or the other), that does not make inequality any less worrisome, now and especially in the long term. Worldwide, we still find far too much wealth concentrated at the top amid austere policies, insufficient investment in the public good, and the persistent absolute poverty of hundreds of millions of people.
An increasingly transient global elite is still capturing an outsized share of the gains — as made depressingly clear by the revelation that 85 individuals hold more wealth than 3.5 billion people. Too many countries are mired in the same old problems despite ever-growing wealth generation that never seems to be reflected in higher wages, incomes, or public investments. Even if some people in this arrangement have it worse than others, the fact that many have it worse than they should, given the wealth available, is a problem for most individual countries and for the world at large.
Those are just my brief thoughts. What are your opinions?