Engellau, Volvo’s president and an engineer himself, helped push for a more effective seatbelt after a relative died in a traffic accident, due partly to the flaws of the two-point belt design—which was not even a standard feature in cars at the time. This personal tragedy drove Engellau to hire Bohlin to find a better solution quickly.
There were two major problems with the historic two-point belt design, which crosses the lap only. First, because the human pelvis is hinged, a single strap fails to restrain the torso, leaving passengers vulnerable to severe head, chest and spinal injuries; positioned poorly, the belt can even crush internal organs on impact. Second, they were notoriously uncomfortable, so many people chose not to wear them. Bohlin’s innovation was to find a design that resolved both problems at once.
After millions of dollars and thousands of tests through the 1950s and 1960s, Volvo became the first carmaker in the world to standardize the three-point safety belt we now take for granted. More than that, Volvo pushed hard for the seatbelt to be adopted in its native Sweden, which, like most places, was initially resistant to mandatory seatbelt use.
But Volvo didn’t stop there. While it patented the designs to protect its investment from copycats, the company neither charged rivals significant license fees nor kept the design to itself to give its own cars an edge. Knowing that lives were at stake worldwide, Engellau made Bohlin’s patent immediately available to all. Having sponsored the costly R&D, Volvo gifted its designs to competitors to encourage mass adoption. It is estimated that Volvo may have forgone $400 million or more in additional profits.
Instead, literally millions of people have been spared injury and death by this now-ubiquitous seatbelt we take for granted. All because a couple of Swedes decided to put people over profits (which isn’t to say they reaped no financial benefit; rather, they proved you can do both).
It is odd that Americans are so reluctant, if not hostile, to the idea of looking abroad for ideas about how to do things, such as education, voting methods, and healthcare. The principles and ideas that underpinned this nation’s founding did not emerge from nowhere: They were inspired by, or even directly drawn from, Enlightenment thinkers from across Europe; certain elements of British law and government (ironically), such as the Magna Carta and English Bill of Rights; and of course the Greeks and Romans, from whom we borrowed specific methods, institutions, terminology, and even architecture. (The U.S. Senate is explicitly inspired by the original Roman Senate, with senatus being Latin for council of elders.)
Americans make up less than five percent of humanity. The U.S. is one of nearly 200 countries. Its history as a nation, let alone as a superpower, is a relative blink in time; as a point of reference, the Roman-Persian wars lasted over 600 years, nearly three times America’s lifespan. Conversely, many countries are much younger, including most of the world’s democracies, providing fresher or bolder perspectives on certain issues not addressed or contemplated by our more conservative system.
Given all that, it stands to reason that someone, somewhere out there, has done something we have not thought of or figured out, something worth studying or implementing. It is statistically unlikely that we alone have figured everything out, given our narrow slice of time, humanity, and experience. The fact that so many innovators, inventors, and other contributors to this country have come from all over the world proves the U.S. has always tacitly accepted the idea that the rest of the world has something to offer.
In fact, this would be in accordance with the vision of most of the nation’s founders, who were far from nationalistic. Their debates, speeches, and correspondence reveal them to have been fairly worldly folks who were open to foreign ideas and perspectives and sought to integrate the country into the international system. From Jefferson’s cherished copy of the Koran, to Franklin’s open Francophilia and Madison’s insistence that we respect global public opinion and norms, the supposed dichotomy between patriotism and internationalism is a false one, at odds with the founders’ own service to the nation.
It is all the more ironic because one of the few schools of philosophy to originate in the United States was pragmatism, which emerged in the 1870s and postulated, among other things, that people promote ideas based on their practical effect and benefit (i.e., regardless of their national or foreign origin). It should not matter where our solutions to certain problems come from; what matters is that they are solutions, and thus beneficial to our community, in the first place.
America’s presidential system, along with its winner-take-all elections and Electoral College, tends to lead to gridlock and polarization. These mechanisms and institutions were devised before political parties were a thing—or at least as rigid as they are now—and thus never seriously took them into account. Hence, we are stuck with two big parties that are far from representative of the complex spectrum of policies and ideologies.
Rather than proportional representation, members of Congress are elected in single-member districts according to the “first-past-the-post” (FPTP) principle, meaning that the candidate with a plurality of votes—i.e., not necessarily a majority—wins the congressional seat. The losing party or parties, and by extension their voters, get no representation at all. This tends to produce a small number of major parties, a tendency known in political science as Duverger’s Law.
With the Electoral College, a similar dynamic is at play: in most states, the presidential candidate with a plurality of the vote wins the entire state and all its electors. Some states are considering allocating their electors proportionally, but so far only Maine and Nebraska split theirs (by congressional district).
This is why you see so many seemingly contradictory interests lumped into one or the other party. In other systems, you may have a party centered on labor rights, another on the environment, yet another for “conventional” left-wing or right-wing platforms, etc. The fragmentation might be messy, but it also forces parties to either appeal to a larger group of voters (so they can have a majority) or form coalitions with other parties to shore up their legislative votes (which gives a voice to smaller parties and their supporters).
Note that this is a huge oversimplification; whole books have been written about all the reasons we are stuck with a two-party system most of us do not like. And of course, a parliament would not fix all our political problems, which run as deep as our culture and society.
But I personally think we may be better off with a parliamentary-style multiparty system—uncoincidentally the most common in the world, especially among established democracies—than what we have now.
As I see folks share that they voted, I’m reminded of the idea of mandatory voting, in which all eligible citizens are required to vote unless they have a valid excuse.
In ancient Athens, it was seen as the duty of every eligible citizen to participate in politics; while there was no explicit requirement, you could be subject to public criticism or even a fine.
Today, only a few countries require citizens to vote, most of them in Latin America; but of this already small number, only a handful actually enforce it with penalties.
Moreover, just five of the world’s 35 established democracies have compulsory voting: Australia, Luxembourg, Uruguay, Costa Rica, and Belgium (which has the oldest existing compulsory voting system, dating back to 1893). In Belgium, registered voters must present themselves at their polling station; they don’t have to cast a vote, but those who fail to at least show up without proper justification can face prosecution and a moderate fine (though in practice fines are no longer issued). To make this easier, elections are always held on Sundays. Those who fail to vote in at least four elections can lose the right to vote for 10 years and might face difficulties getting a job in government.
The argument for compulsory voting is that democratic elections are the responsibility of citizens—akin to jury duty or paying taxes—rather than merely a right. The idea is that making voting obligatory means all citizens share responsibility for the government they choose; in a sense, it makes the government more legitimate, since it represents the vast majority of people.
The counterargument is that no one should be forced to take part in a process they don’t believe in or otherwise don’t want to be a part of; basically, not voting is itself a form of expression. Unsurprisingly, this view is prevalent in the U.S., where many believe compulsory voting violates freedom of speech, because the freedom to speak necessarily includes the freedom not to speak. Relatedly, many citizens would vote solely because they have to, in total ignorance of the issues or candidates. In some cases, they might deliberately spoil their ballots to slow the polling process and disrupt the election, or vote for frivolous or joke candidates. This has been evident in Brazil, the largest democracy with mandatory voting, where people have grown increasingly cynical about politics, elect joke candidates, or choose not to vote despite the penalty.
Some have argued that compulsory elections help prevent polarization and extremism, since politicians have to appeal to a broader base (i.e. the entire electorate). It does not pay to energize your base to the exclusion of all other voters, since elections cannot be determined by turnout alone. This is allegedly one reason Australian politics are relatively more balanced, with strong social policies but also a strong conservative movement.
Finally, there is the claim that making people vote might also make them more interested in politics. It’s been shown that while lots of folks resent jury duty for example, once they’re in the jury, they typically take the process seriously. Similarly, they may hate mandatory voting in theory but in practice will find themselves trying to make the best of it.
Today is UN Day, which commemorates the 75th birthday of the United Nations, a deeply flawed and troubled organization that is nonetheless more indispensable than ever—and has accomplished a lot more than most people think.
It was on this day 75 years ago, just months after the end of humanity’s bloodiest war, that the UN Charter came into force, having been ratified by the five permanent Security Council members and a majority of its other signatories. The Charter established the organization along with the framework of the modern international system. An audacious and idealistic document, it articulated a commitment to uphold the human rights and wellbeing of all people, addressing “economic, social, health, and related problems,” and promoting “universal respect for, and observance of, human rights and fundamental freedoms for all without distinction as to race, sex, language, or religion”. The organization now counts nearly four times as many members, at 193.
Dwight D. Eisenhower, far from a bleeding-heart globalist, once said that the UN “represents man’s best organized hope to substitute the conference table for the battlefield”.
If nothing else, the organization has served as an outlet for frustrations and rivalries that would otherwise play out on the battlefield. The constant grandstanding between the U.S. and Russia may be frustrating—and has often led to devastating deadlock during crises—but imagine the alternative without an international platform. Many countries on the verge of open conflict have opted instead to take diplomatic shots at each other at the UN—an often sordid display, to be sure, but obviously better than war.
Of course, we Americans know full well how hard it is to get even our one country to work together—now imagine close to 200 countries spanning nearly eight billion people and a multitude of languages, religions, cultures, forms of government, and levels of development. The UN is only as effective as its members allow it to be, and its failures and limitations are a reflection of our own as a species.
Moreover, it is worth considering the context of its emergence: a war that had killed over 60 million people (three percent of all humans at the time), following millennia of endless conflict in which violence was the norm and enslavement, rape, looting, and other things we now call war crimes (courtesy of the UN) were just the way of things. For most of our quarter-million years of existence, we rarely knew about, much less cared for, anyone outside our immediate tribe or band. Human rights and civil liberties were alien concepts that would not have made sense to anyone. The vast majority of people lived in grinding poverty, oppression, fear, and ignorance.
From the ashes of the worst conflict in history emerged an organization trying to cultivate peace, progress, and unity among our species—not just out of idealism, but out of the sober realism that some problems are too big for any one nation to handle. Needless to say, it has failed in its lofty aspirations time and again, as most of us know all too well—but that’s to be expected given just how bold an undertaking it is. And for all the failures, there are plenty of successes we take for granted.
Given that most Americans do not even know how their own government works, it stands to reason that few know the workings and complexities of the international system, either.
Few people know that it was the UN Secretary-General, U Thant of Burma, who played a key role in the Cuban Missile Crisis; JFK admitted that the entire world was in the UN leader’s debt, though Thant is scarcely known today.
Many of us take modern amenities and benefits for granted, without realizing their origins in the UN. The ability to mail and ship things globally, to access goods and products from around the world, and to travel almost anywhere with relative ease is due to UN organizations, treaties, and conferences that established uniform standards and rules for airlines, companies, and governments. Even seatbelts became widespread partly through deliberate UN policy.
Few know the work of UNICEF, one of the oldest UN organizations, which in 2018 alone helped care for 27 million babies born in places with high infant and maternal mortality; treated four million children in 73 countries for severe acute malnutrition; and provided over 65 million children with vaccines against common killers like diphtheria, tetanus, and pertussis (half the world’s children get their vaccines through UNICEF). Over the last thirty years, it has saved over 90 million children.
The much-maligned WHO helped eradicate smallpox, which once killed millions annually, and is on the verge of eradicating polio as well. It has helped most people with HIV/AIDS get access to treatment and is currently working to make insulin more available, too. With respect to the recent pandemic, it also used its diplomacy to get China to finally open itself to an international team of scientists—which included two Americans. And it recently helped stem the second-largest Ebola outbreak on record, in Congo, to little fanfare.
A 1987 conference convened by the UN Environment Programme helped lead to an international treaty that has successfully repaired the ozone layer.
The World Food Programme, along with the Food and Agriculture Organization, provides food and assistance to 90 million people in 88 countries, keeping them from the brink of starvation (and earning a well-deserved Nobel Peace Prize for it). FAO also eradicated rinderpest, a deadly livestock disease and only the second infectious disease in history (after smallpox) to be eradicated. It also maintains the world’s largest and most comprehensive statistical database on food and agriculture.
The UN Population Fund helps an average of two million women a month through pregnancies that, in much of the world, could otherwise prove deadly.
The UN regularly monitors elections in about fifty countries, which not only ensures a free and fair political process but has prevented numerous civil wars and conflicts.
All these achievements do not undo the very real and tragic failings of the organization, from the genocides in Rwanda and Bosnia, to the Syrian and Yemeni civil wars. But 75 years is not a long time to undo over 200,000 years of tribalism and disunity. As one UN chief put it, “the United Nations was not created to bring us to heaven, but in order to save us from hell”.
Considering that the average American pays less than two dollars a year to cover the U.S.’ regular dues to the UN, I think it is a bargain worth supporting and improving upon.
On this day in 1956, the Hungarian Revolution began as a peaceful student demonstration that drew thousands as it marched through central Budapest to the parliament building. It soon erupted into a nearly two-week violent uprising against one of the world’s superpowers, sowing the seeds of that superpower’s demise decades later.
The student marchers, who began calling out on the streets using a van with loudspeakers, sent a delegation into a radio building to try to broadcast their demands to the country. They included the withdrawal of Soviet troops, the reinstatement of democracy, and the end of Stalinist oppression.
Hungary, which had aligned with Nazi Germany in WWII, was “liberated” by the Soviets, only to come under their domination as a de facto puppet state. Amid deteriorating freedoms, state oppression, and a faltering economy, students and workers increasingly agitated for change.
What began as a peaceful demonstration erupted into a full-blown uprising when the delegation that attempted to broadcast its demands was detained by state authorities. Protestors arrived demanding their release, only to be fired upon by the State Security Police (ÁVH in Hungarian). Multiple students died; one was wrapped in a flag and held above the crowd. This marked the start of the next phase of the revolution, as the news spread and disorder and violence erupted throughout the capital.
The revolt spread like wildfire; the government collapsed. Thousands of ordinary Hungarians organized into militias, battling the ÁVH and Soviet troops. Some local leaders and ÁVH members were lynched or captured, while former political prisoners were broken out and armed. Radical workers’ councils wrested control from the ruling Soviet-backed Hungarian Working People’s Party and demanded political change.
The revolution was initially leaderless, but a new government was formed by Imre Nagy, a committed communist who was nonetheless opposed to Soviet control and authoritarianism. He formally disbanded the ÁVH, declared the intention to withdraw from the Warsaw Pact, and pledged to re-establish free elections. By the end of October, fighting had almost stopped, and the days of normality began to return. Some workers continued fighting against both Stalinist elements and the more “liberal” communists they distrusted.
Soviet leaders, initially appearing open to negotiating a withdrawal of Soviet forces, changed their mind and moved to crush the revolution just as it was calming. On November 4, a large Soviet force invaded Budapest and other regions of the country. The Hungarian resistance continued for another week, claiming the lives of over 2,500 Hungarians and 700 Soviet troops. Over 200,000 Hungarians fled as refugees. Mass arrests and denunciations continued for months thereafter; 26,000 people were brought to trial, 22,000 were sentenced and imprisoned, 13,000 interned, and 229 executed (including Nagy and other political leaders of the revolution and anti-Soviet government). Resistance continued for another year, mostly led by independent workers’ councils and unions.
But by January 1957, the new Soviet-installed government had suppressed all public opposition and reasserted Soviet dominion. These Soviet actions, while strengthening control over the rest of the Eastern Bloc, alienated many Western Marxists, who up until that point had at least nominally sympathized with the Soviet Union. Communist and Marxist parties split and/or lost membership across the world.
The Hungarians had led the largest and fiercest opposition against the Soviets in Eastern Europe, and it would remain one of the biggest revolts ever to threaten Soviet control. Though it initially failed, it weakened whatever ideological currency the Soviet Union had abroad. Ironically, by the 1960s, Hungary became “the happiest barracks” in the Eastern Bloc, with relatively more economic and cultural freedom than most Soviet satellites. It quietly pursued reforms to human and civil rights into the 1970s; in fact, its opening of the previously restricted border with democratic Austria in 1989 is credited with hastening the collapse of the Soviet Union—meaning the Hungarians won in the end.
To many observers, especially in the United States, this year’s winner of the Nobel Peace Prize may seem uninspired, if not unfamiliar. It is an organization, rather than a person, and its work is probably not as widely known and appreciated as it should be.
Yet the United Nations World Food Programme (WFP) is no less deserving of the honor (especially since over two dozen entities have won the Peace Prize before, including the United Nations itself). It is the largest humanitarian organization in the world, and the largest one focused on hunger, malnutrition, and food insecurity, providing critical food assistance to nearly 100 million people across 88 countries. Tens of millions would starve without its fleet of 5,600 trucks, 30 ships, and nearly 100 planes delivering more than 15 billion rations, at just 61 cents each. Remarkably, WFP does all its work based entirely on voluntary donations, mostly from governments.
Laudable as all that might be, it’s fair to ask what this work has to do with peace. Two-thirds of WFP’s work is done in conflict zones, where access to food is threatened by instability, violence, and even deliberate war tactics. Amid war and societal collapse, people are likelier to die from starvation, or from opportunistic diseases that strike their malnourished immune systems. Since its experimental launch in 1961, WFP has delivered aid amid some of the most devastating and horrific crises in history, including the Rwandan genocide, the Yugoslav wars, and the 2004 Indian Ocean tsunami. (It became a permanent UN agency in 1965, having proven its worth by mustering substantial aid to earthquake-stricken Iran in 1962, initiating a development mission in Sudan, and launching its first school meals project in Togo.)
As The Economist points out, the focus on hunger is a sensible one: Not only have famine and malnutrition destroyed millions of lives across history, but they remain pressing concerns in the face of the pandemic, climate change, and renewed conflict.
Governments everywhere are desperate to bring an end to the pandemic. But hunger has been growing quietly for years, and 2019 was the hungriest year recorded by the Food Security Information Network, a project of the WFP, the Food and Agriculture Organisation and other NGOs, which since 2015 has been gathering data on how many people worldwide are close to starvation. The rise was largely a consequence of wars in places like South Sudan, Yemen and the Central African Republic. This year, thanks to the covid-19 pandemic, things are likely to be far worse. Rather than war, this year it is the dramatic falls in the incomes of the poorest people that is causing hunger. There is as much food to go around, but the poor can no longer afford to buy it. The number of hungry people might double, reckons the WFP, from 135m in 2019 to 265m at the end of this year.
Unfortunately, despite the increased (and likely to increase) need for its services—more people face hunger than at any time since 2012—the agency’s precarious budget, ever-dependent on the whims of donors, is declining. Again, from the Economist:
Last year the organisation received $8.05bn from its donors, by far the biggest of which is the United States. This year so far it has received only $6.35bn. Many countries, such as Britain, link their aid budgets to GDP figures which have fallen sharply. Britain provided roughly $700m of the WFP’s funding in 2019. This year its aid budget will fall by £2.9bn ($3.8bn). Under Mr Trump America had turned away from funding big multilateral organisations even before the pandemic hit, though the WFP has escaped the fate of the WHO, to which Mr Trump gave notice of America’s withdrawal in July. In Uganda food rations for South Sudanese and Congolese refugees have been cut. In Yemen the WFP has had to reduce rations by half.
I used to comfort myself with the fact that, compared to the vast majority of humans today and throughout history, I have it pretty damn good. Of the 107 billion people who ever lived, all but a relative handful lived short and miserable lives defined by work, disease, ignorance, fear, and repression. Hell, billions died before they even reached the age of five, and billions more before their prime. Fewer still had the chance to self-actualize, to reach personal goals of fulfillment and achievement, or to enjoy basic comforts and conveniences: good food, entertainment, a warm bed, and the like.
It always felt kind of wrong to use others’ senseless suffering to bolster my own sense of purpose and gratitude. But it also isn’t working like it used to, because I realize what it all says about human existence. How the heck can I get solace from knowing that the default experience of most thinking and feeling animals is pointless suffering? And that the only reason I am in a better position is a series of fortunate circumstances, starting with when and where I was born?
It is madness-inducing to imagine that most living things suffer and die without any meaning. Humans across time and place have come up with all sorts of religious and spiritual beliefs and practices to explain and cope, but none of it is as verifiable, salient, and provable as the suffering right in front of us. As far as anyone can truly tell, things just come and go in and out of existence, and there is no real point to it. (I explore a lot of these beliefs and ideas, but none of them ever really stick, even if I can’t rule them out.)
I don’t know, maybe this pandemic and the general state of the world have just weakened my mental resilience. As grateful and comfortable and amazing as my life has been, it is harder to focus on the good given the more widespread and established reality of existence being really awful. I know I’m not the first to think about this, and I know most of the reassurances and counterpoints, I just feel kind of stuck. I welcome any and all perspectives on this.
For my part, all I can do is make the most of this wonderful life that has been granted to me, to embrace and indulge in its wonders and beauties, to add to its kindness and compassion, and, above all, to strive to make it as wonderful for everyone else as possible. It’s not much, but it’s something, and despite these hiccups, it has gotten me this far—for which I am eternally grateful.
Today is World Mental Health Day, first observed in 1992 at the urging of the World Federation for Mental Health and with support from the WHO, to raise awareness about one of the most misunderstood but increasingly pressing issues facing humanity.
Even the concept of mental health is fairly new in human history. What we now call mental illnesses were known, studied, and treated by the ancient Mesopotamians, Egyptians, Greeks, Romans, Chinese, and Indians. Some were called “hysteria” and “melancholy” by the Egyptians, and certain Hindu texts describe symptoms associated with anxiety, depression, and schizophrenia. The Greeks coined the term “psychosis”, meaning “principle of life/animation”, in reference to the condition of the soul.
In virtually every society up until the 18th century, mental health was associated with moral, supernatural, magical and/or religious causes, usually with the victim at fault in some way. The Islamic world came closest to developing something like a mental health institution, with “bimaristans” (hospitals) as early as the ninth century having wards dedicated to the mentally ill. The term “crazy” (from Middle English meaning “cracked”) and insane (from Latin insanus meaning “unhealthy”) came to mean mental disorder in Medieval Europe.
In the mid-19th century, the American doctor William Sweetser coined the term “mental hygiene”, a conceptual precursor to mental health. Advances in medicine, both technological and philosophical, soon established the connection between mental and physical health while discounting the notion that moral or spiritual flaws were to blame. (The Greeks came close to this much earlier; Hippocrates, notably, attributed conditions like epilepsy to physical rather than divine causes.)
But the dark turn this took was the so-called “social hygiene movement”, which saw eugenics, forced sterilization, and harsh experimental treatments as the solutions to mental and physical disabilities or divergences. Though the Nazis were the ultimate manifestation of this odious idea, their propaganda and policies cited much of the Western world, including the U.S., as standing with them in their efforts to “cleanse” populations. (In fact, the term mental health was devised after the Second World War partly to replace the now-poisoned idea of mental “hygiene”.)
While we have come a long way towards realizing the evils and horrors of how we treat mental illness—from ancient times to very recent history—abuses, misunderstandings, and neglect remain worldwide problems.
Hence I also want to take today to thank everyone throughout my life who has been so understanding, supportive, and affirming with respect to my own mental health struggles. I would never have broken through my anxiety- and depression-induced barriers without a loving and compassionate social support structure along the way (to say nothing of my relative socioeconomic privilege, the lack of which unfortunately remains the most common barrier to mental health treatment in the U.S.).
I am certainly luckier than most. Mental illnesses are more common in the U.S. than cancer, diabetes, or heart disease, which are far better known and addressed. Over a quarter of all Americans over the age of 18 meet the criteria for having a mental illness. Youth mental health has become especially dire, with 13% reporting a major depressive episode within the past year, of whom only 28% received treatment. And over 90% of Americans with a substance abuse issue (which is usually tied to mental health) receive no treatment.
Worldwide, one in four humans endures a mental health episode in their lifetime. Depressive disorders are already the fourth-leading cause of the global disease burden and will likely rank second by the end of 2020, behind only ischemic heart disease. According to the World Health Organization (WHO), the global cost of mental illness—in terms of treatment, lost productivity, etc.—was nearly $2.5 trillion in 2010, with a projected increase to over $6 trillion by 2030.
Tragically, most mental health issues can be treated with relative ease: 80% of people with schizophrenia can be free of relapses following one year of treatment with antipsychotic drugs combined with family intervention. Up to 60% of people with depression can recover with a proper combination of antidepressant drugs and psychotherapy. And up to 70% of people with epilepsy can be seizure free with simple, inexpensive anticonvulsants. Even changing one’s diet could have an effect.
But over 40% of countries have no mental health policy, over 30% have no mental health programs, and around 25% have no mental health legislation. Nearly a third of countries allocate less than 1% of their total health budgets to mental health, while another third spend just 1% of their budgets on mental health. (The U.S. spent about 7.6% in 2001.)
Someone could meditate, think positively, or pursue therapy all they want, but if they are rationing insulin to stay alive, cannot find affordable housing, struggle to find a well-paying job, and are otherwise at the mercy of external forces that leave them fundamentally deprived, such treatments—however effective and beneficial in many contexts—can only go so far.
He illustrates this perfectly with the following account:
In the early days of the 21st century, a South African psychiatrist named Derek Summerfield went to Cambodia, at a time when antidepressants were first being introduced there. He began to explain the concept to the doctors he met. They listened patiently and then told him they didn’t need these new antidepressants, because they already had antidepressants that work. He assumed they were talking about some kind of herbal remedy.
He asked them to explain, and they told him about a rice farmer they knew whose left leg was blown off by a landmine. He was fitted with a new limb, but he felt constantly anxious about the future, and was filled with despair. The doctors sat with him, and talked through his troubles. They realised that even with his new artificial limb, his old job—working in the rice paddies—was leaving him constantly stressed and in physical pain, and that was making him want to just stop living. So they had an idea. They believed that if he became a dairy farmer, he could live differently. So they bought him a cow. In the months and years that followed, his life changed. His depression—which had been profound—went away. ‘You see, doctor,’ they told him, the cow was an ‘antidepressant’.
To them, finding an antidepressant didn’t mean finding a way to change your brain chemistry. It meant finding a way to solve the problem that was causing the depression in the first place. We can do the same. Some of these solutions are things we can do as individuals, in our private lives. Some require bigger social shifts, which we can only achieve together, as citizens. But all of them require us to change our understanding of what depression and anxiety really are.
This is radical, but it is not, I discovered, a maverick position. In its official statement for World Health Day in 2017, the United Nations reviewed the best evidence and concluded that ‘the dominant biomedical narrative of depression’ is based on ‘biased and selective use of research outcomes’ that ‘must be abandoned’. We need to move from focusing on ‘chemical imbalances’, they said, to focusing more on ‘power imbalances’.
I can only hope that as mental health becomes less stigmatized—less a matter of superstition, genetic inferiority, or moral and individual failing—we can work towards building fairer and more just societies that promote human flourishing, physically, mentally, and spiritually.
Most of us are familiar with the Müller-Lyer optical illusion above, named after its creator, German psychologist Franz Carl Müller-Lyer.
Like most optical illusions, it is designed to test basic brain and visual functions, helping us learn how and why human perception and cognition work the way they do. Many folks think the second line is longer than the first, even though both are the same length, which purportedly shows that humans are susceptible to certain visual guides like arrows (though explanations for why this happens vary).
But the results do not tell the whole story: while many Westerners fall for this illusion (myself included), a study of 14 indigenous cultures found that none were tricked to the same degree. In fact, some cultures, like the San people of the Kalahari Desert, saw the two lines as being of equal length.
That’s because most studies claiming to reflect universal traits of human psychology and physiology only do so for a small and specific demographic—people from “WEIRD” societies, or Western, Educated, Industrialized, Rich and Democratic—which represent a tiny minority of all humans (about 12 percent).
The “WEIRD” phenomenon was first described in a 2010 paper from the University of British Columbia in Vancouver, which found that 96 percent of studies in economics, psychology, and cognitive science—such as the ones on optical illusions—were performed on people with European backgrounds. A sample of hundreds of studies in leading psychology journals found close to 70 percent of subjects were from the U.S., and of these, 67 percent were undergraduates studying psychology (which further slants studies to reflect one particular age group).
All this means that a randomly selected American undergraduate is 4,000 times likelier to be a subject in a psych study—and thus to stand in for all of human nature—than a random non-Westerner.
Yet when scientists perform some of these experiments in other cultures, the results are very different—not just for optical illusions, but for things as diverse as moral reasoning, notions of fairness, and sexual behavior. Even mental disorders seem to manifest differently across cultures and ethnic groups: one small study found that people with schizophrenia in India and Ghana hear friendlier voices than their counterparts in the U.S., suggesting that culture and environment may play a role. (This may account for why Westerners have a harder time with the Müller-Lyer optical illusion than some indigenous people: most Americans are raised in urban environments where horizontal lines and sharp corners are ubiquitous, which presumably trains us into making optical calibrations that can misfire—calibrations that forager societies like the San do not have to worry about.)
Indeed, people from WEIRD societies like the U.S. appear to be outliers among humans, with the authors of the UBC paper concluding that Westerners “are among the least representative populations one could find for generalizing about humans”.
As a writer for NPR blithely noted, “It was not so much that the emperor of psychology had no clothes. It was more that he was dancing around in Western garb pretending to represent all humanity”.
Fortunately, researchers have wised up to these biases over the past decade, carefully adding qualifiers and caveats such as “in college populations” or “in Western society.” But it’s still easy for journalists, analysts, and casual readers like us to read the findings of these studies and ascribe them to all of humanity. Much of human nature, like humans themselves, is a lot more complicated and multi-variable than studies of WEIRD folks would suggest.