Geneva, capital of the world, was crowded to capacity today when representatives of nearly half a hundred nations from every corner of the globe gathered to attend the first meeting of the assembly of the League of Nations.
One hundred years ago this week, the first session of the assembly of the newly established League of Nations was held in the Reformation Hall in Geneva. The meeting brought together representatives of 42 countries representing more than half of the world’s population at the time.
Though the League of Nations is better known for its abject failure to prevent World War II—which led to its replacement by the United Nations in 1945—it is difficult to overstate its bold and audacious vision: For the first time in our bloody and divided history, there was a sense of cooperation and community among our fractured civilizations. The League set in motion the growing global consciousness and interconnectedness we see to this day (however tenuously). It also brought attention to issues long overlooked or dismissed by most societies: poverty, slavery, refugees, epidemics, and more. It thus laid the groundwork for organizations that aid tens of millions of people worldwide.
Ironically, despite the League’s failure to stop the bloodiest war in history, its successor, the UN, has been credited with preventing any large interstate conflicts to this day—in part because it inherited the League’s idea of a forum where countries can duke it out at the table rather than on the battlefield (to paraphrase Eisenhower). We’ve got a hell of a long way to go, but we have to start somewhere, and this 100-year experiment with internationalism and pan-humanism is a blip next to thousands of years of constant war and repression.
It is odd that Americans are so reluctant, if not outright hostile, when it comes to looking abroad for ideas about how to do things, such as education, voting methods, and healthcare. The principles and ideas that underpinned this nation’s founding did not emerge from nowhere: They were inspired by, or even directly drawn from, Enlightenment thinkers from across Europe; certain elements of British law and government (ironically), such as the Magna Carta and the English Bill of Rights; and of course the Greeks and Romans, from whom we borrowed specific methods, institutions, terminology, and even architecture. (The U.S. Senate is explicitly inspired by the original Roman Senate, with senatus being Latin for “council of elders.”)
Americans make up less than five percent of humanity. The U.S. is one of nearly 200 countries. Its history as a nation, let alone as a superpower, is a relative blink in time; as a point of reference, the Roman-Persian wars lasted over 600 years, nearly three times America’s lifespan. Conversely, many countries are much younger, including most of the world’s democracies, providing fresher or bolder perspectives on certain issues not addressed or contemplated by our more conservative system.
Given all that, it stands to reason that someone, somewhere out there, has done something that we have not thought of or figured out, something worth studying or implementing. It is statistically unlikely that we alone have everything figured out, given our narrow slice of time, people, and experience. The fact that so many innovators, inventors, and other contributors to this country have come from all over the world proves the U.S. has always tacitly accepted the idea that the rest of the world has something to offer.
In fact, this would be in accordance with the vision of most of the nation’s founders, who were far from nationalistic. Their debates, speeches, and correspondence reveal them to have been fairly worldly folks who were open to foreign ideas and perspectives and sought to integrate the country into the international system. From Jefferson’s cherished copy of the Koran, to Franklin’s open Francophilia, to Madison’s insistence that we respect global public opinion and norms, the record shows that the supposed dichotomy between patriotism and internationalism is a false one, at odds with their own vision of service to the nation.
It is all the more ironic because one of the few schools of philosophy to originate in the United States was pragmatism, which emerged in the 1870s and postulated, among other things, that people should promote ideas based on their practical effect and benefit (i.e., regardless of their national or foreign origin). It should not matter where our solutions to certain problems come from; what matters is that they are solutions, and thus beneficial to our community, in the first place.
America’s presidential system, along with its winner-take-all elections and Electoral College, tends to lead to gridlock and polarization. These mechanisms and institutions were devised before political parties were a thing—or at least as rigid as they are now—and thus never seriously took them into account. Hence, we are stuck with two big parties that are far from representative of the complex spectrum of policies and ideologies.
Rather than proportional representation, members of Congress are elected in single-member districts according to the “first-past-the-post” (FPTP) principle, meaning that the candidate with a plurality of votes—i.e., not necessarily a majority—wins the congressional seat. The losing party or parties, and by extension their voters, get no representation at all. This tendency of plurality systems to produce a small number of major parties is known in political science as Duverger’s Law.
With the Electoral College, a similar dynamic is at play: In nearly every state, a presidential candidate needs only a plurality of the popular vote, not even a majority, to win all of the state’s electors. Some states have considered allocating electors proportionally, but so far only Maine and Nebraska have moved away from winner-take-all, awarding some of their electors by congressional district.
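To make the contrast concrete, here is a minimal sketch in Python comparing winner-take-all allocation with a simple largest-remainder proportional allocation. The candidate names, vote shares, and elector count are entirely made up for illustration; real systems (including Maine’s and Nebraska’s district method) differ in the details.

```python
# Hypothetical comparison of winner-take-all vs. proportional allocation.
# All candidates and numbers below are invented for illustration only.

def winner_take_all(vote_shares, electors):
    """FPTP / Electoral College style: the plurality winner
    (not necessarily a majority winner) takes every elector."""
    winner = max(vote_shares, key=vote_shares.get)
    return {c: (electors if c == winner else 0) for c in vote_shares}

def proportional(vote_shares, electors):
    """Largest-remainder method: electors roughly track vote share."""
    total = sum(vote_shares.values())
    quotas = {c: v * electors / total for c, v in vote_shares.items()}
    alloc = {c: int(q) for c, q in quotas.items()}   # whole-number floors
    leftover = electors - sum(alloc.values())
    # Hand out the remaining seats to the largest fractional remainders.
    for c in sorted(quotas, key=lambda c: quotas[c] - alloc[c], reverse=True)[:leftover]:
        alloc[c] += 1
    return alloc

shares = {"A": 42, "B": 40, "C": 18}   # three-way race, no majority
print(winner_take_all(shares, 10))     # {'A': 10, 'B': 0, 'C': 0}
print(proportional(shares, 10))        # {'A': 4, 'B': 4, 'C': 2}
```

With 42 percent of the vote, candidate A takes every elector under winner-take-all, while B’s 40 percent and C’s 18 percent yield nothing; the proportional split shows how much representation the same ballots would buy under a different rule.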
This is why you see so many seemingly contradictory interests lumped into one or the other party. In other systems, you may have a party centered on labor rights, another on the environment, yet another for “conventional” left-wing or right-wing platforms, etc. The fragmentation might be messy, but it also forces parties to either appeal to a larger group of voters (so they can have a majority) or form coalitions with other parties to shore up their legislative votes (which gives a voice to smaller parties and their supporters).
Note that this is a huge oversimplification, as literally whole books have been written about all the reasons we are stuck with a two-party system most do not like. And of course, a parliament would not fix all our political problems, which go as deep as our culture and society.
But I personally think we may be better off with a parliamentary-style multiparty system—uncoincidentally the most common in the world, especially among established democracies—than what we have now.
As I see folks share that they voted, I’m reminded of the idea of mandatory voting, in which all eligible citizens are required to vote unless they have a valid excuse.
In ancient Athens, it was seen as the duty of every eligible citizen to participate in politics; while there was no explicit requirement, you could be subject to public criticism or even a fine.
Today, only a few countries require citizens to vote, most of them in Latin America; but of this already small number, only a handful actually enforce it with penalties.
Moreover, just five of the world’s 35 established democracies have compulsory voting: Australia, Luxembourg, Uruguay, Costa Rica, and Belgium (which has the oldest existing compulsory voting system, dating back to 1893). In Belgium, registered voters must present themselves at their polling station, and while they don’t have to cast a vote, those who fail to at least show up without proper justification can face prosecution and a moderate fine. (To make it easier, elections are always held on Sundays.) Those who fail to vote in at least four elections can lose the right to vote for 10 years and might face difficulties getting a job in government (though in practice fines are no longer issued).
The argument for compulsory voting is that democratic elections are the responsibility of citizens—akin to jury duty or paying taxes—rather than merely a right. The idea is that making voting obligatory means all citizens share responsibility for the government they choose; in a sense, it makes the government more legitimate, since it represents the vast majority of people.
The counterargument is that no one should be forced to take part in a process they don’t believe in or otherwise don’t want to be a part of; in other words, not voting is itself a form of expression. Unsurprisingly, this view is prevalent in the U.S., where many believe compulsory voting violates freedom of speech, because the freedom to speak necessarily includes the freedom not to speak. Critics also note that many citizens will vote solely because they have to, knowing little about the issues or candidates. Some might deliberately spoil their ballots to slow the polling process and disrupt the election, or vote for frivolous or jokey candidates. This is prevalent in Brazil, the largest democracy with mandatory voting, where people have grown increasingly cynical about politics, elect joke candidates, or still choose not to vote despite the penalty.
Some have argued that compulsory elections help prevent polarization and extremism, since politicians have to appeal to a broader base (i.e. the entire electorate). It does not pay to energize your base to the exclusion of all other voters, since elections cannot be determined by turnout alone. This is allegedly one reason Australian politics are relatively more balanced, with strong social policies but also a strong conservative movement.
Finally, there is the claim that making people vote might also make them more interested in politics. It’s been shown that while lots of folks resent jury duty for example, once they’re in the jury, they typically take the process seriously. Similarly, they may hate mandatory voting in theory but in practice will find themselves trying to make the best of it.
Today is UN Day, which commemorates the 75th birthday of the United Nations, a deeply flawed and troubled organization that is nonetheless more indispensable than ever—and has accomplished a lot more than most people think.
It was on this day 75 years ago, just months after the end of humanity’s bloodiest war, that the UN Charter came into force after being ratified by fifty countries. The Charter established the organization along with the framework of the international system. An audacious and idealistic document, it articulated a commitment to uphold the human rights and wellbeing of all citizens, addressing “economic, social, health, and related problems,” and “universal respect for, and observance of, human rights and fundamental freedoms for all without distinction as to race, sex, language, or religion”. The organization now counts nearly four times as many members, at 193.
Dwight D. Eisenhower, far from a bleeding-heart globalist, once said that the UN “represents man’s best organized hope to substitute the conference table for the battlefield”.
If nothing else, the organization has served as an outlet for frustrations and rivalries that would otherwise manifest on the battlefield. The constant grandstanding between the U.S. and Russia may be frustrating—and has often led to devastating deadlock during crises—but imagine the alternative without an international platform. Many countries on the verge of open conflict have opted instead to take diplomatic shots at each other at the UN—an often sordid display, to be sure, but obviously better than the alternative.
Of course, we Americans know full well how hard it is to get even our one country to work together—imagine close to 200 countries spanning eight billion people and a multitude of languages, religions, cultures, types of governments, and levels of development. The UN is only as effective as its members allow it to be, and its failures and limitations are a reflection of our own as a species.
Moreover, it is worth considering the context of its emergence: a war that had killed over 60 million people (three percent of all humans at the time), following millennia of endless conflict in which violence was the norm and enslavement, rape, looting, and other things we now call war crimes (courtesy of the UN) were just the way of things. For most of our quarter of a million years of existence, we rarely knew about, much less cared about, anyone outside our immediate tribe or band. Human rights and civil liberties were alien concepts that would not have made sense to anyone. The vast majority of people lived in grinding poverty, oppression, fear, and ignorance.
From the ashes of the worst conflict in history emerged an organization trying to cultivate peace, progress, and unity among our species—not just out of idealism, but also based on the sober realism that some problems are too big for any one nation to handle. Needless to say, it has failed in its lofty aspirations time and again, as most of us know all too well—but that’s to be expected given just how bold of an undertaking it is. And for all the failures, there are plenty of successes we take for granted.
Given that most Americans do not even know how their own government works, it stands to reason that few know the workings and complexities of the international system, either.
Few people know that it was the UN Secretary-General, U Thant of Burma, who played a key role in the Cuban Missile Crisis; JFK admitted that the entire world was in the UN leader’s debt, though Thant is scarcely known today.
Many of us take modern amenities and benefits for granted, let alone realize that many of them originated with the UN. The ability to mail and ship things globally, to access goods and products from around the world, and to travel anywhere with relative ease are all due to UN organizations, treaties, or conferences that established uniform standards and rules for airlines, companies, and governments. Even seatbelts became widespread through deliberate UN policy.
Few know the work of UNICEF, one of the oldest UN organizations, which in 2018 alone helped care for 27 million babies born in places with high infant and maternal mortality; treated four million children in 73 countries for severe acute malnutrition; and provided over 65 million children with vaccines against common killers like diphtheria, tetanus, and pertussis (half the world’s children get their vaccines through UNICEF). Over the last thirty years, it has saved over 90 million children.
The much maligned WHO helped eradicate smallpox, which once killed millions annually throughout history, and is on the verge of eradicating polio as well. It has helped most people with HIV/AIDS get access to treatment, and is currently working on making insulin more available, too. With respect to the recent pandemic, it also used its diplomacy to get China to finally open itself to an international team of scientists—which included two Americans. It recently helped stem the second largest Ebola outbreak in Congo, to little fanfare.
A 1987 conference convened by the UN Environment Programme helped lead to an international treaty that has successfully repaired the ozone layer.
The World Food Programme, along with the Food and Agriculture Organization, provides food and assistance to 90 million people in 88 countries, keeping them from the brink of starvation (and earning a well-deserved Nobel Peace Prize for it). The FAO also eradicated rinderpest, a deadly livestock disease that is only the second infectious disease in history (after smallpox) to be eradicated. It also maintains the world’s largest and most comprehensive statistical database on food and agriculture.
The UN Population Fund helps an average of two million women a month with their pregnancies, which in much of the world can still be deadly.
The UN regularly monitors elections in about fifty countries, which not only ensures a free and fair political process but has prevented numerous civil wars and conflicts.
All these achievements do not undo the very real and tragic failings of the organization, from the genocides in Rwanda and Bosnia, to the Syrian and Yemeni civil wars. But 75 years is not a long time to undo over 200,000 years of tribalism and disunity. As one UN chief put it, “the United Nations was not created to bring us to heaven, but in order to save us from hell”.
Considering that the average American pays less than two dollars a year to cover the U.S.’ regular dues to the UN, I think it is a bargain worth supporting and improving upon.
Like most aspiring parents, I think a lot about how I will raise my children. Obviously, I am not alone in these concerns, since raising another human being is one of the most consequential things one can do.
That is why parenting advice is a dime a dozen, and why there has been so much interest and discussion around parenting styles from Asia or France. People everywhere share the same understandable need to learn the best way to shape their children in ways that will help them flourish.
One approach that has received far less attention is Mayan parenting, which challenges many of the assumptions that underpin parenting across the world. NPR has a great piece about it, and I recommend reading the whole thing. Here are some choice excerpts highlighting the life and philosophies of a Mayan mom:
Burgos is constantly on parental duty. She often tosses off little warnings about safety: “Watch out for the fire” or “Don’t play around the construction area.” But her tone is calm. Her body is relaxed. There’s no sense of urgency or anxiety.
In return, the children offer minimal resistance to their mother’s advice. There’s little whining, little crying and basically no yelling or bickering.
In general, Burgos makes the whole parenting thing look — dare I say it — easy. So I ask her: “Do you think that being a mom is stressful?”
Burgos looks at me as if I’m from Mars. “Stressful? What do you mean by stressful?” she responds through a Mayan interpreter.
A five-minute conversation ensues between Burgos and the interpreter, trying to convey the idea of “stressful.” There doesn’t seem to be a straight-up Mayan term, at least not pertaining to motherhood.
But finally, after much debate, the translator seems to have found a way to explain what I mean, and Burgos answers.
“There are times that I worry about my children, like when my son was 12 and only wanted to be with his friends and not study,” Burgos says. “I was worried about his future.” But once she guided him back on track, the worry went away.
In general, she shows no sense of chronic worry or stress.
“I know that raising kids is slow,” she says. “Little by little they will learn.”
I would love to channel that delicate balance of stoicism and paternalism, somewhere between “helicopter” and “free-range” parenting.
As it turns out, the Mayan approach reflects a fundamentally different paradigm to parenting. Whereas most Western cultures frame parenting as a matter of control—be it less or more, or over some things but not others—the Maya do not even have a word for control as it relates to children.
“We think of obedience from a control angle. Somebody is in charge and the other one is doing what they are told because they have to,” says Barbara Rogoff, a psychologist at the University of California, Santa Cruz, who has studied the Maya culture for 30 years.
And if you pay attention to the way parents interact with children in our society, the idea is blazingly obvious. We tend to boss them around. “Put your shoes on!” or “Eat your sandwich!”
“People think either the adult is in control or the child is in control,” Rogoff says.
But what if there is another way to interact with kids that removes control from the equation, almost altogether?
That’s exactly what the Maya — and several other indigenous cultures — do. Instead of trying to control children, Rogoff says, parents aim to collaborate with them.
“It’s kids and adults together accomplishing a common goal,” Rogoff says. “It’s not letting the kids do whatever they want. It’s a matter of children — and parents — being willing to be guided.”
In the Maya culture, even the littlest of children are treated with this respect. “It’s collaborative from the get-go.”
No doubt this collaborative and egalitarian approach would be alien to most American parents (among others I’m sure). So would the Mayan idea of what is called “alloparenting”:
Human children didn’t evolve in a nuclear family. Instead, for hundreds of thousands of years, kids have been brought up with a slew of people — grandparents, aunts, uncles, siblings, the neighbors, Lancy writes. It’s not that you need a whole village, as the saying goes, but rather an extended family — which could include biological relatives but also neighbors, close friends or paid help.
Throughout human history, motherhood has been seen as a set of tasks that can be accomplished by many types of people, like relatives and neighbors, the historian John R. Gillis writes in The World Of Their Own Making. Anthropologists call them “alloparents” — “allo” simply means “other.”
Across the globe, cultures consider alloparents key to raising children, Lancy writes.
The Maya moms value and embrace alloparents. Their homes are porous structures and all sorts of “allomoms” flow in and out. When a woman has a baby, other mothers work together to make sure she can take a break each day to take a shower and eat meals, without having to hold the baby. (How civilized is that!)
In one household with four kids that I visited, the aunt dropped off food, the grandma stopped by to help with a neighbor’s baby and, all the while, the oldest daughter looked after the toddler — while the mom fed the livestock and started to make lunch. But in Western culture, over the past few centuries, we have pushed alloparents to the periphery of the parenting landscape, Gillis writes. They aren’t as valued and sometimes even denigrated as a means for working mothers to outsource parenting duties.
It is a stark contrast to the stereotypical—and still widespread—notion of the “mom in a box”: A mother stuck at home with the kids, responsible for virtually every domestic task in addition to nearly all parental duties. Leaning on dads, relatives, or close friends is more common—if only by necessity—but is still treated as a last resort or otherwise unusual.
Aside from Labor Day in the U.S., today is the first International Day of Clean Air for blue skies, which was established by the United Nations General Assembly to bring awareness to the largest environmental risk to public health globally: air pollution.
Over 90% of our world is exposed to polluted air, which causes an estimated seven million premature deaths every year (more than cigarette smoking) and leaves millions more with chronic health problems like asthma and cognitive decline.
Fortunately, the world has a precedent for successful action: Over 30 years ago this month, the UN-sponsored Montreal Protocol saw literally every country commit to working together to eliminate CFCs, which were causing severe depletion of the ozone layer; it remains one of the few treaties with universal agreement. It took only 14 years between the discovery of the problem and the world committing to resolve it—and we’ve already seen the results.
A few years ago, it was confirmed that the ozone layer is slowly recovering, and most projections show it fully healing within the next four decades. In an era of rising conflict and poor global leadership, this unlikely and little known success story of international cooperation is a glimmer of hope.
Americans have created this false dichotomy between patriotism and “globalism”, as if caring about international law, global public opinion, and the ideas of other nations is somehow intrinsically “un-American”. This would have been absurd to the Founding Fathers, who by today’s standards would be labeled globalist elites.
None other than James Madison, the Father of the Constitution, insisted that “no nation was so enlightened that it could ignore the impartial judgments of other nations and still expect to govern itself wisely and effectively”. In Federalist 63, he stressed the importance of respecting the consensus views of other countries and even believed that global public opinion could help keep ourselves in check:
An attention to the judgment of other nations is important to every government for two reasons: The one is, that independently of the merits of any particular plan or measure, it is desirable on various accounts, that it should appear to other nations as the offspring of a wise and honorable policy: The second is, that in doubtful cases, particularly where the national councils may be warped by some strong passion, or momentary interest, the presumed or known opinion of the impartial world, may be the best guide that can be followed.
Madison even adds that America would flourish if it considered the judgments and views of the world:
What has not America lost by her want of character with foreign nations? And how many errors and follies would she not have avoided, if the justice and propriety of her measures had in every instance been previously tried by the light in which they would probably appear to the unbiassed part of mankind?
Madison was far from alone in this view. Most of the founders, including Alexander Hamilton, John Adams, John Jay, and Thomas Jefferson, shared this sentiment, which is reflected in the U.S. Constitution. The Supremacy Clause states that international treaties are the supreme law of the land, even superseding conflicting domestic laws. The little-known Offences Clause commits Congress to safeguarding the “law of nations”, which we now call international law. The Supreme Court has consistently upheld America’s commitments to international law; in one of its first cases, Ware v. Hylton, it ruled that the U.S. was bound by the terms of its peace treaty with Britain—even if it meant striking down a patriotic but conflicting state law. Many other cases—Missouri v. Holland, U.S. v. Curtiss-Wright, and The Paquete Habana, among others—followed suit.
Back in Madison’s day, most nations were monarchies in some form. Yet even then Americans saw the merit in garnering their respect or learning from them. Now that we have a more diverse community of nations—including dozens of democracies and allies—we have even more reason to take seriously our commitments to the world and our openness to its ideas.
By my count, there have only been three countries (possibly four) that claimed to be founded on ideas—rather than a particular religion, culture, or ethnicity—and which believed these ideas were objective, universal, and needed to be spread across the world.
The first and most obvious is probably the United States, for reasons most of us know.
Coming shortly afterward was France, which in some ways took things even further—mostly because it was going up against a thousand years of entrenched monarchical traditions, in a continent full of hostile monarchies. For example, to this day, the French constitution forbids the government from collecting data on race, religion, or national origin to preserve the idea that all people are equal in their status as citizens (and that citizenship is not contingent on such things).
Finally, there was the Soviet Union, which tried to forge an entirely new nonethnic identity (Soviet) based around an entirely new idea (communism), upon a society that had previously been deeply religious, multiethnic, and largely feudal. Soviet ideologues even devised the idea of the “New Soviet Person”—someone defined by traits and virtues that transcended nationality, language, etc. We all know how well that turned out.
Of course, all three countries did not live up to their ideals in practice, with the Soviets failing altogether. “True” Americans were (and to many people remain) narrowly idealized as white Anglo-Saxon Protestants, so that even black Protestants or white Catholics were, in different ways, seen as suspect. Both France and the Soviet Union gave greater privileges to white French and Russian speakers, respectively.
But these are still the only countries that had at least the pretense of being universalist and idealist in their national identity (at least to my mind).
(Switzerland comes close, uniting four different ethnic and linguistic groups, and several religious sects, on the basis of a shared alpine identity and a commitment to constitutional federalism. But it never developed anything close to the manifest destiny of the U.S., the French Republic, and Soviet Russia.)
In between bar exam prep, I just finished an interesting report on China’s leadership strategy that may explain the country’s massive and rapid economic and political rise (aside from sheer ruthlessness and all that).
Under Deng Xiaoping’s rule in the early 1980s, the Chinese Communist Party (CCP) began to recruit new members from different social and occupational backgrounds into leadership positions, hoping to adapt to the changing environment by recruiting fresh talent and thereby obtaining new legitimacy. During the past decade, China has in fact been ruled by technocrats—who are mainly engineers-turned-politicians. The three “big bosses” in the so-called third generation leadership—Jiang Zemin, Li Peng, and Zhu Rongji—and three heavyweights in the fourth generation—President Hu Jintao, Premier Wen Jiabao, and Vice President Zeng Qinghong—are all engineers by training. Among the seven members of the 15th Politburo’s Standing Committee, China’s supreme decision-making body, six were engineers and one was an architect.
This pattern continued throughout the State Council and the ministerial and provincial governments. Even more remarkably, all nine men on the current Politburo’s Standing Committee are engineers by training. The elite transformation that has taken place in China in the post-Mao era is part of a wider and more fundamental political change—a move from revolution to reform in Chinese society. Turning away from the emphasis on class struggle and ideological indoctrination that characterized the previous decades, Deng and his technocratic protégés have stressed fast-paced economic and technological development at home and increased economic integration with the outside world.
The short version: China has made a deliberate effort to appoint scientists and engineers at all levels of government—especially at the subnational level—and to diversify the experience and expertise of government officials beyond the lawyers (and to a lesser degree businesspeople) that dominate in many other countries.
To be clear, this is not itself indicative of the government’s integrity or efficiency. Corruption and human rights abuses remain rife in China, with the latter especially worsening in recent years under Xi Jinping (who studied chemical engineering). The report notes a growing rift within both the political leadership and broader society between those who went to elite schools and everyone else. (Interesting how universal that issue is.)
While the report does not draw this conclusion outright, I do think it is worth pondering to what degree China’s rise is owed to its relatively high reliance on scientists, engineers, and other non-lawyers. Do they provide a certain degree of pragmatism and problem-solving skills different from the typical legal and business oriented political class? Do they help inform policy through their diverse and unique perspectives (assuming the repressive state system does not dampen them)?
Whatever the case may be, I for one think the U.S. (among other places) could use more diversity of profession, background, experience, and the like when it comes to law- and policy-making. It’s especially imperative in a democracy, where politicians should ostensibly represent their constituents—but in most cases could not be further removed from those they claim to represent, whether experientially, socioeconomically, or even by age. (More on that whole other topic in a future post.)