It is odd that Americans are so reluctant to look abroad for ideas about how to do things (education, voting methods, healthcare, and so on), if not outright hostile to the notion. The principles and ideas that underpinned this nation’s founding did not emerge from nowhere: They were inspired by, or even directly drawn from, Enlightenment thinkers from across Europe; certain elements of British law and government (ironically), such as the Magna Carta and the English Bill of Rights; and of course the Greeks and Romans, from whom we borrowed specific methods, institutions, terminology, and even architecture. (The U.S. Senate is explicitly inspired by the original Roman Senate; senatus is Latin for “council of elders”.)
Americans make up less than five percent of humanity. The U.S. is one of nearly 200 countries. Its history as a nation, let alone as a superpower, is a relative blink in time; as a point of reference, the Roman-Persian wars lasted over 600 years, nearly three times America’s lifespan. Conversely, many countries are much younger, including most of the world’s democracies, providing fresher or bolder perspectives on certain issues not addressed or contemplated by our more conservative system.
Given all that, it stands to reason that someone, somewhere out there, has done something that we have not thought of or figured out, something worth studying or implementing. It is statistically unlikely that we alone know everything worth knowing, given our narrow slice of time, people, and experience. The fact that so many of this country’s innovators, inventors, and other contributors have come from all over the world proves the U.S. has always tacitly accepted the idea that the rest of the world has something to offer.
In fact, this would be in accordance with the vision of most of the nation’s founders, who were far from nationalistic. Their debates, speeches, and correspondence reveal them to have been fairly worldly people who were open to foreign ideas and perspectives and who sought to integrate the country into the international system. From Jefferson’s cherished copy of the Koran, to Franklin’s open Francophilia, to Madison’s insistence that we respect global public opinion and norms, the supposed dichotomy between patriotism and internationalism is a false one; embracing foreign ideas has never been at odds with service to the nation.
It is all the more ironic because one of the few schools of philosophy to originate in the United States was pragmatism, which emerged in the 1870s and postulated, among other things, that people should promote ideas based on their practical effect and benefit (i.e., regardless of their national or foreign origin). It should not matter where our solutions to certain problems come from; what matters is that they are solutions, and thus beneficial to our community, in the first place.
America’s presidential system, along with its winner-take-all elections and Electoral College, tends to lead to gridlock and polarization. These mechanisms and institutions were devised before political parties existed (or at least before they became as rigid as they are now) and thus never seriously took them into account. Hence, we are stuck with two big parties that are far from representative of the complex spectrum of policies and ideologies.
Rather than proportional representation, members of Congress are elected in single-member districts according to the “first-past-the-post” (FPTP) principle, meaning the candidate with a plurality of votes (not necessarily a majority) wins the congressional seat. The losing party or parties, and by extension their voters, get no representation at all. The tendency of such systems to produce a small number of major parties is known in political science as Duverger’s Law.
With the Electoral College, a similar dynamic is at play: in nearly every state, the presidential candidate with the most votes (again, a plurality, not necessarily a majority) wins all of that state’s electors. Some states have considered moving away from winner-take-all, but so far only Maine and Nebraska have done so, awarding some of their electors by congressional district.
This is why you see so many seemingly contradictory interests lumped into one or the other party. In other systems, you may have a party centered on labor rights, another on the environment, yet another for “conventional” left-wing or right-wing platforms, etc. The fragmentation might be messy, but it also forces parties to either appeal to a larger group of voters (so they can have a majority) or form coalitions with other parties to shore up their legislative votes (which gives a voice to smaller parties and their supporters).
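To make the contrast concrete, here is a minimal sketch, using hypothetical vote totals rather than real election data, comparing winner-take-all districts with the D’Hondt method, one of the most common proportional allocation formulas in parliamentary systems:

```python
# Hypothetical example: ten identical districts where party A wins a 40%
# plurality in each. Compares seats under winner-take-all (FPTP) with
# D'Hondt proportional allocation.

def dhondt(votes, seats):
    """Allocate `seats` among parties via the D'Hondt highest-averages method."""
    alloc = {party: 0 for party in votes}
    for _ in range(seats):
        # Each round, the seat goes to the party with the highest quotient
        # votes / (seats already won + 1).
        winner = max(votes, key=lambda p: votes[p] / (alloc[p] + 1))
        alloc[winner] += 1
    return alloc

district = {"A": 40, "B": 35, "C": 25}   # vote shares in every district

# FPTP: the plurality winner takes each district's single seat.
fptp_seats = {p: 0 for p in district}
for _ in range(10):
    fptp_seats[max(district, key=district.get)] += 1
print(fptp_seats)                         # {'A': 10, 'B': 0, 'C': 0}

# Proportional: pool the same votes and allocate ten seats by D'Hondt.
totals = {p: v * 10 for p, v in district.items()}
print(dhondt(totals, 10))                 # {'A': 4, 'B': 4, 'C': 2}
```

With identical 40/35/25 splits everywhere, FPTP hands one party every seat, while the proportional formula seats all three parties roughly in line with their vote shares; that is precisely why smaller parties can survive, and coalitions become necessary, under such systems.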
Note that this is a huge oversimplification, as literally whole books have been written about all the reasons we are stuck with a two-party system most do not like. And of course, a parliament would not fix all our political problems, which go as deep as our culture and society.
But I personally think we may be better off with a parliamentary-style multiparty system—uncoincidentally the most common in the world, especially among established democracies—than what we have now.
As I see folks share that they voted, I’m reminded of the idea of mandatory voting, in which all eligible citizens are required to vote unless they have a valid excuse.
In ancient Athens, it was seen as the duty of every eligible citizen to participate in politics; while there was no explicit requirement, you could be subject to public criticism or even a fine.
Today, only a few countries require citizens to vote, most of them in Latin America; but of this already small number, only a handful actually enforce it with penalties.
Moreover, just five of the world’s 35 established democracies have compulsory voting: Australia, Luxembourg, Uruguay, Costa Rica, and Belgium (which has the oldest existing compulsory voting system, dating back to 1893). In Belgium, registered voters must present themselves at their polling station, and while they don’t have to cast a vote, those who fail to at least show up without proper justification can face prosecution and a moderate fine. (To make it easier, elections are always held on Sundays.) If they fail to vote in at least four elections, they can lose the right to vote for 10 years, and might face difficulties getting a job in government (though in practice fines are no longer issued).
The argument for compulsory voting is that democratic elections are a responsibility of citizens (akin to jury duty or paying taxes) rather than merely a right. The idea is that making voting obligatory means all citizens share responsibility for the government they choose; in a sense, it makes the government more legitimate, since it represents the vast majority of people.
The counterargument is that no one should be forced to take part in a process they don’t believe in or otherwise don’t want to be a part of; in other words, not voting is itself a form of expression. Unsurprisingly, this view is prevalent in the U.S., where many believe compulsory voting violates freedom of speech, since the freedom to speak necessarily includes the freedom not to speak. Relatedly, many citizens will vote solely because they have to, in total ignorance of the issues or candidates. Some might deliberately spoil their ballots to slow the polling process and disrupt the election, or vote for frivolous or joke candidates. This is prevalent in Brazil, the largest democracy with mandatory voting, where people have grown increasingly cynical about politics, elect joke candidates, and still choose not to vote despite the penalty.
Some have argued that compulsory elections help prevent polarization and extremism, since politicians have to appeal to a broader base (i.e. the entire electorate). It does not pay to energize your base to the exclusion of all other voters, since elections cannot be determined by turnout alone. This is allegedly one reason Australian politics are relatively more balanced, with strong social policies but also a strong conservative movement.
Finally, there is the claim that making people vote might also make them more interested in politics. It’s been shown that while lots of folks resent jury duty for example, once they’re in the jury, they typically take the process seriously. Similarly, they may hate mandatory voting in theory but in practice will find themselves trying to make the best of it.
Americans have created this false dichotomy between patriotism and “globalism”, as if caring about international law, global public opinion, and the ideas of other nations is somehow intrinsically “un-American”. This would have been absurd to the Founding Fathers, who by today’s standards would be labeled globalist elites.
None other than James Madison, the Father of the Constitution, insisted that “no nation was so enlightened that it could ignore the impartial judgments of other nations and still expect to govern itself wisely and effectively”. In Federalist 63, he stressed the importance of respecting the consensus views of other countries and even believed that global public opinion could help keep us in check:
An attention to the judgment of other nations is important to every government for two reasons: The one is, that independently of the merits of any particular plan or measure, it is desirable on various accounts, that it should appear to other nations as the offspring of a wise and honorable policy: The second is, that in doubtful cases, particularly where the national councils may be warped by some strong passion, or momentary interest, the presumed or known opinion of the impartial world, may be the best guide that can be followed.
Madison even adds that America would flourish if it considered the judgments and views of the world:
What has not America lost by her want of character with foreign nations? And how many errors and follies would she not have avoided, if the justice and propriety of her measures had in every instance been previously tried by the light in which they would probably appear to the unbiassed part of mankind?
Madison was far from alone in this view. Most of the founders, including Alexander Hamilton, John Adams, John Jay, and Thomas Jefferson, shared this sentiment, which is reflected in the U.S. Constitution. The Supremacy Clause makes ratified treaties the supreme law of the land, superseding conflicting state laws. The little-known Offences Clause empowers Congress to punish violations of the “law of nations”, which we now call international law. The Supreme Court has consistently upheld America’s commitments to international law; in one of its first cases, Ware v. Hylton, it ruled that the U.S. was bound by the terms of its peace treaty with Britain—even if it meant striking down a patriotic but conflicting state law. Many other cases—Missouri v. Holland, U.S. v. Curtiss-Wright, and The Paquete Habana, among others—followed suit.
Back in Madison’s day, most nations were monarchies in some form. Yet even then Americans saw the merit in garnering their respect or learning from them. Now that we have a more diverse community of nations—including dozens of democracies and allies—we have even more reason to take seriously our commitments to the world and our openness to its ideas.
By my count, there have only been three countries (possibly four) that claimed to be founded on ideas—rather than a particular religion, culture, or ethnicity—and which believed these ideas were objective, universal, and needed to be spread across the world.
The first and most obvious is probably the United States, for reasons most of us know.
Coming shortly afterward was France, which in some ways took things even further—mostly because it was going up against a thousand years of entrenched monarchical traditions, in a continent full of hostile monarchies. For example, to this day, the French constitution forbids the government from collecting data on race, religion, or national origin to preserve the idea that all people are equal in their status as citizens (and that citizenship is not contingent on such things).
Finally, there was the Soviet Union, which tried to forge an entirely new, nonethnic identity (Soviet) based on an entirely new idea (communism), upon a society that had previously been deeply religious, multiethnic, and largely feudal. Soviet ideologues even devised the idea of the “New Soviet Person”—someone defined by traits and virtues that transcended nationality, language, etc. We all know how well that turned out.
Of course, none of the three countries lived up to their ideals in practice, with the Soviets failing altogether. “True” Americans were (and to many people remain) narrowly idealized as white Anglo-Saxon Protestants, so that even black Protestants or white Catholics were, in different ways, seen as suspect. Both France and the Soviet Union gave greater privileges to white French and Russian speakers, respectively.
But these are still the only countries that had at least the pretense of being universalist and idealist in their national identity (at least to my mind).
(Switzerland comes close, uniting four different ethnic and linguistic groups, and several religious sects, on the basis of a shared alpine identity and a commitment to constitutional federalism. But it never developed anything close to the manifest destiny of the U.S., the French Republic, and Soviet Russia.)
In between bar exam prep, I just finished an interesting report on China’s leadership strategy that may explain the country’s massive and rapid economic and political rise (aside from sheer ruthlessness and all that).
Under Deng Xiaoping’s rule in the early 1980s, the Chinese Communist Party (CCP) began to recruit new members from different social and occupational backgrounds into leadership positions, hoping to adapt to the changing environment by recruiting fresh talent and thereby obtaining new legitimacy. During the past decade, China has in fact been ruled by technocrats—who are mainly engineers-turned-politicians. The three “big bosses” in the so-called third generation leadership—Jiang Zemin, Li Peng, and Zhu Rongji—and three heavyweights in the fourth generation—President Hu Jintao, Premier Wen Jiabao, and Vice President Zeng Qinghong—are all engineers by training. Among the seven members of the 15th Politburo’s Standing Committee, China’s supreme decision-making body, six were engineers and one was an architect.
This pattern continued throughout the State Council and the ministerial and provincial governments. Even more remarkably, all nine men on the current Politburo’s Standing Committee are engineers by training. The elite transformation that has taken place in China in the post-Mao era is part of a wider and more fundamental political change—a move from revolution to reform in Chinese society. Turning away from the emphasis on class struggle and ideological indoctrination that characterized the previous decades, Deng and his technocratic protégés have stressed fast-paced economic and technological development at home and increased economic integration with the outside world.
The short version: China has made a deliberate effort to appoint scientists and engineers at all levels of government—especially at the subnational level—and to diversify the experience and expertise of government officials beyond the lawyers (and to a lesser degree businesspeople) that dominate in many other countries.
To be clear, this is not itself indicative of the government’s integrity or efficiency. Corruption and human rights abuses remain rife in China, with the latter especially worsening in recent years under Xi Jinping (who studied chemical engineering). The report notes a growing rift within both the political leadership and broader society between those who went to elite schools and everyone else. (Interesting how universal that issue is.)
While the report does not draw this conclusion outright, I do think it is worth pondering to what degree China’s rise is owed to its relatively high reliance on scientists, engineers, and other non-lawyers. Do they provide a certain degree of pragmatism and problem-solving skills different from the typical legal and business oriented political class? Do they help inform policy through their diverse and unique perspectives (assuming the repressive state system does not dampen them)?
Whatever the case may be, I for one think the U.S. (among other places) could use more diversity of profession, background, experience, and the like when it comes to law- and policy-making. It’s especially imperative in a democracy where politicians should ostensibly represent their constituents, but in most cases could not be further removed from those they claim to represent: experientially, socioeconomically, and even by age. (More on that whole other topic in a future post.)
Among the grim arsenal of tools used by authoritarians is “disappearing” someone: secretly abducting or imprisoning a person through the government or its allies—say, by having unidentified men drag them into an unmarked vehicle—followed by a refusal to acknowledge the person’s fate or whereabouts. The intent is to place the victim outside the protection of the law and to sow terror and anxiety among the populace as to the fate of their loved ones or fellow citizens.
One of the first references to forced disappearance is in the Declaration of the Rights of Man and of the Citizen, drafted during the French Revolution to protect people from common tools of oppression employed by the monarchy. The French called for any government action against citizens to be public, since secrecy both disguises bad intentions and is clearly meant to strike fear into the populace.
However, the term’s origins and most infamous use come from Argentina’s “Dirty War” (1976–1983), in which the U.S.-backed military junta used both government forces and allied right-wing death squads to hunt down or “disappear” anyone suspected of being leftist, communist, or otherwise opposed to the government. (The Dirty War was part of the larger Operation Condor, a U.S.-backed campaign that supplied training and intelligence to right-wing military dictatorships throughout South America to suppress dissidents.)
Up to 30,000 people disappeared over several years, from suspected guerrilla fighters to students and journalists. Some were even dragged out of classrooms, workplaces, and buses. Most were kept in clandestine detention centers, where they were questioned, tortured, and sometimes killed. Argentina’s de facto dictator announced that such people “are neither dead nor alive, they are desaparecidos (missing)”—which is arguably more chilling, as intended.
It was later revealed that many captives met their end in so-called “death flights”, in which they were heavily drugged, loaded onto aircraft, and tossed into the Atlantic Ocean so as to leave no trace of their death. Without any dead bodies, the government could easily deny any knowledge of their whereabouts and any accusations that they had been killed.
Unfortunately for the junta, the mothers of the disappeared formed an activist group, the Mothers of the Plaza de Mayo, that demanded accountability. Not only was their courage and persistence a factor in the regime’s downfall, but they and other Argentines helped lead the global movement against forced disappearances, including the development of the legal principles and international criminal statutes that now prohibit the practice.
The name comes from a medieval prison where political prisoners were held by the royal government for arbitrary reasons, without a chance to appeal. For over a thousand years, France had maintained one of the world’s most authoritarian and hierarchical regimes, and it was among the last places one would have expected ideals such as liberty and civic rights to emerge. (The U.S. had the advantage of being a much younger polity, and of inheriting the fairly liberal traditions of the U.K., whose monarchy was already weak by the 18th century.)
The Bastille became a symbol of this oppressive tradition, which is why it was targeted by the people of Paris on July 14, 1789, after they grew fed up with famine and high taxes that fell hardest on the poor. While only seven prisoners were held in the Bastille that day, the revolt was hugely symbolic of the liberation of the French public, which remains central to France’s core ideals—represented by its tricolor flag and official motto—of Liberty, Equality, and Fraternity for all.
As French officials in the capital cowered before the newly empowered masses, popular momentum built into the French Revolution, which took on both the powerful monarchy and virtually all of Europe (whose monarchies felt threatened by the fall of their preeminent kingdom). While the revolution descended into barbarism and bloodshed, and ultimately gave way to empire and restoration, its ideals remained in French hearts and minds, precipitating the reemergence of the republic in the late 19th century.
In fact, France’s well known tradition of protests and civil disobedience, which was on full display just a few months ago, can be traced back to this action.
Heck, this year’s Bastille Day was commemorated with official honors and higher wages for essential workers—following months of agitation and negotiation with unions and workers.
This is par for the course in France, which has about 10 political marches every day.
There is even an “unofficial working manual” for French demonstrations, which is observed by all sides, including the government. (Those who fail to observe these rules are ostracized as casseurs or “smashers”.)
The protesters—generally made up of a wide variety of folks, including steelworkers, winegrowers, students, lawyers, and chefs—agree to an itinerary, provide their own security staff, and march on the agreed route. They may throw a few harmless objects at police, usually for symbolic purposes; the police respond, usually halfheartedly, with tear gas or baton charges.
The usual doctrine of French riot police is to stand back and protect the biggest public buildings. Tear gas, rubber bullets, and stun grenades are used to keep the crowds at bay—and on their declared routes. Riot police are trained to act only in groups and only on direct orders; in theory, they have no right of individual action or initiative. They are supposed to aim their nonlethal weapons below the waist and not use stun grenades in densely packed crowds.
Occasionally, more radical protests do emerge, resulting in serious scuffles or brawls; for their part, French riot police are known for lacking deescalation techniques. But the overall sentiment underpinning these practices—that demonstrations and popular assembly are core to both political and social culture—remains robust and admirable.
The national discussion on U.S. policing has me thinking about my semester seminar with Leipzig University, where we worked with German law students to do a comparative analysis on each country’s approach to certain policies and legal issues. I’ve also been to Germany a few times and seen firsthand how police operate and are regarded.
Like so much else in Germany, law enforcement is heavily shaped by the past. As in every authoritarian state, the police were a key instrument of Nazi oppression. Cops spied on and arrested political enemies, deported Jews, guarded ghettos, and helped kill more than a million people on the eastern front.
Ironically, some of the postwar reforms of German policing were also influenced by the Allies, including the United States. Since Germany is a federal constitutional republic much like our own, and relatively large and diverse, it offers a fairly good point of comparison. Here are some key points:
➡️ There is no German FBI. Law enforcement is handled at the state level, albeit with similar national standards. The closest equivalent—the delightfully named Office for the Protection of the Constitution—cannot make arrests, has limited surveillance powers, and all its actions can be challenged in court by any German citizen. It is also barred from exchanging information with police except through a dedicated counterterrorism forum.
➡️ Before they even start, police applicants must pass personality and intelligence tests. German cops typically undergo up to two and a half years of training, whereas U.S. training varies wildly, from as little as 11 weeks to around eight months. In addition to weapons training, German police are required to visit a concentration camp; take classes in law, ethics, and police history; and learn techniques in deescalation and nonlethal force.
➡️ German police officers do not handle minor infractions like parking tickets nor respond to calls about noise and the like. Non-emergencies are handled by unarmed but uniformed city employees. (This was an idea of the Allies, who wanted to “demilitarize and civilize police matters”.)
➡️ Controversially, German police have what is known as a “monopoly of force”. Gun ownership in Germany is low—with about 5.5 million private firearms, mostly for hunting and sport—and shootings are thus rare. Fewer guns on the streets means officers feel less threatened and are less likely to pull out their weapons or respond with force. Moreover, violence is generally frowned upon in German society; the head of Berlin’s police force noted that “even drawing a gun can lead to a police officer requesting psychological support.”
➡️ Regardless of the reasons, the use of weapons, let alone fatal police shootings, is rare in Germany. In 2011, German police fired only 85 bullets in total; in the U.S., 84 shots were fired at just one murder suspect in NYC. In 2018, German police fatally shot 11 people and injured 34; in the U.S., with a population roughly four times Germany’s, over 100 times as many people (1,098) were killed by police. One state alone, Minnesota, saw 13 fatal shootings—two more than all of Germany (which has about 83 million people to Minnesota’s 5.6 million).
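Since raw counts can mislead when populations differ so much, here is a quick back-of-the-envelope normalization of the 2018 figures cited above, assuming approximate populations of 83 million for Germany, 327 million for the U.S., and 5.6 million for Minnesota:

```python
# Normalize fatal police shootings to a per-million-residents rate so the
# comparison is apples-to-apples. Populations are rough 2018 estimates.

def per_million(shootings, population):
    """Fatal police shootings per one million residents."""
    return shootings / (population / 1_000_000)

germany = per_million(11, 83_000_000)        # ~0.13 per million
usa = per_million(1_098, 327_000_000)        # ~3.4 per million
minnesota = per_million(13, 5_600_000)       # ~2.3 per million

print(round(usa / germany))  # the U.S. rate is roughly 25x Germany's
```

Even after adjusting for population, the U.S. rate comes out more than an order of magnitude higher, so the disparity is not merely an artifact of country size.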
➡️ Of course, German law enforcement, like any human institution, is not perfect. Some have questioned whether the country’s approach is too passive, especially in the face of terrorism and political violence. There have been plenty of scandals concerning excessive violence, particularly towards immigrants; hence the country recently saw some of the biggest protests against racism outside the U.S.
As one German police academy instructor put it, the most important lesson is that institutions like the police cannot change unless society’s values change with them: “The police are a mirror of society. You cannot turn the police upside down and leave society as it is.”
Happy 75th birthday to the United Nations, a deeply flawed and troubled organization that is nonetheless more indispensable than ever—and has accomplished a lot more than most people think.
It was on this day in 1945 that fifty countries ratified the UN Charter, which established the organization along with the framework of the international system. An audacious and idealistic document, it articulated a commitment to uphold the human rights and wellbeing of all people, to address “economic, social, health, and related problems”, and to promote “universal respect for, and observance of, human rights and fundamental freedoms for all without distinction as to race, sex, language, or religion”. The organization now counts nearly four times as many members, at 193.
Of course, we Americans know full well how hard it is to get even this one country to work together—imagine close to 200 countries spanning eight billion people and a multitude of languages, religions, cultures, types of governments, and levels of development. The UN is only as effective as its members allow it to be, and its failures and limitations are a reflection of our own as a species.
Moreover, it is worth considering the context of its emergence: A war that had killed over 60 million people (three percent of all humans at the time), after millennia of endless conflict in which violence was the norm and enslavement, rape, looting, and other things we now call war crimes (courtesy of the UN) were simply the way of things. For most of our quarter-million years of existence, we rarely knew about, much less cared for, anyone outside our immediate tribe or band. Human rights and civil liberties were alien concepts that would not have made sense to anyone. The vast majority of people lived in grinding poverty, oppression, fear, and ignorance.
From the ashes of the worst conflict in history emerged an organization trying to cultivate peace, progress, and unity among our species—not just out of idealism, but also based on the sober realism that some problems are too big for any one nation to handle. Needless to say, it has failed in its lofty aspirations time and again, as most of us know all too well—but that’s to be expected given just how bold an undertaking it is. And for all the failures, there are plenty of successes we take for granted.
Eisenhower, far from a bleeding-heart globalist, once said that the UN “represents man’s best organized hope to substitute the conference table for the battlefield”. If nothing else, the organization has served as an outlet for frustrations and rivalries that would otherwise manifest on the battlefield. The constant grandstanding between the U.S. and Russia may be frustrating—and has often led to devastating deadlock during crises—but imagine the alternative without such an international platform. Iran and Saudi Arabia were on the verge of war some months ago, but turned to the UN to make their cases and take diplomatic shots at each other instead. It is likely no coincidence that, despite so many close calls, the UN-centered world has seen an unprecedented decline in the large-scale interstate wars that were once so common (though this is not to make light of the numerous proxy and civil wars that continue to exact a heavy toll).
Given that most Americans do not even know how their own government works, it stands to reason that few know the workings and complexities of the international system, either.
Few people know that it was the UN Secretary-General, U Thant of Burma, who played a key role in defusing the Cuban Missile Crisis; JFK admitted in private that “U Thant has put the world deeply in his debt” — though Thant is scarcely known today.
Many of us take for granted the modern amenities and benefits, let alone realize their origin in the UN. The ability to mail and ship things globally; to access goods and products from around the world; and to travel anywhere with relative ease are all due to UN organizations, treaties, or conferences that established uniform standards and rules for airlines, companies, and governments. Heck, even seat belts became widespread through deliberate UN policy.
Few know the work of UNICEF, one of the oldest UN organizations, which in 2018 alone helped care for 27 million babies born in places with high infant and maternal mortality; treated four million children in 73 countries for severe acute malnutrition; and provided over 65 million children with vaccines against common killers like diphtheria, tetanus, and pertussis (half the world’s children get their vaccines through UNICEF). Over the last thirty years, it has saved over 90 million children.
The UN Population Fund helps an average of two million women a month through pregnancies that remain dangerous, even deadly, in much of the world.
The UN regularly monitors elections in about fifty countries, which not only helps ensure a free and fair political process but has prevented numerous civil wars and conflicts.
All these achievements no doubt come with caveats, and do not undo the very real and tragic failings, from Rwanda to the Syrian and Yemeni civil wars. But 75 years is not a long time to undo 250,000 years of tribalism and disunity. As one UN chief put it, “the United Nations was not created to bring us to heaven, but in order to save us from hell”. And considering that the average American pays less than two dollars a year to cover the U.S.’ regular budget dues to the UN, I think it is a work in progress worth supporting and improving upon.
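For what it’s worth, that last figure checks out as a back-of-the-envelope estimate. The inputs below are assumed round numbers, not official figures: a UN regular budget of roughly $3 billion, a U.S. assessed share of roughly 22%, and about 330 million Americans.

```python
# Rough sanity check of the per-person cost of U.S. regular-budget dues.
# All three inputs are assumed round figures, not official numbers.
un_regular_budget = 3_000_000_000   # USD per year, approximate
us_share = 0.22                     # approximate U.S. assessed share
us_population = 330_000_000         # approximate

per_person = un_regular_budget * us_share / us_population
print(f"${per_person:.2f} per American per year")
```

On these rough inputs it comes to about two dollars a year, the same order of magnitude as the claim above.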