The Hero of Two Worlds

On this day in 1777, after French nobleman the Marquis de Lafayette offered to serve the United States without pay, the Second Continental Congress passed a resolution allowing him to join American revolutionary forces as a major general.

Barely two years before, when he was only 18, Lafayette professed that his “heart was dedicated” to the American cause of liberty—hence his willingness to fight for the Patriots for free, and to even purchase his own ship to cross the Atlantic.

While Congress was overwhelmed with French volunteers, Lafayette was among the most promising by far. He learned English within a year of his arrival, won over Benjamin Franklin, and bonded well with George Washington, to whom he became a close advisor.

During the Battle of Brandywine against a superior British force, he was wounded in action but still managed to organize an orderly retreat, for which Washington commended him and recommended he be given command of American troops. He served with distinction in several more battles in Pennsylvania, New Jersey, and Rhode Island (some of which bear his name) before sailing back home in 1779 to lobby for more French support.

Lafayette returned a year later to a hero’s welcome in Boston, having secured thousands of French troops as well as naval forces and supplies. He was given senior positions in the Continental Army, and was so popular among Americans that Washington and Hamilton had him write letters to state officials urging them to send troops.

In 1781, Lafayette played a pivotal role in the decisive Siege of Yorktown, where troops under his command in Virginia blocked forces led by Cornwallis until other American and French forces could position themselves to strike. This victory—which involved almost as many French as Americans—is credited with ending the war.

After the war, Lafayette remained committed to the cause of liberty for the rest of his life. He went on to play a central role in the French Revolution, contributing, with Jefferson’s help, to the drafting of the Declaration of the Rights of Man and of the Citizen, one of the earliest republican and civil rights documents in history. He opposed slavery, the murderous excesses of the revolution, and the subsequent autocracy of Napoleon. At the invitation of President James Monroe, he toured all 24 states of the Union, where he was still met with popular praise and affection, and he turned down calls to lead France.

Because he was foreign, did not live in the U.S., and fought across all regions out of ideology rather than for money, Lafayette was seen as a unifying figure and an American icon among the fragmented colonies. His legacy on both sides of the Atlantic earned him the moniker of “The Hero of Two Worlds.”

Swearing an Oath

Given that some incoming Muslim congresswomen may be sworn in on a Quran (by Mike Pence, no less), it is worth remembering that the U.S. Constitution does not require an oath of office to be sworn on a Bible, or on any religious text for that matter.

Article VI, Clause 3, which covers oaths of office, states that while elected officials in both state and federal governments, as well as executive and judicial officers throughout the country, are bound “by oath or affirmation” to support the Constitution, “no religious Test shall ever be required as a Qualification to any Office or public Trust under the United States.”

Of course, we have already had Jewish, Hindu, Buddhist, and Muslim representatives and officials, both in Congress and throughout various other state and federal offices, swear on their respective religious texts. (Fun fact: In 2007, Keith Ellison of Minnesota, the first Muslim congressperson, was sworn in on a Quran formerly owned and cherished by Thomas Jefferson.)

But many other officials, both religious and secular, have sworn on nonreligious texts, or on nothing at all. John Quincy Adams and Franklin Pierce swore on a book of law; Lyndon B. Johnson was sworn in on a Roman Catholic missal; and Teddy Roosevelt, who had to take the oath in a hurry after William McKinley’s assassination, did without anything, since there was no Bible on hand.

Moreover, many Christians are forbidden by their teachings from swearing oaths at all; Herbert Hoover and Richard Nixon, both Quakers, could have “affirmed” rather than “sworn” when taking office, though it appears neither did so.

The Offenses Clause and America’s Commitment to International Law

Article I, Section 8 of the U.S. Constitution contains the obscure but significant “Offenses Clause”, which empowers Congress to “define and punish … Offenses against the Law of Nations.” The law of nations was the 18th-century term for what we now call international law.

At the time, these “offenses” would have included “attacks on foreign nations, their citizens, or shipping;” failing to honor “the flag of truce, peace treaties, and boundary treaties” (including unauthorized entry across national borders); and mistreating prisoners of war. The law of nations also obliged states to prosecute pirates, protect wrecked ships and their crews (regardless of nationality), and protect foreign dignitaries and merchants in their territory.

Thus, the Framers clearly sought to convey to the world that the U.S. would be a responsible actor in the global community, enshrining in its highest legal instrument a commitment to safeguarding foreign nationals, property, and interests, even if that meant prosecuting American perpetrators.

Some jurists have argued that this provision, in theory, permits Congress to criminalize private conduct in the U.S. that violates international law.

U.S. Healthcare Stands Out

American exceptionalism certainly has its merits: when it comes to healthcare, the U.S. is most definitely exceptional, albeit not in a good way.

Virtually no country comes close to spending so much on healthcare with so little payoff: a little over twenty years ago, the U.S. spent about 13 percent of GDP on healthcare compared to a developed-world average of about 9.5 percent; by 2016, our spending hit 17.5 percent of GDP, or roughly $3 trillion.

As Foreign Policy explained:

As you can see, Americans are spending more money – but they are not seeing better results by the most basic metric of life expectancy. The divergence starts just before 1980, and it widens all the way to 2014.

It’s worth noting that the 2015 statistics are not plotted on this chart. However, given that healthcare spend was 17.5% of GDP in 2015, the divergence is likely to continue to widen. U.S. spending is now closing in on $10,000 per person.

Perhaps the most concerning revelation from this data?

Not only is U.S. healthcare spending wildly inefficient, but it’s also relatively ineffective. It would be one thing to spend more money and get the same results, but according to the above data that is not true. In fact, Americans on average will have shorter lives than people in other high-income countries.

Life expectancy in the U.S. has nearly flatlined, and it hasn’t yet crossed the 80-year threshold. Meanwhile, Chileans, Greeks, and Israelis are all outliving their American counterparts at a fraction of the cost.

I am not sure how much more data we need to prove that our healthcare system is broken. So many other countries with fewer resources have managed to extend average life expectancy without breaking the bank. Yet for all our innovation and wealth, we are breaking the bank by a wide margin and still have little to show for it.
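As a rough sanity check on the per-person figure cited above, here is a quick back-of-the-envelope calculation (a sketch that assumes a U.S. population of roughly 320 million, a figure not given in the excerpt):

```python
# Back-of-the-envelope check: is roughly $3 trillion in annual health spending
# consistent with per-person spending "closing in on $10,000"?
# Assumption (not from the excerpt): a U.S. population of roughly 320 million.
total_spending = 3.0e12       # ~$3 trillion per year
population = 320e6            # assumed U.S. population
per_person = total_spending / population
print(f"Spending per person: ${per_person:,.0f}")  # -> Spending per person: $9,375
```

At around $9,400 per person, the arithmetic squares with the figure cited in the excerpt.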

America and its International Commitments

One of the biggest objections to the Iran Nuclear Deal is that it violated U.S. law because it was never approved by two-thirds of the Senate, as required by the “Treaty Clause” (Art. II, Sec. 2) of the U.S. Constitution. (Contrary to the beliefs of many red-blooded Americans, the Constitution gives ratified treaties the same force as domestic law, per the Supremacy Clause.) However, this reflects a fundamental misunderstanding of the deal, the Constitution, and international law.

First, the deal was never a treaty: it is classified as a “nonbinding political commitment”, which, by definition and in contrast to a treaty, requires no congressional approval and is not legally binding. Throughout U.S. history, presidents of all parties have made international agreements without the approval of a supermajority of Senators, either through “congressional-executive agreements”—which are ratified by only a simple majority of Congress—or through “executive agreements”, which are made solely by the president without any congressional involvement.

Between 1946 and 1999 alone, the U.S. completed nearly 16,000 international agreements—of which only 912 (5.7 percent) were treaties ratified by the Senate. (Most were congressional-executive agreements.)

While the Constitution does not explicitly provide for these alternatives, they have long been considered legitimate. Thomas Jefferson, a globalist sellout if we ever saw one, argued that the Treaty Clause procedure is not always necessary; short-term agreements without Senate approval may be better since “when they become too inconvenient, [they] can be dropped at the will of either party”. Most of the Founders did not object to this, because they recognized pragmatic and expedient reasons to allow the president to make international agreements without going through the long and politicized channels of the legislature.

In fact, when Jefferson sought to purchase the massive Louisiana Territory from France, there was some debate as to whether expanding U.S. territory was legal, since the Constitution was silent on the matter. He ultimately prevailed on the argument—backed by the “Father of the Constitution” James Madison—that the executive’s broad foreign policy powers allowed him to acquire the territory through treaty; he subsequently signed an agreement with France in April, announced it publicly in July, and finally got it ratified by the Senate in October.

The Supreme Court has repeatedly affirmed these powers. In Missouri v. Holland, it held that the federal government can use treaties to legislate in areas that would otherwise fall within the exclusive authority of the states. That is because the Supremacy Clause of the Constitution gives treaties the same force as federal law, which is binding on the states. In American Insurance Association v. Garamendi, the Court reaffirmed that “the President has authority to make ‘executive agreements’ with other countries, requiring no ratification by the Senate or approval by Congress, this power having been exercised since the early years of the Republic.”

America Less Respected Worldwide

According to polls of international relations (IR) experts and the American public, the U.S. is believed to have far less global respect than in the past.

A breakdown of the results and methodology by Pew:

Among the foreign affairs experts, 93% say the U.S. is less respected by other countries today compared with the past, according to a survey of international relations (IR) scholars conducted in October 2018 by the Teaching, Research and International Policy (TRIP) Project at the College of William and Mary. The poll included 1,157 respondents who are employed at a U.S. college or university in a political science department or professional school and who teach or conduct research on international issues. Only 4% of these experts believe the U.S. is as respected as in the past, with a mere 2% saying the U.S. gets more respect from abroad than it has previously received.

The American public also has seen a decline in other countries’ respect for the U.S., though it is less unified than IR experts in its assessment, according to a separate survey of 1,504 adults conducted in October 2017 by Pew Research Center. Roughly seven-in-ten Americans (68%) said the U.S. is less respected by other countries today compared with the past. About two-in-ten (17%) thought America had maintained its global level of respect, while 13% said the U.S. is more respected. It should be noted, however, that a majority of the American public has expressed belief that the U.S. is less respected every time the question has been asked since 2004, ranging from a low of 56% to a high of 71% holding this opinion.


Moreover, the majority of those who said the U.S. is less respected (around three quarters) believe this is a major problem.

Among IR scholars, there are two prevailing schools of thought: realism, which emphasizes the constant competition between countries pursuing their own ends; and liberalism, which stresses shared ideas and mutual cooperation among states. A majority of both schools — around 82% of realists and 95% of liberals — thought respect for the U.S. was declining. (Most of those who do not adhere to either school also agreed that the U.S. is less respected.)

By contrast, the American public is divided far more sharply along party lines:

Among the public overall, there are sharp partisan differences over whether the U.S. is less respected today than in the past and whether it’s a major problem. Around four-in-ten Republicans and Republican-leaning independents (42%) asserted in the 2017 survey that the U.S. is less respected than in the past, and about a quarter (28%) deemed this a major problem. Yet more than twice as many Democrats and Democratic leaners (87%) thought global respect for the U.S. had diminished, and seven-in-ten said this is a major problem.

While majorities of Democrats viewed the U.S. as less respected internationally at various points during the Obama administration, there was a 29-percentage-point increase in the share saying this between 2016 and 2017 following Donald Trump’s election. Similarly, the share of Republicans saying that the U.S. is less respected abroad dropped by 28 percentage points from the end of the Obama administration to when Trump took office.

Unfortunately, other polling data from Pew back up this perception: as of 2017, America’s image had indeed declined among the more than two dozen countries polled across the world.

Many Americans may not think this matters, given that the U.S. remains a powerful country militarily, economically, and even culturally. But with the balance of power shifting across several different countries and regions, and global problems like terrorism and climate change warranting more international cooperation, having the rest of the world on your side is more crucial than ever. We need allies and partners, whether for trade, scientific research, economic development, or military defense. That will be a lot harder to achieve if we keep alienating ourselves from the rest of the world, while rivals and emerging powers fill in the gaps.

What are your thoughts?

Source: Pew Research Center

The Groundbreaking But Largely Forgotten Apollo 8 Mission

On this day in 1968, the U.S. launched Apollo 8, the second crewed mission of the Apollo space program and the first crewed launch of the Saturn V rocket. Astronauts Frank Borman, James Lovell, and William Anders became the first humans to travel beyond low Earth orbit, see the whole Earth at once, orbit another celestial body, see the far side of the Moon, witness and photograph an “Earthrise”, escape the gravity of another celestial body (the Moon), and return to Earth’s gravitational well. Apollo 8 was also the first human spaceflight from the Kennedy Space Center, located adjacent to Cape Canaveral Air Force Station in Florida.

Apollo 8 was originally planned as a test of the Apollo Lunar Module, but since the module was not yet ready for its first flight, the mission profile was abruptly changed in August 1968 to a more ambitious flight to be flown in December. Thus, the crew led by Jim McDivitt, who had been training on the Lunar Module, instead became the crew for Apollo 9, while Borman and his men were moved to Apollo 8. This meant the new Apollo 8 crew had two to three months less training and preparation than originally planned, not to mention having to take up translunar navigation training. The crew themselves believed there was only a 50% chance of the mission succeeding.

Fortunately, things went off without a hitch: after almost three days, Apollo 8 reached the Moon. The crew orbited the Moon ten times in 20 hours, during which they made a Christmas Eve television broadcast in which they read the first ten verses from the Book of Genesis—at the time the most watched TV program ever. (In fact, it is estimated that one out of four people alive at the time saw it either live or shortly after.) Even the Chairman of the Soviet Interkosmos program was quoted describing the flight as an “outstanding achievement of American space sciences and technology”.

Although largely forgotten today, Apollo 8 was seen as the joyful culmination of a tumultuous year, rife with political assassinations, instability, and other tragedies worldwide. For a moment, humanity received a much-needed morale boost. The success of the mission paved the way for Apollo 11 to fulfill America’s goal of landing a man on the Moon before the end of the 1960s. The Apollo 8 astronauts returned to Earth on December 27, 1968, when their spacecraft splashed down in the northern Pacific Ocean. They were later named TIME’s “Men of the Year” for 1968.

The iconic Earthrise photo has been credited as one of the inspirations of the first Earth Day in 1970; it was selected as the first of Life magazine’s 100 Photographs That Changed the World.


Most Americans Know Next to Nothing About the Constitution

For a document that is practically deified as the greatest legal instrument in the world, the U.S. Constitution is woefully misunderstood, or not understood at all. Those are the depressing results of a 2017 poll from the University of Pennsylvania’s Annenberg Public Policy Center. (Though the data are one year old, I doubt the results have changed, except maybe for the worse.)

More than one in three people (37%) could not name a single right protected by the First Amendment. Only one in four (26%) could name all three branches of government (down from 38% in 2011), while one in three (33%) could not name any branch at all. A majority (53%) believed that undocumented immigrants have no rights under the Constitution, despite the Supreme Court repeatedly ruling that everyone in the U.S. is entitled to due process and the right to make their case before the courts.

As Chris Cillizza over at CNN points out, these dismal results aren’t limited to just one poll:

Take this Pew Research Center poll from 2010. When asked to name the chief justice of the Supreme Court, less than three in 10 (28%) correctly answered John Roberts. That compares unfavorably to the 43% who rightly named William Rehnquist as the chief justice in a Pew poll back in 1986.

What did the 72% of people who didn’t name Roberts as the chief justice in 2010 say instead, you ask? A majority (53%) said they didn’t know. Eight percent guessed Thurgood Marshall, who was never a chief justice of the Court and, perhaps more importantly, had been dead for 17 years when the poll was taken. Another 4% named Harry Reid, who is not now nor ever was a Supreme Court Justice.

What we don’t know about the government — executive, legislative and judicial branches — is appalling.

It’s funny — until you realize that lots and lots of people whose lives are directly affected by what the federal government does and doesn’t do have absolutely no idea about even the most basic principles of how this all works. The level of civic ignorance in the country allows our politicians — and Donald Trump is the shining example of this — to make lowest-common-denominator appeals about what they will do (or won’t do) in office. It also leads to huge amounts of discontent from the public when they realize that no politician can make good on the various and sundry promises they make on the campaign trail.

While I think more and better civics curricula are part of the solution, I also suspect that the visceral hatred of all things government dissuades people from caring about these things in the first place. At the same time, I can see how (often understandable) cynicism toward our political system might breed apathy, too. Why bother to know about a system that you are convinced does nothing good for you or society?

Lessons from Estonia on E-Voting

Count on the country that helped invent Skype to lead the way when it comes to digital government. The nation of just 1.3 million, also known for having been among the first republics to break away from the Soviet Union, has made a name for itself as a pioneer in using technology to improve civic engagement. As Forbes reports:

Modern-day Estonia has become synonymous with the notion of reimagining how citizens interact with their government, making nearly every governmental service available from home or on the go via a mouse click. Since 2005 the country has allowed its citizens to cast their votes in pan-national elections via a secure online portal system, growing to over 30% of votes cast in the last several elections, according to Tarvi Martens, Chairman of the Estonian Electronic Voting Committee.

Citizens can vote as many times as they like up to election day, with only the final vote counting. Those who do not have access to a computer or who prefer old fashioned paper ballots can still vote by paper – evoting is an option rather than a mandate.

Interestingly, nearly a quarter of evotes in recent elections have been cast by people over the age of 55, with another 20% of evotes from the 45-54 age range. This suggests evoting enjoys broad support not just among young digital native millennials, but across the societal spectrum, especially among those who, at least in the US, are not typically viewed as early adopters of digital services.

To vote in Estonia, one simply visits the national election website and downloads and installs the voting application. Then you insert your national identity card into your computer’s card reader, fill out your digital ballot, confirm your choices and digitally sign and submit your eballot. You can do all of this from the comfort of your own home in the seven days leading up to election day.

Pretty amazing stuff, especially from the vantage point of a country like the U.S. that still largely relies on woefully outdated tech to cast its votes.

Granted, with cybersecurity being one of the foremost concerns of 21st-century technology, plenty of skeptics would be right to question whether something like voting should be yet another activity exposed to potential hackers. But Estonia seems to have addressed this issue, too.

Of course, one of the most common concerns regarding internet voting is the potential that one’s vote could be changed either by a virus on your computer or as your ballot transits the internet on its way to the central government servers. To address this, Estonia’s evoting system adds a novel twist: the ability to use your mobile phone to separately connect to the electoral servers via a different set of tools and services to see how your vote was recorded and verify that it is correct.

After casting your vote using your desktop computer you can thus pull out your smartphone and verify the results that were actually received by the central electoral servers. The results are encrypted so that no government official can see how you individually voted, only you can see your individual voting choices, even as they are aggregated into the national totals.

By physically separating vote casting and vote checking to two different devices (votes are cast via a desktop computer, while checking your vote must be performed on your phone), it makes it highly unlikely that even the most motivated attacker could compromise both devices in such a way that your vote could be changed without your knowledge. And of course, even after voting online, you can always show up at a polling station on election day and vote via paper ballot if you want.

The ability to verify through a physically separate channel that the data received by the government is what you sent goes a long way towards addressing many of the most common concerns about electronic voting.
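To make the two-channel idea more concrete, here is a minimal toy sketch in Python. It is not Estonia’s actual protocol or software; the in-memory server store, the key handling, and all of the names are simplified assumptions purely for illustration. It only shows the basic pattern: cast an encrypted ballot from one device, then independently fetch and check the recorded ballot from another.

```python
# Toy sketch of two-channel vote verification (NOT Estonia's real system).
# The "desktop" casts an encrypted ballot; the "phone" independently fetches
# the stored ciphertext and confirms it decrypts to the intended choice.
import secrets
from cryptography.fernet import Fernet  # third-party: pip install cryptography

SERVER_STORE = {}  # stand-in for the central electoral server: ballot_id -> ciphertext


def cast_vote(choice: str, voter_key: bytes) -> str:
    """Desktop channel: encrypt the ballot, submit it, and return a receipt ID."""
    ballot_id = secrets.token_hex(8)  # receipt the voter carries to the second device
    SERVER_STORE[ballot_id] = Fernet(voter_key).encrypt(choice.encode())
    return ballot_id


def verify_vote(ballot_id: str, expected_choice: str, voter_key: bytes) -> bool:
    """Phone channel: fetch the recorded ciphertext and check it matches the intent."""
    ciphertext = SERVER_STORE[ballot_id]
    return Fernet(voter_key).decrypt(ciphertext).decode() == expected_choice


if __name__ == "__main__":
    key = Fernet.generate_key()              # held only by the voter
    receipt = cast_vote("Candidate A", key)
    print("vote verified:", verify_vote(receipt, "Candidate A", key))
```

The real system involves far more (ID-card signatures, vote mixing, auditing), but the key design choice survives even in this toy version: because casting and checking happen over separate channels and devices, an attacker would have to compromise both to alter a vote without the voter noticing.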

I think this is perfectly doable in the U.S., at least on a technical level; constitutionally, every state handles voting its own way, so whether we can implement a nationwide standard of e-voting remains to be seen. But if even a handful of states give it a try, it might set a good example and get the ball rolling.

What are your thoughts?

Food Stamps Are an Investment in the Future

That, in essence, is the finding of one of the largest studies of its kind on what is officially known as the Supplemental Nutrition Assistance Program (SNAP). Since being nationally mandated in 1975, SNAP has remained the largest national anti-hunger program: last year alone, more than 40 million poor working families, people with disabilities, and seniors received assistance averaging about $125 a month; 70 percent live in households with children.

Whatever the moral case for supporting SNAP, there is certainly an economic one, as the study makes clear.

From Bloomberg:

The economists focus on people born between 1956 and 1981, who were young children when the program was expanding, and who grew up in families with a parent with less than a high school education. They find that access to the program as a young child significantly improved economic outcomes and health status as an adult.

In particular, food stamp access as a child was associated with much lower risk of metabolic syndrome as an adult and, especially for women, higher levels of educational attainment and income, along with lower participation in means-tested benefit programs. For example, food stamp access during childhood is linked to a 5 percentage point reduction in heart disease and an 18 percentage point increase in high school completion rates, compared to those who lacked access.

This evidence contradicts some critiques of food stamps, which misleadingly argue that it’s an inefficient and ineffective program.

The authors also highlight that access seems to matter most in utero and up until age 5. Gaining access to food stamps after age 5, by contrast, didn’t improve health outcomes as an adult, perhaps because the person had already been put on a particular health trajectory by that age.

As is typical in such studies, there is a question of “correlation versus causation”, but the gradual rollout of SNAP allowed the researchers to account for this because “children living in otherwise similar families either did or didn’t receive benefits depending on whether their county voluntarily participated at the time. (The researchers show that county choice seems to be unrelated to other factors that may have substantially affected children living there.)”

The study also demonstrates the importance of taking a long-term view of these sorts of programs, especially when children are involved. Various other studies suggest that investing in the formative early years of one’s life pays huge dividends later; that is obviously lost on those who focus only on the year in which the benefit is received.