How America treats foreigners, regardless of their legal status, is of supreme importance morally, politically, and even diplomatically. It speaks to our values, impacts our standing in the world, and may even influence the way our own citizens are treated abroad. This is not a bleeding-heart talking point, but the sober and matter-of-fact conclusion of the U.S. Supreme Court in Arizona v. U.S. (2012), as cited and recounted by the Fifth Circuit Court of Appeals in Hernandez v. U.S. (2014): Continue reading
As reported by The Guardian, an international multidisciplinary team led by Oxford archaeologist Dr. Eleanor Scerri has claimed that a comprehensive survey of fossil, archaeological and genetic evidence shows humans evolved “mosaic-like across different populations spanning the entire African continent”. Thus, modern humans did not come from a specific area — namely East Africa, where the oldest confirmed Homo sapiens fossils have been found — but are the end result of millennia of interbreeding and cultural exchange between semi-isolated groups.
The telltale characteristics of a modern human – globular brain case, a chin, a more delicate brow and a small face – seem to first appear in different places at different times. Previously, this has either been explained as evidence of a single, large population trekking around the continent en masse or by dismissing certain fossils as side-branches of the modern human lineage that just happened to have developed certain anatomical similarities.
The latest analysis suggests that this patchwork emergence of human traits can be explained by the existence of multiple populations that were periodically separated for millennia by rivers, deserts, forests and mountains before coming into contact again due to shifts in the climate. “These barriers created migration and contact opportunities for groups that may previously have been separated, and later fluctuation might have meant populations that mixed for a short while became isolated again,” said Scerri.
The trend towards more sophisticated stone tools, jewellery and cooking implements also supports the theory, according to the paper in the journal Trends in Ecology & Evolution.
Scerri assembled a multidisciplinary group to examine the archaeological, fossil, genetic and climate data together, with the aim of eliminating biases and assumptions. Previously, she said, scientific objectivity had been clouded by fierce competition between research groups, each wanting its own discoveries to be given a prominent place on a linear evolutionary ladder leading to the present day. Disputes between rival teams working in South Africa and East Africa had become entrenched, she said.
“Someone finds a skull somewhere and that’s the source of humanity. Someone finds some tools somewhere, that’s the source of humanity,” she said, describing the latest approach as: “Let’s be inclusive and construct a model based on all the data we have available.”
Like any study, the claims will need to be confirmed, but from my layman’s perspective, it makes sense. What are your thoughts? (Especially if you have a background in this area.)
It seems that any institution that is global or multilateral in nature or name elicits visceral opposition from huge swathes of the American public. While there has long been an undercurrent of insularity and outright hostility in America towards the rest of the world, it goes without saying that under the present administration — which came to power on a platform of nationalism, protectionism, and revanchism against foreigners — the sentiment has worsened to the point of absurdity.
The most salient recent example is our strange response to a sensible resolution at the World Health Organization (WHO) that no one would have imagined was controversial. Continue reading
Today the Fourteenth Amendment to the U.S. Constitution turns 150; as it happens, it is the same day that President Donald Trump will nominate a new Supreme Court justice to replace Anthony Kennedy, whose three-decade tenure on the court included many refinements and defenses of the often-beleaguered and contentious amendment.
More from The Atlantic:
Ratified in 1868, the Fourteenth Amendment was originally intended to allow Congress and the courts to protect three fundamental values: racial equality, individual rights, and economic liberty. But the amendment was quickly eviscerated by the Court, and for nearly a century it protected economic liberty alone. Justice Kennedy embraced all three values of the Fourteenth Amendment, invoking it to protect reproductive autonomy and some forms of affirmative action, as well as to establish marriage equality, but also to limit federal economic regulations, such as the Affordable Care Act. His replacement will determine which vision of the amendment prevails for decades to come.
Of the five sections that make up the amendment, the one most often in contention is the first, which reads:
All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside. No State shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States; nor shall any State deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.
Given the context of its passage, this language is very significant, as The Atlantic again explains:
After the Civil War, many of the former Confederate states passed laws known as the “Black Codes,” which sharply limited the rights of formerly enslaved people. In response, on July 9, 1868, Congress ratified the Fourteenth Amendment, which guarantees equal protection under the law and also denies any state the right to deprive people of liberty without due process.
Only five years later, the Supreme Court eviscerated the amendment in the 5–4 Slaughterhouse Cases decision. As drafted by the Ohio congressman John Bingham, the amendment was intended to require states as well as the federal government to respect the fundamental liberties guaranteed by the Bill of Rights.
A decade later, in a lopsided 8–1 decision, the Court struck down the Civil Rights Act of 1875, which banned discrimination in public accommodations and transportation. Finally, in 1896, the Court upheld the doctrine of “separate but equal” in Plessy v. Ferguson, standing aside as the South constructed the Jim Crow regime. Justice John Marshall Harlan provided the only dissent. In one of the most famous passages in the history of Supreme Court opinions, he wrote: “There is no caste here. Our Constitution is color-blind, and neither knows nor tolerates classes among citizens. In respect of civil rights, all citizens are equal before the law. The humblest is the peer of the most powerful.”
At the same time that the Court turned away from the Framers’ vision of equal civil rights, it invoked the Fourteenth Amendment to protect economic liberties, such as freedom of contract. This period is remembered as the Lochner era, named after a 1905 decision striking down a maximum-hour law for bakers in New York. It culminated in decisions in the early 1930s that struck down the core of Franklin D. Roosevelt’s New Deal.
I remember learning a lot of this in my constitutional law class, and being quite surprised at how immediately resisted and controversial the amendment was, even in the courts. It is even more disconcerting to learn that it would not be until fairly recently in American history that the Fourteenth Amendment was applied as its framers ostensibly intended:
It wasn’t until Brown v. Board of Education in 1954 that the Court resurrected the Fourteenth Amendment’s promise of racial equality, overturning Plessy and attacking school segregation. It struck down state laws banning interracial marriage in Loving v. Virginia. And it upheld landmark civil-rights laws like the Civil Rights Act of 1964 and the Voting Rights Act of 1965. While the Court stopped short of guaranteeing equal funding for education, it did much to attack the jurisprudential foundation of Jim Crow.
At the same time, Chief Justice Earl Warren’s Court resurrected John Bingham’s vision of national enforcement of fundamental rights—most notably, by extending the protections of the Bill of Rights to the states, thereby safeguarding free speech, religious liberty, the right to counsel, and the right to be free of unreasonable searches and seizures.
More controversially, the Warren Court laid the foundation for rights not explicitly mentioned in the text of the Constitution, such as the right to privacy. In later years, the Supreme Court would build on these privacy decisions to issue decisions such as Roe v. Wade—which led to a conservative backlash against the Court.
These competing visions of economic liberty, racial equality, and personal autonomy came to a head in 1987. Justice Lewis Powell—the swing justice on Warren E. Burger’s Court—resigned. President Ronald Reagan, nearing the end of his second term, sought to place his enduring stamp on the Court by nominating the conservative legal intellectual Robert Bork. Following a bruising battle, the Senate rejected the Bork nomination, in part because he refused to recognize a constitutional right to privacy. When Anthony Kennedy embraced the right to privacy, the Senate unanimously confirmed him.
While perhaps not as well known as the first ten amendments enshrined as the Bill of Rights, the implications of the Fourteenth Amendment — and how it will be applied, broadened, or restricted — are vast, especially in light of the replacement of one of its greatest proponents.
With Kennedy leaving the Court, the future of this 150-year-old amendment is at stake. His successor will determine whether the Supreme Court interprets the amendment as allowing or prohibiting laws and policies regulating abortion, marriage, voting rights, and affirmative action. Also at stake are the scope of the Bill of Rights’ protections for free speech, gun rights, religious liberty, freedom from unreasonable government searches and seizures, and economic liberty. Strong constitutional arguments can be made for both sides of all these issues, and Justice Kennedy often held the decisive vote. His successor could determine the shape of the Fourteenth Amendment until its 200th anniversary in 2038.
What are your thoughts?
Who knew that the workings of a tapeworm could offer some very relevant insights into human nature and social control? Like many parasites, Schistocephalus solidus has a complex life cycle: it reproduces in the guts of waterbirds, from whose droppings its eggs are deposited; they hatch and the larvae infect small crustaceans, which are eaten by stickleback fish, which are then eaten by the waterbirds, and…you get the picture.
So far, so typical of parasites. But as The Atlantic reports, the transition from one lifeform to another is facilitated by a pretty insidious form of mind control, which works far beyond the immediately infected animals. Continue reading
According to a Gallup poll published in January, 65 out of 134 countries surveyed saw a decline in U.S. leadership approval ratings by 10 points or more between 2016 and 2017. This includes many longtime allies and partners.
Portugal, Belgium, Norway and Canada led the declines worldwide, with approval ratings of U.S. leadership dropping 40 points or more in each country. In contrast, U.S. approval rating increased 10 points or more in just four countries: Liberia (+17), Macedonia (+15), Israel (+14), and Belarus (+11).
Americans typically brush off, if not disparage, what the rest of the world thinks of us. But in a rapidly globalizing and multipolar world, with many rising rivals, global public opinion matters. It also matters that the majority of fellow democracies — as well as countries with longstanding political, economic, historical, and cultural ties to us — hold a lower opinion of us than do dysfunctional and/or authoritarian regimes. We should want approval from fellow democratic-minded societies, not authoritarian ones.
What are your thoughts?
Since I’m pressed for time today, I figured I would stick to something light and cheeky: while most people know that the Chinese invented fireworks over a millennium ago, they may not realize that China (perhaps ironically) remains the main source of the fireworks most Americans will be using to celebrate their nation’s independence.
Senegal and Japan would seem as far apart culturally as they are geographically: the West African nation of 15 million is poor, highly diverse ethnically and linguistically, and predominantly Muslim; the East Asian island nation of 125 million is among the wealthiest and most homogeneous societies in the world, and is heavily influenced by Buddhist and Confucian thought.
Yet this year’s World Cup brought to light one unlikely but endearing similarity: both cultures share an appreciation for cleanliness and etiquette, even amid the highly competitive (and often very messy) environment of federation football. Continue reading
In the spring of 1947, on the eve of India’s independence from the U.K., the leaders of its independence movement made the fateful decision for their new country to be a secular, constitutional republic with suffrage for every adult citizen: more than 170 million in total. Overnight, India became the world’s largest democracy, a distinction it retains to this day, with an incredible 900 million eligible voters (nearly three times the total U.S. population).
The logistics of Indian democracy were daunting: at the time, some 85% of its electorate was illiterate, requiring political parties to get clever with the use of pictographs and symbols to communicate their platforms. Tens of thousands of civil servants worked for two full years just to compile the rolls for India’s first general election, conducted in 1951; the voter lists would be 200 meters (656 feet) thick. Today the list is five times that length. Continue reading
On this day in 1954, the CIA executed Operation PBSUCCESS, which overthrew the democratically elected Guatemalan President Jacobo Arbenz and installed military officer Carlos Castillo Armas, the first in a series of brutal U.S.-backed despots whose rule lasted until the 1990s.
Arbenz was only the second Guatemalan leader to be elected democratically; in 1944, a popular uprising toppled the previous U.S.-backed dictator, Jorge Ubico, paving the way for the nation’s first democratic election, which placed Juan Jose Arevalo in power. He introduced a minimum wage, near-universal suffrage, literacy programs, and a new constitution that aimed to turn Guatemala into a liberal democracy. Arbenz succeeded Arevalo in 1951, continuing his social and political reforms, including popular land policies that granted property to landless peasants.
This “Guatemalan Revolution” was disliked by the U.S., which was predisposed by the Cold War to see it — like every leftist or socially oriented movement — as communist and Soviet-backed. It did not help that Arbenz, though himself not a communist, Continue reading