The Birth of Solidarity

On this day in 1980, Solidarity, a Polish trade union, was founded as the first independent labor union in a Soviet-bloc country. It gave rise to a larger nonviolent and anti-authoritarian social movement that claimed over nine million members and ultimately contributed to the fall of regimes across the Soviet bloc.

Though Poland’s government attempted to destroy Solidarity by instituting martial law in 1981, followed by several years of political repression, it was forced into negotiation by the sheer weight of the union’s influence and popularity. The subsequent talks resulted in semi-free elections in 1989—the closest Poland had come to democracy since the 1930s.

How Iranians Use New Media to Empower Civil Society

The tenacity and resourcefulness of the Iranian people—and indeed of oppressed people the world over—is incredible.

One of the latest apps is Hafez, which translates as “to protect”. Named after the famous Persian poet whose words frequently targeted religious hypocrisy, the app offers users a collection of human rights-related information.

First and foremost, it is a virtual Rolodex of human rights lawyers in Iran, which allows users to access legal information regarding human rights.

However, Hafez is more than just a list of telephone numbers, Keyvan Rafiee, an Iranian human rights activist, told Al Jazeera.

“Users receive daily human rights news; [it] allows them to send news of human rights violations securely; [it] disseminates important legal information to users if they are arrested, and provides the contact information for attorneys who can assist,” said Rafiee, the founder of Human Rights Activists Iran (HRAI).

Rafiee, who has been arrested for his activism six times, said having a record of human rights violations is instrumental for protesters in Iran.

“Monitoring violations that take place on a daily basis can improve human rights conditions since independent organisations are not permitted to work in Iran,” Rafiee said.

Source: Al Jazeera

Americans Won’t Take Back the Jobs that Immigrants “Stole”

For all the Americans complaining about foreigners taking their jobs, few are willing to fill those jobs themselves, according to Bloomberg:

American farmers have been complaining of labor shortages for several years now. Given a multi-year decline in illegal immigration, and a similarly sustained pickup in the U.S. job market, the complaints are unlikely to stop without an overhaul of immigration rules for farm workers.

Efforts to create a more straightforward agricultural-workers visa that would enable foreign workers to stay longer in the U.S. and change jobs within the industry have so far failed in Congress. If this doesn’t change, American businesses, communities and consumers will be the losers.

Perhaps half of U.S. farm laborers are undocumented immigrants. As fewer such workers enter the U.S., the characteristics of the agricultural workforce are changing. Today’s farm laborers, while still predominantly born in Mexico, are more likely to be settled, rather than migrating, and more likely to be married than single. They are also aging. At the start of this century, about one-third of crop workers were over the age of 35. Now, more than half are. And crop picking is hard on older bodies.

One oft-debated cure for this labor shortage remains as implausible as it has been all along: Native U.S. workers won’t be returning to the farm.

Fifty Cents to Avoid a Lifetime of Debilitation

Some weeks ago, I read a piece in The Economist that has stayed with me. It was about the efforts of Sierra Leone, among the world’s poorest countries, to combat “neglected tropical diseases” (NTDs), a family of 17 diverse communicable diseases that afflict over 1.5 billion people in tropical and subtropical areas worldwide.

It featured one victim named Hannah Taylor, who woke up one day with a fever, followed by her legs swelling up to four times their normal size. The physical damage was irreversible, and the subsequent appearance and putrid smell led to her being ostracized by her community. She was a victim of lymphatic filariasis (a.k.a. elephantiasis), a mosquito-borne infection that could have been treated safely with a pill costing no more than fifty cents before it progressed.

But instead, the microscopic worms infested her body, debilitating her. For years she thought she had been a victim of evil witchcraft and was deeply depressed.

Eventually, Taylor put on a brave face and campaigned to raise awareness about the disease, its causes, and why victims should not be stigmatized. She passed away some weeks prior to the article’s publication; she was quoted as expressing happiness that her children would not suffer the way she had, thanks to Sierra Leone’s remarkable progress in fighting the disease.

Progress or not, it is incredible to think that billions of lives are negatively impacted by something as mundane to most of us as a mosquito bite. It is even more incredible that a mere fifty cents – spare change we’d throw in a tip jar without a thought – is all that stands between someone and a debilitating disease. It is utterly senseless that in a world with so much wealth and so many resources sloshing around, we have not been able to address this vast disparity in health outcomes and quality of life.

Delegative Democracy and Presidentialization

There is no shortage of think pieces out there diagnosing the state of American politics and conjecturing as to where things are heading. But one that recently caught my attention presents a pretty interesting, if disconcerting, analysis of the changes unfolding in American democracy. It came to me via The Interpreter newsletter, named after the New York Times column by Max Fisher and Amanda Taub. (I highly recommend signing up if you want a regular dose of thought-provoking political analysis in your inbox.) My time is short, so I’ll just share the relevant excerpts:

What happens if that foundational democratic assumption in the separation of powers collapses? What would American democracy become?

There’s actually a term for it: delegative democracy.

“Delegative democracy is an old concept in political science,” Amy Erica Smith, an Iowa State political scientist, told us. It emerged after a series of Latin American dictatorships transitioned to democracies in the 1980s — but to a sort that seemed less than fully democratic.

“There was this collective head-scratching over what sort of democracy we have in Latin America,” said Dr. Smith, who studies the region. “We had free and fair elections that met the minimum criteria for democracy. But they didn’t look exactly like what we think democracies are supposed to be.”

The key difference, the experts decided, was separation of powers. It existed on paper in Latin American democracies, most of whose constitutions were modeled on that of the United States. But, in practice, the courts and the Congresses did what they were told. They delegated their power to the president — hence, delegative democracies.

That, Dr. Smith said, became an important lesson: “Norms matter more than formal institutions.”

These countries, for the most part, were still democracies. But they didn’t function all that well. They had what’s called “vertical accountability” — leaders had to answer to voters, who could kick them out of office — but not “horizontal accountability” from other branches of government.

That tends to degrade governance. There’s little to keep the president from putting her interests first. Corruption and abuses of power become more common. Apolitical agencies get politicized, hurting their ability to function. The president’s support base tends to get preferential treatment; those not in her support base can face discrimination or worse.

(This is a good reminder that, although Americans tend to think of democracy as a binary — you’re a democracy or you’re not — it’s better to think of it as existing on a spectrum. Delegative democracies tend to fall near the fuzzy middle.)

(This is also a good reminder that presidential systems, like those in Latin America or the United States, are unusually prone to backsliding. Delegative democracy is a risk more or less exclusive to presidential systems. Parliamentary systems have more formal, and historically more reliable, ways to check the executive’s power.)

To that last point, there has been interesting, if disconcerting, scholarship on the inherent weaknesses of our presidential system, which is limited mostly to the Americas. In most cases it leads to polarization and gridlock that culminate in coups, civil wars, and other forms of political violence. The only thing that has kept the system functioning in the United States, more or less uniquely, is the near-universal adherence to the democratic norms touched on by Dr. Smith.

Many presidential systems also endure a transformation called “presidentialization”, in which the checks and balances of the system melt away or become folded into the executive. Again from The Interpreter:

For much of American history, voters thought about their Congressional votes as a separate issue from their support for the president. That is no longer really the case. Research shows that Americans increasingly place nearly all votes — for any office — based on how they feel about the president. If they like the president, they vote for members of her party. If they don’t, they punish her by voting against her party.

As voters have come to treat Congressional Republicans like the president’s subordinates rather than as members of a distinct body, Congress has done the same. “When a party becomes presidentialized, the separation of powers ceases to apply, effectively,” Matthew Shugart, a political scientist at the University of California, Davis, wrote on Twitter. “Presidentialization has various manifestations, but fundamentally it’s about electoral incentives.”

Republican lawmakers know that their fate is tied to the president’s. If he succeeds, they succeed; if he fails, they fail. That makes Congress less a separate institution than one subservient to the president.

And no doubt the same calculation applies to any party, now that this has become standard practice among American voters. I am not quite sure where we go from here — what are your thoughts?

The Rights of Immigrants in the U.S.

How America treats foreigners, regardless of their legal status, is of supreme importance morally, politically, and even diplomatically. It speaks to our values, impacts our standing in the world, and may even influence the way our own citizens are treated abroad. This is not a bleeding heart talking point, but the sober and matter-of-fact conclusion of the U.S. Supreme Court in Arizona v. U.S. (2012), as cited and recounted by the Fifth Circuit Court in Hernandez v. U.S. (2014).

America’s Baffling Opposition to the WHO’s Breastfeeding Resolution

It seems that any institution that is global or multilateral in nature or name elicits visceral opposition from huge swathes of the American public. While there has long been an undercurrent of insularity and outright hostility in America towards the rest of the world, it goes without saying that under the present administration — which came to power on a platform of nationalism, protectionism, and revanchism against foreigners — this sentiment has worsened to the point of absurdity.

The most salient recent example is our strange response to a sensible resolution at the World Health Organization (WHO) that no one would have imagined was controversial.

Happy 150th Anniversary to the Fourteenth Amendment

Today the Fourteenth Amendment to the U.S. Constitution turns 150; as it happens, it is the same day that President Donald Trump will nominate a new Supreme Court justice to replace Anthony Kennedy, whose three-decade tenure on the Court included many refinements and defenses of the often-beleaguered and contentious amendment.

More from The Atlantic:

Ratified in 1868, the Fourteenth Amendment was originally intended to allow Congress and the courts to protect three fundamental values: racial equality, individual rights, and economic liberty. But the amendment was quickly eviscerated by the Court, and for nearly a century it protected economic liberty alone. Justice Kennedy embraced all three values of the Fourteenth Amendment, invoking it to protect reproductive autonomy and some forms of affirmative action, as well as to establish marriage equality, but also to limit federal economic regulations, such as the Affordable Care Act. His replacement will determine which vision of the amendment prevails for decades to come.

Of the five sections that make up the amendment, the one most often in contention is the first, which reads:

All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside. No State shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States; nor shall any State deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.

Given the context of its passage, this language is very significant, as The Atlantic again explains:

After the Civil War, many of the former Confederate states passed laws known as the “Black Codes,” which sharply limited the rights of formerly enslaved people. In response, on July 9, 1868, Congress ratified the Fourteenth Amendment, which guarantees equal protection under the law and also denies any state the right to deprive people of liberty without due process.

Only five years later, the Supreme Court eviscerated the amendment in the 5–4 Slaughterhouse Cases decision. As drafted by the Ohio congressman John Bingham, the amendment was intended to require states as well as the federal government to respect the fundamental liberties guaranteed by the Bill of Rights.

A decade later, in a lopsided 8–1 decision, the Court struck down the Civil Rights Act of 1875, which banned discrimination in public accommodations and transportation. Finally, in 1896, the Court upheld the doctrine of “separate but equal” in Plessy v. Ferguson, standing aside as the South constructed the Jim Crow regime. Justice John Marshall Harlan provided the only dissent. In one of the most famous passages in the history of Supreme Court opinions, he wrote: “There is no caste here. Our Constitution is color-blind, and neither knows nor tolerates classes among citizens. In respect of civil rights, all citizens are equal before the law. The humblest is the peer of the most powerful.”

At the same time that the Court turned away from the Framers’ vision of equal civil rights, it invoked the Fourteenth Amendment to protect economic liberties, such as freedom of contract. This period is remembered as the Lochner era, named after a 1905 decision striking down a maximum-hour law for bakers in New York. It culminated in decisions in the early 1930s that struck down the core of Franklin D. Roosevelt’s New Deal.

I remember learning a lot of this in my constitutional law class, and being quite surprised at how immediately resisted and controversial the amendment was, even in the courts. It is even more disconcerting to learn that it was not until fairly recently in American history that the Fourteenth Amendment was enforced as its framers ostensibly intended:

It wasn’t until Brown v. Board of Education in 1954 that the Court resurrected the Fourteenth Amendment’s promise of racial equality, overturning Plessy and attacking school segregation. It struck down state laws banning interracial marriage in Loving v. Virginia. And it upheld landmark civil-rights laws like the Civil Rights Act of 1964 and the Voting Rights Act of 1965. While the Court stopped short of guaranteeing equal funding for education, it did much to attack the jurisprudential foundation of Jim Crow.

At the same time, Chief Justice Earl Warren’s Court resurrected John Bingham’s vision of national enforcement of fundamental rights—most notably, by extending the protections of the Bill of Rights to the states, thereby safeguarding free speech, religious liberty, the right to counsel, and the right to be free of unreasonable searches and seizures.

More controversially, the Warren Court laid the foundation for rights not explicitly mentioned in the text of the Constitution, such as the right to privacy. In later years, the Supreme Court would build on these privacy decisions to issue decisions such as Roe v. Wade—which led to a conservative backlash against the Court.

These competing visions of economic liberty, racial equality, and personal autonomy came to a head in 1987. Justice Lewis Powell—the swing justice on Warren E. Burger’s Court—resigned. President Ronald Reagan, nearing the end of his second term, sought to place his enduring stamp on the Court by nominating the conservative legal intellectual Robert Bork. Following a bruising battle, the Senate rejected the Bork nomination, in part because he refused to recognize a constitutional right to privacy. When Anthony Kennedy embraced the right to privacy, the Senate unanimously confirmed him.

While perhaps not as well known as the first ten amendments enshrined as the Bill of Rights, the implications of the Fourteenth Amendment — and how it will be applied, broadened, or restricted — are vast, especially in light of the replacement of one of its greatest proponents.

With Kennedy leaving the Court, the future of this 150-year-old amendment is at stake. His successor will determine whether the Supreme Court interprets the amendment as allowing or prohibiting laws and policies regulating abortion, marriage, voting rights, and affirmative action. Also at stake are the scope of the Bill of Rights’ protections for free speech, gun rights, religious liberty, freedom from unreasonable government searches and seizures, and economic liberty. Strong constitutional arguments can be made for both sides of all these issues, and Justice Kennedy often held the decisive vote. His successor could determine the shape of the Fourteenth Amendment until its 200th anniversary in 2038.

What are your thoughts?

What Parasites Can Teach Us About Society

Who knew that the workings of a tapeworm could offer such relevant insights into human nature and social control? Like many parasites, Schistocephalus solidus has a complex life cycle: it reproduces in the guts of waterbirds, from whose droppings its eggs are deposited; they hatch, and the larvae infect small crustaceans, which are eaten by stickleback fish, which are then eaten by the waterbirds, and…you get the picture.

So far, so typical of parasites. But as The Atlantic reports, the transition from one lifeform to another is facilitated by a pretty insidious form of mind control, which works far beyond the immediately infected animals.

The Decline of U.S. Global Approval

According to a Gallup poll published in January, 65 out of 134 countries surveyed saw a decline in U.S. leadership approval ratings by 10 points or more between 2016 and 2017. This includes many longtime allies and partners.

Portugal, Belgium, Norway and Canada led the declines worldwide, with approval ratings of U.S. leadership dropping 40 points or more in each country. In contrast, U.S. approval rating increased 10 points or more in just four countries: Liberia (+17), Macedonia (+15), Israel (+14), and Belarus (+11).

Americans typically brush off, if not disparage, what the rest of the world thinks of us. But in a rapidly globalizing and multipolar world, with many rising rivals, global public opinion matters. It also matters that the majority of fellow democracies — as well as countries with longstanding political, economic, historical, and cultural ties to the U.S. — now hold a lower opinion of us than dysfunctional and/or authoritarian regimes do. We should care more about the approval of fellow democratic-minded societies than that of authoritarian ones.

What are your thoughts?