The "Madman" Who Advocated Hand-washing

Ignaz Semmelweis is not a household name. But the Hungarian doctor may be one of history’s greatest and most consequential medical pioneers. As the man who proposed the now universally accepted importance of handwashing for health, now is as good a time as any to commemorate him.


Especially since he died broken and ostracized in an insane asylum for daring to devise what we now take for granted. An unfitting fate for a man called the “savior of mothers”.

Semmelweis was a man of his time. The 19th century was the “golden age of the physician scientist”, when doctors were expected to have scientific training and perspective. Gone were the days when illness was an imbalance of “humors” or caused by “bad air” or divine will. Autopsies, once taboo, were more common. Anatomy was taking off, as we began to connect ailments with actual physical causes in the body. Doctors—like the young Dr. Semmelweis—were driven to collect data, crunch numbers, and find evidence to inform their practice.

When he began his new job in the maternity clinic at the General Hospital in Vienna (then the capital of the Austrian Empire), he immediately started gathering data on something that troubled him: Why so many women in maternity wards were dying from “puerperal fever”, commonly known as childbed fever, a horrible and painful illness.

Dr. Semmelweis wasted no time. He studied two maternity wards in the hospital—one staffed by all male doctors and medical students, the other by female midwives—and counted the number of deaths on each ward.

After crunching the numbers, he discovered that the clinic staffed by male doctors and medical students had a death rate nearly five times higher than the midwives’ clinic.

Semmelweis was appalled. It “made me so miserable that life seemed worthless”, he remarked. The reputation of the first clinic was so bad that women literally begged not to go there, with some reportedly preferring to give birth on the streets. He had to get to the bottom of it.

Semmelweis carefully assessed the data and tried to find empirical evidence. He ruled out various hypotheses—overcrowding, climate, etc.—and discovered one key difference: The male doctors and medical students did autopsies; the midwives did not. The germ theory of disease was not yet widely accepted, so the doctor proposed that it was “cadaverous materials” that were causing the infections.

The solution was simple: He decreed that doctors needed to wash their hands after autopsies, not just with water, but with a chlorine-based chemical solution he devised.

The result was dramatic: The mortality rate in the first clinic dropped an astonishing 90 percent. After hand washing was instituted in mid-May of 1847, death rates continued dropping precipitously: 2.2 percent in June, 1.2 percent in July, and—for the first time ever—zero deaths during two months in the year following the discovery.


Semmelweis wasted no time getting the word out to doctors and hospitals everywhere. Yet despite his evidence, the idea that all that mattered was cleanliness was considered extreme at the time. How could this one factor be the cause? The doctor was largely ignored, rejected, or even ridiculed.

In fact, he was ultimately dismissed from the hospital for political reasons and was so horribly harassed by the medical community in Vienna that he was forced to move to Budapest.

Semmelweis was outraged. He began writing open and increasingly angry letters to prominent European obstetricians, sometimes denouncing them as irresponsible murderers. His colleagues, including his own wife, believed he was losing his mind. In 1865, nearly twenty years after his breakthrough, he was committed to an insane asylum. Ironically, he died there of septic shock—similar to the infectious deaths he had worked to prevent—just two weeks later, possibly from being severely beaten by guards.

It was an ignoble and cruelly ironic end for a man whose findings are now the bedrock of public health and sanitation worldwide. Semmelweis was ridiculed, marginalized, and ultimately forgotten because his observations conflicted with the established scientific and medical opinions of the time; indeed, many doctors took offense at the very idea that they should wash their hands — at the cost of their patients’ lives.

It was only two decades after his sad death that Semmelweis’s recommendations gained widespread acceptance: Louis Pasteur’s confirmation of the germ theory of disease, followed by Joseph Lister’s use of hygienic methods during surgery, vindicated the Hungarian doctor, who had lacked the scientific means to explain his findings.

But given his selfless and righteous dedication to the well-being of patients, I like to imagine Semmelweis would be pleased to see his ideas become conventional wisdom. He might also be amused to be the namesake of the “Semmelweis reflex” or “Semmelweis effect”, which describes the tendency for new evidence or knowledge to be viscerally rejected because it contradicts established norms, beliefs, or paradigms.

COVID-19 and the Impartial Judgments of Nations

With the world responding to the pandemic in a variety of ways—and many countries learning from each other or from the U.N.’s World Health Organization (itself made up of experts from all over the world)—I am reminded of the largely forgotten words of James Madison, the architect of the U.S. Constitution.

This darling of patriots and conservatives—the Federalist Society uses his silhouette as its logo—once said that “no nation was so enlightened that it could ignore the impartial judgments of other nations and still expect to govern itself wisely and effectively.”

In the Federalist Papers, which were published to promote ratification of the Constitution, he emphasized the importance of respecting global public opinion:

An attention to the judgment of other nations is important to every government for two reasons: the one is, that, independently of the merits of any particular plan or measure, it is desirable, on various accounts, that it should appear to other nations as the offspring of a wise and honorable policy; the second is, that in doubtful cases, particularly where the national councils may be warped by some strong passion or momentary interest, the presumed or known opinion of the impartial world may be the best guide that can be followed. What has not America lost by her want of character with foreign nations? And how many errors and follies would she not have avoided, if the justice and propriety of her measures had in every instance been previously tried by the light in which they would probably appear to the unbiased part of mankind?

This was at a time when the U.S. was virtually the only republic in the world. Even the most patriotic and liberty-loving Founders recognized that whatever the political or cultural differences between the nations of the world, mere pragmatism should permit us to take whatever ideas or resources we can.

Consider that unlike other nations, we declined to use the W.H.O.’s test kits. Back in January, more than a month before the first COVID-19 death in the U.S., the Chinese published information on this new, mysterious virus. Within a week, German scientists had produced the first diagnostic test. By the end of February, the W.H.O. had shipped out tests to 60 countries.

As I’ve said ad nauseam, global cooperation is not merely idealistic or utopian: It’s the sober reality of living in a globalized society where we face problems that affect all humans, regardless of where they happen to be born. Even in the 18th century, our political founders and leaders understood this. We ignore it at our peril.

Remember Death

Since ancient times, all across the world, it’s been understood that we should always be aware of death. Socrates said that the proper practice of philosophy is “about nothing else but dying and being dead.”

Early Buddhist texts use the term maranasati, which translates roughly as ‘mindfulness of death.’

Some Muslim Sufis are known as the “people of the graves” for their practice of visiting graveyards to ponder death, as Mohammad once advised.

The ancient Egyptians, well known for their obsession with death, had a custom where, during festivities, they would bring out a skeleton and cheer to themselves, “Drink and be merry for when you’re dead you will look like this.”

Throughout the Middle Ages, Europe developed an entire genre of artwork dedicated to memento mori, literally “remember that you must die”.

To my mind, the most famous and articulate proponents of this idea were the Stoics, a Greco-Roman school of philosophy that emerged in the third century B.C.E. In his private journal, known as the Meditations, the Roman philosopher-king Marcus Aurelius reminded himself that “You could leave life right now. Let that determine what you do and say and think.” The famed Roman statesman and philosopher Seneca said that we should go to bed thinking “You may not wake up tomorrow” and start the day thinking “You may not sleep again”. He also recommended that we

“… prepare our minds as if we’d come to the very end of life. Let us postpone nothing. Let us balance life’s books each day. … The one who puts the finishing touches on their life each day is never short of time.”

All this probably sounds pretty morbid and depressing, not to mention counterintuitive: Thinking about death all the time is no way to live, and would probably paralyze us with fear. But as another famous Stoic, Epictetus, explained:

Keep death and exile before your eyes each day, along with everything that seems terrible—by doing so, you’ll never have a base thought nor will you have excessive desire.

Extrapolating from this, some modern Stoics advise that we remember that the people we fight with will die; that the strangers who tick us off on the road will die; and that every time we say goodbye to a loved one, we keep in mind they may die before we see or speak with them again.

Again, the point isn’t to be depressed, despairing, or even nihilistic, but to put things in perspective and value each finite second we have. The people we hate will end up just like us one day, which both humanizes them and reminds us not to waste our precious little time on them. The people we love will end up the same way, so better that we make the most of our time and fill it with happiness.

Of course, all this is easier said than done: It’s why we’re still trying to keep this advice thousands of years later.

Raising the Flag on Iwo Jima

On this day in 1945, Joe Rosenthal of the Associated Press took the iconic photograph “Raising the Flag on Iwo Jima”, which depicts six U.S. Marines raising the American flag atop Mount Suribachi during the Battle of Iwo Jima in the final stages of the Pacific Theater of the Second World War.

The U.S. had invaded Iwo Jima four days prior as part of its island-hopping strategy to defeat Japan. The island was located halfway between Japan and the Mariana Islands, where American long-range bombers were based, and was used by the Japanese as an early warning station. Capturing the island would weaken this warning system and also provide an emergency landing for damaged bombers.

As the highest point on the island, Mount Suribachi allowed the Japanese to spot and target American forces, and was thus the tactical priority. There was never any question the U.S. would win—the Americans had overwhelming numerical and logistical superiority, plus complete air supremacy—while the Japanese were low on food and supplies and could neither retreat nor be reinforced. Yet the battle was nonetheless brutal, grinding on for more than a month after the photograph was taken.

In fact, half the Marines later identified in the photo were killed shortly afterward: Sergeant Michael Strank, Corporal Harlon Block, and Private First Class Franklin Sousley.

Uniquely among Pacific War Marine battles, total American casualties (both dead and wounded) exceeded those of the Japanese, though Japanese combat deaths were three times higher than American fatalities. (Of the 21,000 Japanese on the island, only 216 were ultimately taken prisoner, with many fighting to the death, often through various cave systems.)

This was actually the second time the U.S. flag was raised on the mountain; the first instance had occurred earlier in the morning, but in the early afternoon, Sergeant Strank was ordered to take Marines from his rifle squad to bring supplies and raise a larger flag on the summit.

“Raising the Flag on Iwo Jima” was the only photograph to win the Pulitzer Prize for Photography in the same year as its publication. It is perhaps just as well known for inspiring the Marine Corps War Memorial, completed in 1954, which honors all Marines who have died in service since the founding of the Continental Marines in 1775, during the Revolutionary War.

For me, one of the more compelling stories from the episode was that of Ira Hayes, a Pima Native American from Arizona who, like so many indigenous Americans, volunteered readily to fight for the country. He disliked the fame he received, felt survivor’s guilt for the Marines who didn’t make it back, and descended into alcoholism, most likely due to what we now know as PTSD.

Johnny Cash, known for his advocacy for Native Americans, dedicated a song to him that remains one of my favorites.

Happy 23rd Birthday Dolly!

Well, sort of. Technically, she was born July 5, 1996, but it was on this day in 1997 that scientists at the Roslin Institute in Scotland announced the birth of Dolly, a female sheep who was the first mammal to have successfully been cloned from an adult cell. She was the only lamb that survived to adulthood from 277 attempts.

The funding for Dolly’s cloning was provided by PPL Therapeutics—a Scottish biotech firm near the University of Edinburgh, where the institute is based—and the British Ministry of Agriculture.

Dolly, born the previous summer, had three mothers: one provided the egg, another the DNA, and a third carried the cloned embryo to term. She was created using the technique of “somatic cell nuclear transfer”, in which the nucleus from an adult cell is transferred into a developing egg cell (an unfertilized oocyte) that has had its own nucleus removed. An electric shock stimulates the hybrid cell to divide, and when it develops into a blastocyst (which will eventually form the embryo), it is implanted in a surrogate mother.

Dolly lived only about half as long as is typical for her breed, leading some to speculate that her cloning had something to do with it. However, an analysis of her DNA found no anomalies, and the lung disease that killed her is particularly common among sheep kept indoors (Dolly had to sleep inside for security reasons). None of Dolly’s six children—the result of conventional mating with another sheep—bore any unusual defects or properties. As of 2016, scientists had reported no defects in thirteen cloned sheep, including four from the same cell line as Dolly.

Dolly’s legacy has far outlived her and will likely continue well into the 21st century. She quickly paved the way for the successful cloning of other large mammals, including pigs, deer, horses, and bulls. Cloning mammals was initially highly inefficient, but by 2014, Chinese scientists reported a 70–80% success rate cloning pigs, while in 2016, a Korean company, Sooam Biotech, was producing 500 cloned embryos a day.

As recently as 2018, a primate species was successfully cloned using the same method that produced Dolly: Two identical clones of a macaque monkey, Zhong Zhong and Hua Hua, were created by researchers in China. Just last year, Chinese scientists reported the creation of five identical cloned, gene-edited monkeys using the same technique.

There have also been attempts to clone extinct species back into existence. The most famous was in 2009, when Spanish scientists announced the cloning of the Pyrenean ibex, a form of wild mountain goat that was officially declared extinct in 2000. Although the newborn ibex died shortly after birth due to physical defects, it was the first time an extinct animal had been cloned, and the feat may open doors for saving endangered and newly extinct species by resurrecting them from frozen tissue.

International Mother Language Day

In honor of International Mother Language Day—created to promote linguistic diversity and preservation—check out this beautiful and very detailed chart of the world’s languages. A lot of the data might surprise you!

It’s too big to fit here, but below is a little snapshot to give you an idea.

Here are some fun and colorful language infographics that do fit here!

Infographic: The 100 Most Spoken Languages in the World.

As the name suggests, the massive Indo-European family includes nearly every language spoken from northern India through Iran and across almost all of Europe between Portugal and Russia (with Hungarian, Estonian, and Finnish being notable exceptions).

The language with the most speakers is, probably not surprisingly, English; about 15 percent of humanity can speak it!

However, the vast majority of people who speak English learn it as a second language (as you might have noticed in the top infographic). Here are the languages with the most native speakers compared to second-language (L2) speakers:

Here’s an interesting breakdown from the source:

Nearly 43% of the world’s population is bilingual, with the ability to switch between two languages with ease.

From the data, second language (L2) speakers can be calculated by looking at the difference between native and total speakers, as a proportion of the total. For example, 66% of English speakers learned it as a second language.

Swahili surprisingly has the highest ratio of L2 speakers to total speakers—although it only has 16 million native speakers, this shoots up to 98 million total speakers. Overall, 82% of Swahili speakers know it as a second language.

Swahili is listed as a national or official language in several African countries: Tanzania, Kenya, Uganda, and the Democratic Republic of the Congo. It’s likely that the movement of people from rural areas into big cities in search of better economic opportunities is what’s boosting the adoption of Swahili as a second language.

Indonesian is another similar example. With a 78% proportion of L2 speakers compared to total speakers, this variation on the Malay language has been used as the lingua franca across the islands for a long time. In contrast, only 17% of Mandarin speakers know it as a second language, perhaps because it is one of the most challenging languages to learn.
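To make that calculation concrete, here is a minimal sketch in Python of the L2-share arithmetic described above. The speaker counts are rough, illustrative figures rather than the source’s exact data, so the resulting percentages differ slightly from those quoted.

```python
# A minimal sketch (not the source's actual methodology) of the L2-share
# calculation: the share of second-language speakers is the difference between
# total and native speakers, taken as a proportion of the total.
# Speaker counts are approximate, illustrative figures in millions.

speakers = {
    # language: (native_speakers, total_speakers), both in millions
    "English":    (379, 1132),
    "Swahili":    (16, 98),
    "Indonesian": (43, 199),
    "Mandarin":   (918, 1117),
}

for language, (native, total) in speakers.items():
    l2_share = (total - native) / total  # difference as a proportion of the total
    print(f"{language}: ~{l2_share:.1%} of speakers learned it as a second language")
```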

Tragically, the U.N. has good reason to dedicate a day to the preservation of languages: The 100th most common language is “Sanaani Spoken Arabic”, spoken primarily in Yemen by around 11 million people. Yet there are a total of 7,111 languages still spoken today, meaning the vast majority of them—all but the top 100—have fewer than 11 million speakers.

In fact, approximately 3,000 languages (around 40 percent) are at risk of being lost, or are already in the process of dying out. (By one estimate, a language dies every two weeks.) Fortunately, growing awareness and advancing technology are helping to document and preserve these unique aspects of human existence, along with all the unique ideas, stories, and concepts each language contains.

Happy Birthday to Mir

On this day in 1986, the Soviet Union launched Mir, the first modular space station, the largest spacecraft by mass at the time, and the largest artificial satellite until the International Space Station (ISS), whose assembly began in 1998.


Assembled in orbit from 1986 to 1996, the station was the result of efforts to improve upon the Soviet Salyut program, which produced history’s first space station. It served as a microgravity research laboratory where crews conducted experiments in biology, human biology, physics, astronomy, meteorology, and spacecraft systems, all with the ultimate goal of preparing humanity for the permanent occupation of space.

Through the “Intercosmos” program, Mir also helped train and host cosmonauts from other countries, including Syria, Bulgaria, Afghanistan, France, Germany, and Canada.

Mir was the first continuously inhabited long-term research station in orbit and held the record for the longest continuous human presence in space at 3,644 days (roughly 10 years), until it was surpassed by the ISS in 2010. It also holds the record for the longest single human spaceflight, with Valeri Polyakov spending 437 days and 18 hours on the station between 1994 and 1995.


This is all the more remarkable considering that Mir lasted three times longer than planned and even outlived the Soviet Union itself, which collapsed only a few years after the station was launched. The fact that Russia managed to keep it running despite its tumultuous post-Soviet transition speaks to both Russian ingenuity and the goodwill of global partners like NASA.

In fact, the U.S. had planned to launch its own rival station, Freedom, while the Soviets were working on Mir-2 as a successor. But both countries faced budget constraints and a lack of political will that ultimately quashed these projects. Instead, the erstwhile rivals came together through Shuttle–Mir, an 11-mission program that involved American Space Shuttles visiting Mir, Russian cosmonauts flying on the Shuttle, and an American astronaut flying aboard a Russian Soyuz spacecraft for long-duration expeditions aboard Mir.


With various other nations, from Canada to Japan, also cancelling their own space station programs due to budget constraints, Russia and the U.S. soon brought them into the fold to create a new international space station—today the ISS we all know and love.

Thus, by the time the aging Mir was finally cut loose and allowed to deorbit in 2001, the ISS had already begun taking occupants, building upon the old station’s technical, scientific, and political legacy. (In fact, Russia has contributed more of the ISS than any nation besides the U.S., and both its spaceport and its spacecraft have served as the primary—and for many years, the only—source of crew and supplies.)

In its detailed tribute to Mir, NASA notes its importance to all of humanity as a milestone for human space exploration:

“The Russian Space Station Mir endured 15 years in orbit, three times its planned lifetime. It outlasted the Soviet Union, that launched it into space. It hosted scores of crewmembers and international visitors. It raised the first crop of wheat to be grown from seed to seed in outer space. It was the scene of joyous reunions, feats of courage, moments of panic, and months of grim determination. It suffered dangerous fires, a nearly catastrophic collision, and darkened periods of out-of-control tumbling.

Mir soared as a symbol of Russia’s past space glories and her potential future as a leader in space. And it served as the stage—history’s highest stage—for the first large-scale, technical partnership between Russia and the United States after a half-century of mutual antagonism.”

Despite all the geopolitical rivalry and grandstanding that motivated incredible breakthroughs like Mir (and for that matter the Moon landing), the value and legacy of these achievements go far beyond whatever small-mindedness spurred them. Wrapped up in all this brinkmanship was—and still is—a vision of progress for all of humanity.

A fun note about the name: The word mir is Russian for “peace”, “world”, or “village”, and has historical significance: When Tsar Alexander II abolished serfdom (virtual slavery) in 1861, freeing over 23 million people, mir was used to describe peasant communities that thereafter managed to actually own their land, rather than being tied to the land of their lord.

Photos courtesy of Wikimedia.

The Corrupt Bargain of 1825

Today is the anniversary of a largely forgotten episode that reminds us how the U.S. has always struggled with messy politics and ambiguous or flawed electoral rules.

It was on this day in 1825 that the House of Representatives chose John Quincy Adams to be president, after no candidate received a majority of electoral votes in the previous year’s presidential election.

There had been four candidates on the ballot: Adams, Henry Clay, Andrew Jackson, and William H. Crawford. Pursuant to the Twelfth Amendment, only the top three finishers in the electoral vote could be considered by the House, eliminating Henry Clay.

Many were surprised that Adams was elected over Jackson, who still had the most electoral votes. The representatives of all the states that had gone for Clay in the Electoral College supported Adams.

Clay was the Speaker of the House and arguably the most powerful person in Congress. It was widely believed he used his influence to convince the House to elect Adams, who then made Clay his Secretary of State (then, as now, considered the most prestigious and influential office after the presidency itself). Jackson’s supporters denounced this as a “corrupt bargain” and launched a four-year campaign of revenge, claiming that the people had been cheated of their choice.

In a now familiar refrain, these “Jacksonians” attacked the Adams administration at every turn as illegitimate and tainted by elitism and corruption.

More to the point, John Quincy was the son of the second president, John Adams, and his election as only the sixth president sparked a discussion about political dynasties that resurfaced recently with the Bush and Clinton candidacies.

Towards a Universal Time

On this day in 1879, at a meeting of the Royal Canadian Institute in Toronto, Scottish-Canadian engineer and inventor Sandford Fleming proposed the idea of standard time zones based on a single universal world time. He suggested that standard time zones could be used locally, but would follow a single world time, which he called “Cosmic Time”. The proposal divided the world into twenty-four time zones, each one covering 15 degrees of longitude. All clocks within each zone would be set to the same time, but would differ by one hour from those in the neighboring zones.

An amazing innovation we take for granted. Wikimedia Commons.
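To give a sense of the arithmetic, here is a minimal sketch, in Python, of how a 15-degree scheme maps longitude to a whole-hour offset. It is only an illustration of the idea; the function name and the example coordinates are my own, not Fleming’s notation.

```python
# Minimal sketch of the arithmetic behind a 15-degree time-zone scheme:
# each zone spans 15 degrees of longitude and differs from its neighbors by one hour.

def zone_offset_hours(longitude_deg: float) -> int:
    """Nominal whole-hour offset from the prime meridian for a given longitude,
    with east taken as positive and west as negative."""
    return round(longitude_deg / 15)

# Toronto sits near 79.4 degrees west (longitude -79.4):
print(zone_offset_hours(-79.4))   # -5, i.e. five nominal zones west of Greenwich
# Tokyo sits near 139.7 degrees east:
print(zone_offset_hours(139.7))   # +9, nine nominal zones east of Greenwich
```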

We take time zones for granted today, but for most of human history, time was kept locally, based on how people in each town or city measured the position of the sun. Most humans did not travel beyond their community, and the few who did would take days or weeks to do so, so it never really made a difference whether one city was hours off from another one.

But the development of rail travel in the late 19th century posed a huge challenge. For the first time in history, people were crossing through multiple towns in the span of hours, leading to the absurd practice of having to continually reset clocks throughout the day.

An 1853 “Universal Dial Plate” showing the relative times of “all nations” before the adoption of universal time. Wikimedia Commons.

As the first country to industrialize, Great Britain was the first to deal with this problem on a large scale; in response, it established Greenwich Mean Time, which was the mean solar time on the Prime Meridian at Greenwich, England. (Ironically, this had been developed to resolve the same issue with respect to ocean navigation. Each country had its own prime meridian in its navigational charts to serve as something of a starting point; it usually passed through the country in question, until the navally dominant British developed the Prime Meridian that most others would soon follow.) Clocks in Britain were set to this time irrespective of local solar noon.

Anyway, Fleming’s proposal gave rise to a flurry of international discussion about how to address this issue. The British government even forwarded his work to eighteen foreign countries and several scientific bodies to help determine a solution. The United States called an “International Meridian Conference” in 1884 that gathered delegates from around the world to set up a universally recognized basis for time. It was announced that Greenwich Mean Time would be used, for the simple reason that by then, two-thirds of all nautical charts and maps already used it as their prime meridian.

By 1972, all major countries had adopted time zones based on the Greenwich meridian (since 1935, called “Universal Time”). The saga of universal time is yet another example of our species’ move towards a more global and interconnected community.

The Tiny African Country Taking on a Genocidal Government

Meet Abubacarr Tambadou, the Justice Minister of The Gambia—a tiny African country barely twice the size of Delaware and with fewer people than Miami-Dade County—who is taking on one of the worst genocides of the 21st century.

Under his direction, The Gambia is the only country to file a claim in the International Court of Justice (ICJ) against Myanmar for violating the Genocide Convention through its persecution of the Rohingya Muslims, which has killed tens of thousands and driven out over a million more. Tambadou also convinced the 57-member Organisation of Islamic Cooperation to back the effort, bringing a fourth of the world behind him.

Born in 1972 as one of the middle children of 18 siblings, he considered himself lucky for his middle-class upbringing. He had no intention of studying law—having excelled in sports all his life—but the first offer he got was a law program at a British university. After graduating in the 1990s, he returned home to be a public prosecutor.

At the time, Gambia was ruled by a vicious dictator who frequently killed and tortured real or perceived political opponents. In 2000, when security forces killed over a dozen student protestors, Tambadou was roused into pursuing human rights work.

To that end, he soon left Gambia to join the United Nations’ Tanzania-based International Criminal Tribunal for Rwanda (ICTR), where he successfully prosecuted some of the genocide’s most notorious perpetrators, including former army chief Augustin Bizimungu, who was sentenced to 30 years in prison.

As he told the BBC, what he was doing “was not just prosecuting the Rwandan genocidaires”, but “was a way for us Africans to send a message to our leaders… I saw it as more of an African struggle for justice and accountability than a Rwandan one.”

Sure enough, in 2017, Gambia’s dictator fell after 22 years of power. Opposition leader Adama Barrow took power promising to restore human rights and address corruption, prompting Tambadou to return to help lead this effort.

“Twenty-two years of a brutal dictatorship has taught us how to use our voice. We know too well how it feels like to be unable to tell your story to the world, to be unable to share your pain in the hope that someone out there will hear and help.”

A devout Muslim with a prominent prayer bump on his forehead, Tambadou acknowledged that Islamic solidarity was a factor behind Gambia and the OIC’s actions but emphasized that “this is about our humanity ultimately”.

Indeed, it was after visiting a refugee camp in Bangladesh full of genocide survivors that he was spurred to act. Last spring, The Gambia’s foreign minister pulled out at the last minute from the annual conference of the OIC in Bangladesh, sending Tambadou instead. While there, he joined an OIC delegation visiting overcrowded refugee camps, hearing stories of children burnt alive and women systematically raped; he claimed he could even smell the stench of dead bodies from across the border.

“I saw genocide written all over these stories”, he said in an interview, no doubt making the connection between these accounts and what he had learned after ten years prosecuting Rwandan perpetrators for similar crimes.

His case against Myanmar—which took the world by storm—has for the first time forced its leaders to answer for their alleged crimes. Though the case will no doubt take years to resolve—given the high bar set to prove genocide—the ICJ has since ordered Myanmar to cease its actions against the Rohingya, not buying the argument that the violence is simply the result of a broader military conflict.

Yes, I know: It’s a toothless order given the nature of international law. But it’s powerful nonetheless, as many Rohingya themselves agree:

Yet the mere fact that it took place at all counts as a huge moral victory for the Rohingya. For the first time, this group — which has endured decades of systematic discrimination at the hands of its own government — experienced a fair hearing from an impartial tribunal. The power of that realization prompted tearful reactions from Rohingya activists in The Hague.

“It was very emotional to see the military facing charges in a court for the first time,” U.K.-based Rohingya activist Tun Khin told me. “The military have been getting away with human rights violations against us for decades. We have worked so hard for this day.”

And to think it began with a public prosecutor of a small country most have never heard of.

Indeed, Mr. Tambadou thinks this is the time for The Gambia to reclaim its position on the world stage. “We want to lead by example” in human rights. “The case at ICJ is Gambia showing the world you don’t have to have military power or economic power to denounce oppressions. Legal obligation and moral responsibility exist for all states, big or small.”

Sources: BBC, Reuters, CFR