Happy Anniversary to a Famously Humanist Take on Christmas

On this day in 1843, A Christmas Carol by English author Charles Dickens was first published (first edition pictured below), arguably influencing Christmas as we know it more than any pagan tradition. In fact, the phrase “Merry Christmas” was popularized by the story!

Left-hand page shows Mr and Mrs Fezziwig dancing; the right-hand page shows the words “A Christmas Carol. In Prose. Being a Ghost Story of Christmas by Charles Dickens. With Illustrations by John Leech.”
Wikimedia Commons

Dickens was ambivalent about religion; while he was likely a Christian and admired Jesus, he openly disliked rigid orthodoxy, evangelicalism, and organized religion. (He once published a pamphlet opposing a ban on games on the Sabbath, arguing that people had a right to pleasure.)

Accordingly, A Christmas Carol placed less emphasis on faith and observance, focusing instead on family, goodwill, compassion, and joy. Dickens brought his more humanist approach to the holiday, constructing Christmas as a family-centered festival of generosity, feasting, and social cohesion. Some scholars have even termed this the “Carol Philosophy”.

So when religious and nonreligious folks alike think of loved ones and the “Christmas spirit”, they are basically channeling Dickens’ once-unique take on the holiday. (Though in his time, other British writers had begun to reimagine Christmas as a celebratory holiday, rather than a strictly religious occasion.)

The Seeds of the International Order

Reposted from @ungeneva:

Geneva, capital of the world, was crowded to capacity today when representatives of nearly half a hundred nations from every corner of the globe gathered to attend the first meeting of the assembly of the League of Nations.

One hundred years ago this week, the first session of the assembly of the newly established League of Nations was held in the Reformation Hall in Geneva. The meeting brought together delegates from 42 countries, representing more than half of the world’s population at the time.

Archive photo/League of Nations

Though the League of Nations is better known for its abject failure to prevent World War II—which led to its replacement by the United Nations in 1945—it is difficult to overstate its audacious vision: for the first time in our bloody and divided history, there was a sense of cooperation and community among our fractured civilizations. The League set in motion the growing global consciousness and interconnectedness we see to this day (however tenuous). It also brought attention to issues long overlooked or dismissed by most societies: poverty, slavery, refugees, epidemics, and more. It thus laid the groundwork for organizations that aid tens of millions of people worldwide.

Ironically, while the League failed to stop the bloodiest war in history, its successor, the UN, has been credited with preventing any large interstate conflict to this day—in part because it carried forward the League’s idea of a forum where countries can duke it out at the table rather than on the battlefield (to paraphrase Eisenhower). We’ve got a hell of a long way to go, but we have to start somewhere, and this 100-year experiment in internationalism and pan-humanism is still young against thousands of years of constant war and repression.

Thank you for your time!

The Swedes Who Saved Millions of Lives

Meet Nils Bohlin and Gunnar Engellau, whose work at Swedish carmaker Volvo has helped save millions of lives worldwide.

Engellau, Volvo’s president and an engineer himself, pushed for a more effective seatbelt after a relative died in a traffic accident, due partly to the flaws of the two-point belt design—which was not even a standard feature in cars at the time. This personal tragedy drove Engellau to seek a better design, and he hired Bohlin to deliver one quickly.

There were two major problems with the historic two-point belt design, which crosses the lap only. First, because the human pelvis is hinged, a single strap fails to restrain the torso, leaving passengers vulnerable to severe head, chest, and spinal injuries; positioned poorly, the belt can even crush internal organs on impact. Second, such belts were notoriously uncomfortable, so many people simply chose not to wear them. Bohlin’s innovation was a design that resolved both problems at once: a lap strap and a diagonal shoulder strap meeting at a buckle beside the hip, restraining both pelvis and torso while remaining simple enough to fasten with one hand.
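
For a rough sense of the forces a belt has to manage, here is a back-of-the-envelope calculation in Python; the mass, speed, and stopping time are illustrative assumptions of mine, not Volvo test data.

```python
# Back-of-the-envelope crash physics: average force on an occupant's torso.
# All inputs are illustrative assumptions, not Volvo test data.

MASS_TORSO_KG = 40.0   # assumed effective mass of an adult upper body
SPEED_KMH = 50.0       # assumed impact speed
STOP_TIME_S = 0.1      # assumed duration of the crash pulse

speed_ms = SPEED_KMH / 3.6            # km/h -> m/s
decel = speed_ms / STOP_TIME_S        # average deceleration (m/s^2)
force_n = MASS_TORSO_KG * decel       # Newton's second law: F = m * a

print(f"~{decel / 9.81:.0f} g deceleration, ~{force_n / 1000:.1f} kN on the torso")
# With a lap-only belt, the unrestrained upper body takes this load as it
# jackknifes forward; a three-point belt spreads it over ribcage and pelvis.
```

With these assumed numbers the torso alone sees roughly 14 g and several kilonewtons, which is why restraining only the lap was never enough.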

After millions of dollars and thousands of tests through the 1950s and 1960s, Volvo became the first carmaker in the world to standardize the three-point safety belt we now take for granted. More than that, Volvo pushed hard for seatbelt adoption in its native Sweden, which, like most places, initially resisted mandatory seatbelt use.

But Volvo didn’t stop there. While it patented the design to protect its investment from copycats, the company neither charged rivals significant license fees nor kept the design to itself to give its own cars an edge. Knowing that lives were at stake worldwide, Engellau made Bohlin’s patent immediately available to all. Having sponsored the costly R&D, Volvo gifted the design to competitors to encourage mass adoption. It is estimated that Volvo may have lost out on $400 million in additional profits, if not more.

Instead, literally millions of people have been spared injury and death by the now-ubiquitous three-point belt. All because a couple of Swedes decided to put people over profits (which isn’t to say they reaped no financial reward; they proved you can do both).

The World Food Programme

To many observers, especially in the United States, this year’s winner of the Nobel Peace Prize may seem uninspired, if not unfamiliar. It is an organization, rather than a person, and its work is probably not as widely known and appreciated as it should be.

Yet the United Nations World Food Programme (WFP) is no less deserving of the honor (especially since over two dozen entities have won the Peace Prize before, including the United Nations itself). It is the largest humanitarian organization in the world, and the largest focused on hunger, malnutrition, and food insecurity, providing critical food assistance to nearly 100 million people across 88 countries. Tens of millions would starve without its fleet of 5,600 trucks, 30 ships, and nearly 100 planes delivering more than 15 billion rations, at just 61 cents each. Remarkably, WFP funds all this work entirely through voluntary donations, mostly from governments.

Laudable as all that might be, it’s fair to ask what this work has to do with peace. Two-thirds of WFP’s work is done in conflict zones, where access to food is threatened by instability, violence, and even deliberate war tactics. Amid war and societal collapse, people are likelier to die from starvation, or from opportunistic diseases that strike their malnourished immune systems. Since its experimental launch in 1961, WFP has delivered aid through some of the most devastating crises in history, including the Rwandan genocide, the Yugoslav Wars, and the 2004 Indian Ocean tsunami. (It became a permanent UN agency in 1965, having proven its worth by mustering substantial aid to earthquake-stricken Iran in 1962, initiating a development mission in Sudan, and launching its first school meals project in Togo.)

As The Economist points out, the focus on hunger is a sensible one: not only have famine and malnutrition destroyed millions of lives across history, but they remain pressing concerns in the face of the pandemic, climate change, and renewed conflict.

Governments everywhere are desperate to bring an end to the pandemic. But hunger has been growing quietly for years, and 2019 was the hungriest year recorded by the Food Security Information Network, a project of the WFP, the Food and Agriculture Organisation and other NGOs, which since 2015 has been gathering data on how many people worldwide are close to starvation. The rise was largely a consequence of wars in places like South Sudan, Yemen and the Central African Republic. This year, thanks to the covid-19 pandemic, things are likely to be far worse. Rather than war, this year it is the dramatic falls in the incomes of the poorest people that is causing hunger. There is as much food to go around, but the poor can no longer afford to buy it. The number of hungry people might double, reckons the WFP, from 135m in 2019 to 265m at the end of this year.

Unfortunately, despite the increased (and likely to increase) need for its services—more people face hunger than at any time since 2012—the agency’s precarious budget, ever dependent on the whims of donors, is declining. Again, from The Economist:

Last year the organisation received $8.05bn from its donors, by far the biggest of which is the United States. This year so far it has received only $6.35bn. Many countries, such as Britain, link their aid budgets to GDP figures which have fallen sharply. Britain provided roughly $700m of the WFP’s funding in 2019. This year its aid budget will fall by £2.9bn ($3.8bn). Under Mr Trump America had turned away from funding big multilateral organisations even before the pandemic hit, though the WFP has escaped the fate of the WHO, to which Mr Trump gave notice of America’s withdrawal in July. In Uganda food rations for South Sudanese and Congolese refugees have been cut. In Yemen the WFP has had to reduce rations by half.

WFP estimates that seven million people have already died from hunger this year, and says it will need almost seven billion dollars over the next six months to avert looming famines worldwide. WFP’s head, a former U.S. Republican governor, is using the agency’s higher profile from the Nobel Prize to urge more funding from governments and especially billionaires (whose collective wealth climbed to over $10 trillion this past year).

World Mental Health Day

Today is World Mental Health Day, launched in 1992 at the urging of the World Federation for Mental Health and with support from the WHO, to raise awareness about one of the most misunderstood but increasingly pressing issues facing humanity.

Even the concept of mental health is fairly new in human history. What we now call mental illnesses were known, studied, and treated by the ancient Mesopotamians, Egyptians, Greeks, Romans, Chinese, and Indians. Some were called “hysteria” and “melancholy” by the Egyptians, and certain Hindu texts describe symptoms associated with anxiety, depression, and schizophrenia. The term “psychosis” comes from Greek roots meaning “principle of life/animation”, in reference to the condition of the soul.

In virtually every society up until the 18th century, mental illness was attributed to moral, supernatural, magical, and/or religious causes, usually with the victim at fault in some way. The Islamic world came closest to developing something like a mental health institution: “bimaristans” (hospitals) as early as the ninth century had wards dedicated to the mentally ill. The terms “crazy” (from Middle English meaning “cracked”) and “insane” (from Latin insanus, meaning “unhealthy”) came to denote mental disorder in medieval Europe.

In the mid-19th century, American doctor William Sweetser coined the term “mental hygiene” as a conceptual precursor to mental health. Advances in medicine, both technological and philosophical, quickly established connections between mental and physical health while downplaying the idea that moral or spiritual flaws were the cause. (The Greeks came close to this long before; Hippocrates, notably, argued that mental disorders had physical causes.)

But the dark takeaway from this was the so-called “social hygiene movement”, which saw eugenics, forced sterilization, and harsh experimental treatments as the solutions to mental and physical disabilities or divergences. Though the Nazis were the ultimate manifestation of this odious idea, their propaganda and policies cited much of the Western world, including the U.S., as standing with them in their efforts to “cleanse” populations. (In fact, the term “mental health” was devised after the Second World War partly to replace the now-poisoned idea of mental “hygiene”.)

While we have come a long way toward recognizing the evils and horrors of how we have treated mental illness—from ancient times to very recent history—abuse, misunderstanding, and neglect remain worldwide problems.

Hence I also want to take today to thank everyone throughout my life who has been so understanding, supportive, and affirming with respect to my own mental health struggles. I would never have broken through my anxiety- and depression-induced barriers without a loving and compassionate social support structure along the way (to say nothing of my relative socioeconomic privilege; lack of money and access unfortunately remains the most common barrier to mental health treatment in the U.S.).

I am certainly luckier than most. Mental illnesses are more common in the U.S. than cancer, diabetes, or heart disease, which are far better known and addressed. Over a quarter of all Americans over the age of 18 meet the criteria for having a mental illness. Youth mental health has become especially dire, with 13% of youth reporting a major depressive episode within the past year, of whom only 28% received treatment. And over 90% of Americans with a substance abuse issue (which is usually tied to mental health) receive no treatment.

Worldwide, one in four people will endure a mental health episode in their lifetime. Depressive disorders are already the fourth leading cause of the global disease burden, and will likely rank second by the end of 2020, behind only ischemic heart disease. According to the World Health Organization (WHO), the global cost of mental illness—in terms of treatment, lost productivity, and so on—was nearly $2.5 trillion in 2010, with a projected increase to over $6 trillion by 2030.

The tragedy is that most mental health issues can be treated with relative ease: 80% of people with schizophrenia can be free of relapses after one year of treatment with antipsychotic drugs combined with family intervention. Up to 60% of people with depression can recover with a proper combination of antidepressant drugs and psychotherapy. And up to 70% of people with epilepsy can be seizure-free with simple, inexpensive anticonvulsants. Even changing one’s diet can have an effect.

But over 40% of countries have no mental health policy, over 30% have no mental health programs, and around 25% have no mental health legislation. Nearly a third of countries allocate less than 1% of their total health budgets to mental health, while another third spend just 1% of their budgets on mental health. (The U.S. spent about 7.6% in 2001.)

In his book Lost Connections: Uncovering the Real Causes of Depression – and the Unexpected Solutions, Johann Hari explores the environmental and socioeconomic factors that contribute to poor mental health, and how these are often neglected in discussions of, and approaches to, depression and anxiety.

Someone could meditate, think positively, or pursue therapy all they want, but if they are rationing insulin to stay alive, cannot find affordable housing, struggle to find a well paying job, and are otherwise at the mercy of external forces that leave them fundamentally deprived, such treatments—however effective and beneficial in many contexts—can only go so far.

He illustrates this perfectly with the following account:

In the early days of the 21st century, a South African psychiatrist named Derek Summerfield went to Cambodia, at a time when antidepressants were first being introduced there. He began to explain the concept to the doctors he met. They listened patiently and then told him they didn’t need these new antidepressants, because they already had antidepressants that work. He assumed they were talking about some kind of herbal remedy.

He asked them to explain, and they told him about a rice farmer they knew whose left leg was blown off by a landmine. He was fitted with a new limb, but he felt constantly anxious about the future, and was filled with despair. The doctors sat with him, and talked through his troubles. They realised that even with his new artificial limb, his old job—working in the rice paddies—was leaving him constantly stressed and in physical pain, and that was making him want to just stop living. So they had an idea. They believed that if he became a dairy farmer, he could live differently. So they bought him a cow. In the months and years that followed, his life changed. His depression—which had been profound—went away. ‘You see, doctor,’ they told him, the cow was an ‘antidepressant’.

To them, finding an antidepressant didn’t mean finding a way to change your brain chemistry. It meant finding a way to solve the problem that was causing the depression in the first place. We can do the same. Some of these solutions are things we can do as individuals, in our private lives. Some require bigger social shifts, which we can only achieve together, as citizens. But all of them require us to change our understanding of what depression and anxiety really are.

This is radical, but it is not, I discovered, a maverick position. In its official statement for World Health Day in 2017, the United Nations reviewed the best evidence and concluded that ‘the dominant biomedical narrative of depression’ is based on ‘biased and selective use of research outcomes’ that ‘must be abandoned’. We need to move from focusing on ‘chemical imbalances’, they said, to focusing more on ‘power imbalances’.

I can only hope that as mental health becomes less stigmatized—less a matter of superstition, genetic inferiority, or moral and individual failing—we can work towards building fairer and more just societies that promote human flourishing, physically, mentally, and spiritually.

Source: WHO

The Little Satellite that Triggered the Space Age

On this day in 1957, the Soviet spacecraft Sputnik 1, the first artificial satellite to orbit the Earth, was launched from the Baikonur Cosmodrome (the first, largest, and most active spaceport to this day). Thus began a series of pioneering Soviet firsts—from robotic lunar landings to explorations of Venus—that would in turn trigger the Space Race with America, culminating in the Moon landings.

Image: Air & Space Magazine

Ironically, despite the centralized and authoritarian nature of the Soviet political system, the U.S.S.R. never developed a single coordinating space agency like NASA. Instead it relied on several competing “design bureaus” led by brilliant and ambitious chief engineers vying to produce the best ideas. In other words, these Cold War rivals embraced space exploration with the other side’s philosophy: the Americans were more government-centered, while the Russians went with something closer to a free market. (Of course, this oversimplifies things, since the U.S. relied, and still relies, on independent contractors.)

Sergei Korolev. Image: Wikipedia

Hence Sputnik was the product of six different entities, from the Soviet Academy of Sciences to the Ministry of Defense and even the Ministry of Shipbuilding. The satellite had been proposed and designed by Sergei Korolev, a visionary rocket scientist who also designed its launcher, the R-7, the world’s first intercontinental ballistic missile. He is considered the father of practical astronautics, having played a leading role in launching the first animal and the first human into space, with plans to land on the Moon before his unexpected death in 1966—three years before the U.S. would achieve that feat (who knows whether the Russians would have made it had Korolev lived).

As many of us know, Sputnik’s launch led to the so-called “Sputnik crisis”, which triggered panic and even hysteria among Americans, who feared the “free world” had been outdone by the communists and that American prestige, leadership, scientific achievement, and even national security were all at stake. (After all, the first ICBM had just been used to launch the satellite, and could very well do the same with nukes.)

Surprisingly, neither the Soviet nor the American government placed much importance on Sputnik, at least not initially. The Russian response was pretty low-key, as Sputnik was not intended for propaganda. The official state newspaper devoted only a few paragraphs to it, and the government had kept private its advances in rocketry and space science, which were well ahead of the rest of the world.

The U.S. government response was also surprisingly muted, far more so than the American public’s. The Eisenhower Administration already knew what was coming thanks to spy planes and other intelligence. Not only did it try to play the launch down, but Eisenhower himself was actually pleased that the U.S.S.R., and not the U.S., would be the first to test the waters of the new and uncertain frontier of space law.

But the subsequent shock and concern caught both the Soviet and American governments off guard. The U.S.S.R. soon went all-in with propaganda about Soviet technological expertise, especially as the Western world had long propagandized its superiority over the “backward” Russians. The U.S. poured money and resources into science and technology, creating not only NASA but DARPA, which is best known for planting the seeds of what would become the Internet. There was a new government-led emphasis on science and technology in American schools, with Congress enacting the 1958 National Defense Education Act, which provided low-interest loans for college tuition to students majoring in math and science.

After the launch of Sputnik, one poll found that one in four Americans thought Russian science and engineering were superior to America’s; by the following year, that figure had stunningly dropped to one in ten, as the U.S. began launching its own satellites into space. The U.S.-run GPS system also traces back to Sputnik: American physicists realized they could determine the satellite’s orbit from the Doppler shift of its radio beacon, and that, inverted, a known orbit could pinpoint a listener’s position on the ground.
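
To illustrate the principle, here is a minimal Python sketch of the Doppler effect on Sputnik’s beacon. The 20.005 MHz beacon frequency is historical; the radial velocities and function names are my own illustrative choices, not measured data.

```python
# The Doppler principle behind satellite tracking and, later, GPS's precursor.
# Sputnik's 20.005 MHz beacon frequency is historical; the radial velocities
# below are illustrative values, not measured data.

C = 299_792_458.0     # speed of light (m/s)
F_BEACON = 20.005e6   # Sputnik 1's radio beacon (Hz)

def observed_frequency(radial_velocity: float) -> float:
    """Received frequency for a transmitter with the given radial velocity
    (positive = receding), using the non-relativistic approximation,
    which is ample at orbital speeds."""
    return F_BEACON * (1.0 - radial_velocity / C)

# As the satellite passes overhead, its radial velocity sweeps from approaching
# (negative) through zero to receding (positive). The shape and timing of that
# frequency sweep determine the orbit -- and, inverted, a known orbit locates
# the listener: the idea behind Transit, the precursor to GPS.
for v in (-7_000.0, 0.0, 7_000.0):  # ~7 km/s is a typical low-orbit speed
    shift = observed_frequency(v) - F_BEACON
    print(f"radial velocity {v:+8.0f} m/s -> Doppler shift {shift:+7.1f} Hz")
```

A pass overhead thus traces a characteristic downward sweep of a few hundred hertz, which is exactly what physicists tracking Sputnik measured from the ground.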

The response to Sputnik was not entirely political, fearful, or worrisome. It was also a source of inspiration for generations of engineers, scientists, and astronauts across the world, even in the rival U.S. Many saw it optimistically as the start of a great new space age. The aeronautic designer Harrison Storms—responsible for the X-15 rocket plane and a head designer for major elements of the Apollo and Saturn V programs—claimed that the launch of Sputnik moved him to think of space as the next step for America. Astronauts Alan Shepard, the first American in space, and Deke Slayton, one of the “Mercury Seven” who led early U.S. spaceflights, later wrote of how the sight of Sputnik 1 passing overhead inspired them to pursue their record-breaking new careers.

Who could look back and imagine that this simple, humble little satellite would lead us to where we are today? For all the geopolitical rivalry involved, Sputnik helped usher in tremendous hope, progress, and technological achievement.

International Day of Clean Air for Blue Skies

Aside from Labor Day in the U.S., today is the first International Day of Clean Air for Blue Skies, which was established by the United Nations General Assembly to bring awareness to the largest environmental risk to public health globally: air pollution.

Over 90% of the world’s population is exposed to polluted air, which causes an estimated seven million premature deaths every year (more than cigarette smoking) and leaves millions more with chronic health problems like asthma and cognitive decline.

Fortunately, the world has a precedent for successful action: over 30 years ago this month, the UN-sponsored Montreal Protocol saw literally every country commit to working together to eliminate CFCs, which were severely depleting the ozone layer; it remains one of the few treaties with universal ratification. Only 14 years passed between the discovery of the problem and the world’s commitment to resolve it—and we’ve already seen the results.


A few years ago, it was confirmed that the ozone layer is slowly recovering, and most projections show it fully healing within the next four decades. In an era of rising conflict and poor global leadership, this unlikely and little-known success story of international cooperation is a glimmer of hope.

The Developing Countries Winning Against COVID-19

It’s been heartening to see that many poorer countries and regions are faring a lot better than expected. For all the death and suffering that has occurred, it’s important to acknowledge the deaths and pain that haven’t—and to derive some important lessons, since these are places that don’t have our wealth and resources.

Costa Rica has had one of the most successful pandemic responses in the world. It was among the first Latin American countries to record a case—which is actually indicative of its open and efficient monitoring—and citizens have been able to lean on its universal healthcare system, on which it spends a higher proportion of its GDP than the average rich country (and it subsequently has one of the world’s highest life expectancies). It implemented nationwide lockdowns and testing quickly, and has done a good enough job that it started partially lifting restrictions as early as May 1st—albeit with strict limits (only a quarter of seats can be filled in sporting venues, while small businesses are limited in the number of customers they can serve).

The country’s president, Carlos Alvarado, has been transparent: “We have had relative and fragile success, but we cannot let our guard down.” Hence the borders will remain closed until at least this Friday, and driving restrictions stay in place to keep the virus from spreading: driving at night is banned, and cars may be driven only on certain days, depending on their license plate number.
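
As a toy illustration of how such a plate-based rotation works, here is a short Python sketch; Costa Rica’s actual schedule varied throughout the pandemic, so the digit-to-day mapping and curfew hours below are invented for illustration only.

```python
# A toy model of a license-plate driving rotation plus a night curfew.
# The digit-to-day mapping and curfew hours are invented for illustration;
# Costa Rica's real schedule changed repeatedly during the pandemic.

RESTRICTED_DIGITS = {          # weekday -> final plate digits barred that day
    "Monday": {1, 2},
    "Tuesday": {3, 4},
    "Wednesday": {5, 6},
    "Thursday": {7, 8},
    "Friday": {9, 0},
}
CURFEW_HOURS = set(range(22, 24)) | set(range(0, 5))  # assumed night ban

def may_drive(plate: str, weekday: str, hour: int) -> bool:
    """True if a car with this plate may be on the road at the given hour."""
    if hour in CURFEW_HOURS:                     # night driving is banned
        return False
    last_digit = int(plate[-1])                  # rotation keys on final digit
    return last_digit not in RESTRICTED_DIGITS.get(weekday, set())

print(may_drive("SJB-123", "Monday", 14))   # True: digit 3 is fine on Mondays
print(may_drive("SJB-123", "Tuesday", 14))  # False: digit 3 is barred Tuesdays
print(may_drive("SJB-123", "Monday", 23))   # False: night curfew
```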

Ghana and Rwanda—which hardly come to mind as world-class innovators—each teamed up with an American company to become the first countries in the world to deliver medical aid and tests via drone to out-of-reach rural areas. Doctors and health facilities use an app to order blood, vaccines, and protective equipment, which get delivered in just minutes. Rwanda, which has become a little-known but prominent tech hub, started using drones as early as 2016 to serve 21 hospitals; now the drones serve close to 2,500 hospitals and health facilities across Rwanda and Ghana.

Vietnam (with almost 100 million people) and the Indian state of Kerala (with a population roughly that of California) both learned from previous outbreaks and acted quickly and decisively to contain this one. As The Economist put it, despite their poverty, they have “a long legacy of investment in public health and particularly in primary care, with strong, centralised management, an institutional reach from city wards to remote villages and an abundance of skilled personnel.” Lack of wealth did not stop them from making the necessary investments.

Uzbekistan, a former Soviet republic that’s hardly a household name, has pioneered remote learning. Two days after its lockdown, the Ministry of Public Education announced an unprecedented plan to roll out virtual courses and resources for its 6.1 million school students. In a matter of days, it had over 350 video lessons ready to go live on national TV channels; the lessons are available in the dominant languages of Uzbek and Russian, as well as in sign language. Free data access has been granted to educational platforms, making them accessible to all school students and their parents. An average of 100 video classes are being prepared daily.

While it is too soon to tell what’s in store for these nations in the long term, they have proven that you don’t need lots of wealth and power to develop an effective and humane response to crises. If anything, their poverty and historic challenges have made them more resourceful and decisive, thus providing useful lessons for the rest of the world.

The Singapore of Africa

It’s amazing how the fate of a nation can change in the span of decades. In 1994, the tiny central African nation of Rwanda seemed to suddenly succumb to a level of carnage not seen since the Second World War (notwithstanding other under-the-radar conflicts like the Congo Wars).

Over a period of just 100 days, up to 1 million people were slaughtered by paramilitaries and even by friends and neighbors, mostly with machetes and small arms. Already poor and politically dysfunctional, Rwanda was ignored and let down by the international community even in its most calamitous state—how could it ever recover from such an orgy of bloodshed and neglect?

Well, more than a quarter-century later, recover it has. While still undemocratic and underdeveloped, it has made incredible strides for a nation that endured one of the most horrific genocides in human history. How could Rwanda, of all places, become so stable and economically sophisticated?

“There are a few fundamentals you have to understand. Firstly, our country is the same size as the U.S. state of Maryland, but our population is around 12 million people. Secondly, we have no natural resources — no oil or gold or anything else that countries benefit from,” explains Claudette Irere, director general at Rwanda’s Ministry of Youth and Information and Communication Technology. “This means the only way for us to move forward and to build our future is to empower people and make good use of technology. With this strategy, we are shifting our country from an agrarian economy to a knowledge-based economy.”

Rwanda is beginning to leapfrog developed countries in fundamental areas such as smart city infrastructure, vocational training, and strategic foreign investment. As of January this year, 4G/LTE networks cover more than 95 percent of the country, and a mix of public and private players are working together on a national roll-out of fiber-optic broadband. As its citizens and businesses get connected, Kigali is becoming an African hub for multinational tech companies, including Google, Facebook, and Amazon.

[…]

Between 2001 and 2014, Rwanda achieved an annual growth rate of 9 percent and earned a global reputation as an attractive business destination. According to the World Bank’s 2018 Ease of Doing Business Index, Rwanda has risen above countries like Italy, Belgium, and Israel to become the 41st most business-friendly nation on earth. Rwanda was also the index’s biggest business reformer, with activities like starting up, registering property, paying taxes, and enforcing contracts all becoming increasingly easier in the country.

“Urbanization is becoming more of a challenge for things like traffic and public transportation. This creates a lot of opportunities for technology and innovation,” Irere says. “Working with global companies that lead in areas such as the internet of things (IoT) is helping us understand the problems we must solve before our city grows beyond our control.”

Kigali has even developed a culture of digital entrepreneurship that seems straight out of Silicon Valley.

One local company that’s making the most of Kigali’s digital infrastructure is ride-hailing app SafeMotos, founded by a Canadian entrepreneur who fell in love with the city. Road traffic collisions are a significant problem in Rwanda and its neighboring countries, with 40 percent more road deaths occurring per 100,000 people than in low- and middle-income nations in any other part of the world. To combat this problem, SafeMotos provides drivers with smartphones and pulls data from an app to measure their performance on trips. Customers are connected only with drivers who meet a certain safety threshold — an algorithmic score of at least 90 out of 100.
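
To make that idea concrete, here is a hypothetical sketch of such a threshold filter in Python. SafeMotos’ real scoring model is not public, so the telemetry features, weights, and names below are invented for illustration; only the 90-out-of-100 threshold comes from the passage above.

```python
# A hypothetical driver-safety score and matching filter.
# Features, weights, and the scoring formula are invented for illustration;
# only the >= 90/100 threshold comes from the article.

from dataclasses import dataclass

@dataclass
class Trip:
    harsh_brakes_per_km: float  # sudden decelerations sensed by the phone
    speeding_ratio: float       # fraction of the trip above the speed limit
    swerve_index: float         # 0..1 proxy for erratic handling

def safety_score(trips: list[Trip]) -> float:
    """Map trip telemetry to a 0-100 score; higher means safer driving."""
    if not trips:
        return 0.0
    avg_penalty = sum(
        25 * t.harsh_brakes_per_km + 40 * t.speeding_ratio + 35 * t.swerve_index
        for t in trips
    ) / len(trips)
    return max(0.0, 100.0 - avg_penalty)

THRESHOLD = 90.0  # drivers below this score are not matched with customers

def eligible(drivers: dict[str, list[Trip]]) -> list[str]:
    return [name for name, trips in drivers.items()
            if safety_score(trips) >= THRESHOLD]

demo = {
    "driver_a": [Trip(0.05, 0.02, 0.1)],   # careful: score ~94.5 -> eligible
    "driver_b": [Trip(0.4, 0.25, 0.5)],    # risky: score 62.5 -> filtered out
}
print(eligible(demo))  # ['driver_a']
```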

To be sure, Rwanda is far from idyllic. The moniker “Singapore of Africa” is apt in more ways than one: not only is it an island of relative stability, development, and technological progress, but like the southeast Asian city-state, it is also quietly authoritarian. Its president, Paul Kagame, who is credited with helping defeat the genocidal regime and carrying the country through to its recovery, has been in power since 1994—and is slated to remain in power until 2034, thanks to changes to the constitution that he has presided over. He won 99 percent of the vote in the most recent election, while critics from journalists to government officials have been imprisoned for their insolence; some may even have been assassinated.

The country is even pioneering the use of drones to deliver medical supplies to remote communities and to enforce its COVID-19 lockdown. And by the standards of Africa and even the world, Rwanda has one of the lowest rates of corruption, which no doubt accounts for a lot of its business success. Yet the regime’s dark side attracts strikingly little scrutiny abroad, as one journalist recounts:

“The most obvious example of this was the inquest into the assassination of Patrick Karegeya, Rwanda’s former head of external intelligence, who was strangled in a South African five-star hotel on New Year’s Eve 2013. In January 2019 the case opened in a courtroom on the outskirts of Johannesburg, a city where every news agency and broadcaster, from AP to Xinhua, AFP to Reuters, the BBC to DPA, has an office. It was a story on a par with the murder of Jamal Khashoggi or the poisoning of Alexander Litvinenko in terms of international interest, massively diplomatically embarrassing to the Rwandan government — the South African Hawks hold it directly responsible — and the courtroom was a ten minute taxi ride from various well-staffed newsrooms. When I turned up, I was astonished by the pathetic press turnout. At first I assumed that the victim’s family and lawyers just hadn’t been very efficient at getting the word out, but if anything, press attendance got worse as the days passed and more and more hugely awkward details — all of them wonderfully quotable as they were being revealed in court — were brought to light. A massive opportunity missed.”

Such repression is not only unjust and troubling in its own right; it threatens the country’s incredible progress over the 26 years since the genocide. Rwandans have demonstrated remarkable resilience, innovation, and creativity, but all that will be hard to maintain under the shadow of such a paranoid and stifling regime. I can only hope this promising success story can expand beyond economics to a free and democratic society—where such potential and prosperity can truly be unleashed.

Source: Lauren Razavi

What an Ancient Broken Femur Says About Civilization

There is an apocryphal story about the anthropologist Margaret Mead that has a timeless and universal message, though it’s more relevant now than ever.

Years ago, she was asked by a student what she considered to be the first sign of civilization in a culture. The student expected Mead to talk about clay pots, tools for hunting, grinding-stones, or religious artifacts.

But no. Mead said that the first evidence of civilization was a 15,000-year-old fractured femur found at an archaeological site. A femur is the longest bone in the body, linking hip to knee. In societies without the benefits of modern medicine, it takes about six weeks of rest for a fractured femur to heal. This particular bone had been broken and had healed.

Mead explained that in the animal kingdom, if you break your leg, you die. You cannot run from danger, you cannot drink or hunt for food. Wounded in this way, you are meat for your predators. No creature survives a broken leg long enough for the bone to heal. You are eaten first.

A broken femur that has healed is evidence that another person has taken time to stay with the fallen, has bound up the wound, has carried the person to safety and has tended them through recovery. A healed femur indicates that someone has helped a fellow human, rather than abandoning them to save their own life.

Many thanks to my friend Arthur K Burditt for sharing this.