The End of Smallpox


Yesterday, December 9th, came and went like any other day. But on that day in 1979, one of the most groundbreaking endeavors in human history was accomplished: a group of eminent scientists commissioned by the United Nations World Health Organization (WHO) certified the global eradication of smallpox, the only human disease thus far to have been completely eliminated from nature. The WHO officially confirmed and announced this momentous achievement a few months later:

Having considered the development and results of the global program on smallpox eradication initiated by WHO in 1958 and intensified since 1967 … Declares solemnly that the world and its peoples have won freedom from smallpox, which was a most devastating disease sweeping in epidemic form through many countries since earliest times, leaving death, blindness and disfigurement in its wake and which only a decade ago was rampant in Africa, Asia and South America.

Less than a decade before, the end of smallpox would have seemed the remotest possibility. As recently as 1967, the WHO had estimated that 15 million people contracted the disease, and that two million had died that year alone — the average number of annual deaths since the turn of the century.


Progress Across Boundaries

It is telling that all the Nobel Prizes this year — as in recent years — have thus far been awarded to multiple laureates, often of different nationalities and/or for research done in a country different from their birthplace. Like so much else nowadays, science is becoming an increasingly globalized endeavor, conducted across an international network of institutes, universities, labs, and other academic and scientific organizations.

Of course, this is nothing new: almost every human achievement, regardless of time or place, can trace its origins to gradual, supplementary, or parallel developments elsewhere. Mathematical principles, political concepts, artistic expressions — all of the contributors to these and other fields built (and continue to build) upon the work of predecessors or contemporaries, adding to or refining the growing pool of ideas along the way. Thanks to advances in technology, expanding access to education of all levels (especially in the developing world), and a growing sense of global consciousness, this historical development is accelerating.

Knowledge and talent know no boundaries, whether political, linguistic, or ethnic, and the more we facilitate the exchange of ideas and collaboration, the closer we will come to greater human progress. This is not easy, given both practical and cultural challenges, but neither is it utopian; thousands of years’ worth of cross-cultural progress, persisting to this very day, proves it can be done, and the world has much to show for it. Given how much more needs to be done — socially, scientifically, ideologically, etc. — we have all the more reason to keep it up.

World War Three?

I think people are too quick to invoke World War Three after every diplomatic scuffle, arms race, or rise in tensions.

Over the last two centuries, since the advent of the international system, there have been literally hundreds, if not thousands, of potential flash points for global war. Only twice have they resulted in global conflict, and those two wars were interrelated, each stemming from an intersection of factors unique to its time and place. Plus, it is obviously easier to notice the wars that occurred than the numerous potential wars that were averted or preempted.

Granted, those two wars killed over 70 million people and unleashed a level of destruction and barbarity that still remains incomprehensible. So fear of something like that happening again is perfectly justified, and we mustn’t be complacent – war has long been the natural state of humanity, and the last few decades have been unusual in their relative peacefulness.

But we should be measured in our caution and tone down the apocalyptic rhetoric, which all too often feels dangerously fatalistic, if not eager (there is a subset of people, generally religious, who seem to welcome world-ending events).

What are your thoughts?

New Human Ancestor Discovered

From the New York Times.

The new hominin species was announced on Thursday by an international team of more than 60 scientists led by Lee R. Berger, an American paleoanthropologist who is a professor of human evolution studies at the University of the Witwatersrand in Johannesburg. The species name, H. naledi, refers to the cave where the bones lay undisturbed for so long; “naledi” means “star” in the local Sesotho language.

In two papers published this week in the open-access journal eLife, the researchers said that the more than 1,550 fossil elements documenting the discovery constituted the largest sample for any hominin species in a single African site, and one of the largest anywhere in the world. Further, the scientists said, that sample is probably a small fraction of the fossils yet to be recovered from the chamber. So far the team has recovered parts of at least 15 individuals.

“With almost every bone in the body represented multiple times, Homo naledi is already practically the best-known fossil member of our lineage,” Dr. Berger said.

Besides introducing a new member of the prehuman family, the discovery suggests that some early hominins intentionally deposited bodies of their dead in a remote and largely inaccessible cave chamber, a behavior previously considered limited to modern humans. Some of the scientists referred to the practice as a ritualized treatment of their dead, but by “ritual” they said they meant a deliberate and repeated practice, not necessarily a kind of religious rite.

“It’s very, very fascinating,” said Ian Tattersall, an authority on human evolution at the American Museum of Natural History in New York, who was not involved in the research. “No question there’s at least one new species here,” he added, “but there may be debate over the Homo designation, though the species is quite different from anything else we have seen.”

Learn more about this seminal finding from National Geographic, with which Berger is associated (apparently, he is quite a prominent and accomplished figure in his field).

Nepal’s Citizens Step Up To Heal Nation

An often unreported part of almost any disaster response is the pivotal role played by the victims themselves. Whether directly impacted or not, citizens from all over the affected country come together to help one another and aid recovery.

NPR highlights how the beleaguered people of Nepal, long misgoverned and impoverished, have persevered through collaboration and generosity against one of the deadliest disasters in their nation’s history.

The Centennial of the Armenian Genocide

Today marks the 100th anniversary of one of the modern world’s first genocides, in which 800,000 to 1.5 million Armenian men, women, and children were systematically slaughtered by the Ottoman government. Here is an excerpt from Wikipedia:

The starting date is conventionally held to be 24 April 1915, the day Ottoman authorities rounded up and arrested, subsequently executing, some 250 Armenian intellectuals and community leaders in Constantinople. The genocide was carried out during and after World War I and implemented in two phases: the wholesale killing of the able-bodied male population through massacre and subjection of army conscripts to forced labour, followed by the deportation of women, children, the elderly and infirm on death marches leading to the Syrian desert. Driven forward by military escorts, the deportees were deprived of food and water and subjected to periodic robbery, rape, and massacre. Other indigenous and Christian ethnic groups such as the Assyrians and the Ottoman Greeks were similarly targeted for extermination by the Ottoman government, and their treatment is considered by many historians to be part of the same genocidal policy. The majority of Armenian diaspora communities around the world came into being as a direct result of the genocide.

Though once overshadowed by the better-known and better-documented Holocaust, the Armenian Genocide has in recent years become one of the most studied and publicized organized mass killings of the 20th century. Much of that can be attributed, ironically, to successive Turkish governments refusing to recognize the event as a genocide, instead downplaying the death toll and ascribing it to wartime conditions and concerns:

The Republic of Turkey’s formal stance is that the deaths of Armenians during the “relocation” or “deportation” cannot aptly be deemed “genocide”, a position that has been supported with a plethora of diverging justifications: that the killings were not deliberate or systematically orchestrated; that the killings were justified because Armenians posed a Russian-sympathizing threat as a cultural group; that the Armenians merely starved to death, or any of various characterizations referring to marauding “Armenian gangs”. Some suggestions seek to invalidate the genocide on semantic or anachronistic grounds (the word genocide was not coined until 1943). Turkish World War I casualty figures are often cited to mitigate the effect of the number of Armenian dead.

Nevertheless, twenty-three countries have thus far officially recognized the mass killings as genocide, a characterization that reflects the consensus among most genocide scholars and historians (including many in Turkey).

As evidenced by my reliance on Wikipedia, I lack the time to devote myself to this issue as it deserves. Instead, I will link you to some great sources beyond the well-written Wiki article.

The Guardian offers a quick rundown of how the genocide transpired and the current controversy regarding Turkey’s official denial. Vox goes a bit more in-depth with both the genocide and the subsequent campaign of denialism. NPR has an interesting story about one of the last Armenian-majority villages in Turkey and how it is faring, while the New York Times similarly explores the complex identity issues facing descendants of the genocide who still live in Turkey; the Times also has a collection of accounts from survivors or their descendants, including a couple by Turkish civilians who tried to help Armenians (as many certainly did).

I encourage everyone to do their part to learn about this human catastrophe, and for that matter to be aware of the many other genocides, before and since, that continue to blight our species to this day.

Why Do People Do The Opposite Of What They Are Told?

What is it about being told something, even politely or with good intentions, that makes us keen to do the opposite, at least on occasion? We all know about reverse psychology, which is perhaps one of the most mainstream and widely observed aspects of human behavior — but what makes us so stubborn about following advice or directions, whether from loved ones or authority figures?

Business Insider highlights three research-backed factors that explain this curiously widespread tendency.

1. Reactance: forbidden fruit tastes so much sweeter

When someone discourages you from doing something, you often feel that your freedom is being threatened, which motivates you to regain choice and control by doing exactly the opposite. Experiments show that children become more interested in a toy after they’re put under severe rather than mild pressure not to play with it, and children and adults become more likely to taste fatty foods when labels explicitly warn against them. One classic study even found support for the Romeo & Juliet effect: the more parents interfered with a romantic relationship, the stronger the feelings of love the couples developed over the next year. As Mark Twain once wrote, “Adam was but human… He did not want the apple for the apple’s sake, he wanted it only because it was forbidden.”

2. Rebound: whatever you do, don’t think about a white bear

When someone tells you not to think about something, your mind has a sneaky way of returning to that very thought. In a brilliant study led by psychologist Daniel Wegner, people were told not to think about a white bear. They spent the next 5 minutes thinking aloud, saying everything that came to mind, and ringing a bell if they spoke or thought of a white bear. They couldn’t escape the white bear: on average, it appeared in their thoughts every minute, and most people accidentally uttered “white bear” out loud once or twice. When the 5-minute suppression period was over, things got even worse: they thought about it more than twice as often as people who had been directly instructed to think about a white bear. When we try to suppress a thought, two things happen. The productive effect is that we consciously search for thoughts that don’t involve white bears. The counterproductive effect is that we unconsciously monitor for failures. In the back of our minds, we’re keeping an eye out for pale furry creatures in case they prove to be of the polar variety.

3. Curiosity: I wonder what’s inside…

When a behavior is forbidden or discouraged, it’s hard not to become intrigued. As Chip and Dan Heath write in Made to Stick, “it’s like having an itch we need to scratch.” Experiments reveal, for example, that people are more likely to watch violent TV shows and play violent video games when labels warn against them. And there are many examples of books becoming more popular after they’re banned. There’s a mystery to be unraveled: what could be so bad about this? When you started surfing the internet today, chances are that you carried an implicit expectation that a writer would be encouraging you to read his writing. If so, my headline surprised you by violating that expectation. “Why in the world would an author tell me not to read something he wrote? That doesn’t make any sense. Is he out of his mind?”

These principles make intuitive sense, especially the one about “forbidden fruit” and the allure of doing something illegal, prohibited, or otherwise authoritatively placed out of our reach. These reasons are important to keep in mind, not only to rein in this habit, but to avoid the more insidious applications of reverse psychology:

In one study, psychologists asked 159 people if they had ever deliberately tried to get people to do something by recommending the opposite. More than two thirds generated a convincing example, and reported using reverse psychology an average of 1-2 times a month, with relatively little difficulty and high effectiveness. One respondent admitted, “One time I said that my friend had a good haircut when she didn’t. Usually, she disagrees with my opinion so she changed it. Which was good.”

Is this ethical? Some might say that in the case of a haircut, the (split) ends justify the means. When people are resistant to us or our ideas, and we have their best interests at heart, it’s acceptable to mislead them for their own good. Others would argue that a meaningful relationship allows, or even requires, transparency. If we can’t be honest with someone about our intentions, how much of a bond do we really have?

Wherever you stand on this spectrum, my hope is that you’ll be more attuned to reverse psychology when it wanders into your interactions. I also hope you’ll prevent it from biasing your choices. Next time you find yourself opposing a recommendation or warning, it’s worth asking whether it’s genuinely a bad idea. Maybe you’re just trying to fight for your freedom or scratch an itch.

Granted, like most aspects of human psychology and behavior, it takes a lot of continuous effort and self-awareness to get the better of this habit. But the results — for both ourselves and those who are trying to help us — are well worth it. And needless to say, recognizing the motivations behind our disobedience helps us better determine whether those who tell us what to do or not to do mean well or are just trying to manipulate us.

The Greatest Threat to the World?

There seems to be no shortage of candidates for greatest threat to the world (by which we usually mean humanity specifically) — climate change, world war, nuclear weapons, a pandemic, an asteroid, or maybe even a combination of these factors. As it turns out, however, where you live determines what you consider to be most dangerous to the rest of the world.

That is the conclusion of a recent survey by the Pew Research Center, which asked 48,643 respondents in 44 countries what poses the greatest danger to the global community (note that this took place before the Ebola outbreak but after events like the Syrian Civil War and the showdown between the West and Russia over Ukraine).

As Mic.com reports:

In the United States and Europe, income inequality came out on top. In the Middle East, religious and ethnic hatred was considered the biggest threat. Asia listed pollution and the environment, Latin America cited nuclear weapons, and Africa chose AIDS and other diseases.

Unsurprisingly, the concerns fell largely within geographic and regional boundaries. The United States and Europe are home to some of the largest and most advanced economies in the world, so it’s somewhat expected — if ironic — that they’re worried about income inequality. Asia is home to 17 out of the 20 most polluted cities in the world, and, as of 2012, sub-Saharan Africa accounted for 70% of the world’s AIDS cases.

In other words, all of us appear to have an exceptionally narrow view of the world: We see the biggest threats to our region as the biggest threats to everyone else, too.

Here is a visual representation of that data, also courtesy of Mic.com:

Moreover, the perception that religious and ethnic hatred poses the greatest threat to the world has seen the most growth over the past seven years, no doubt due to numerous high-profile sectarian conflicts across the planet.

Courtesy of The Atlantic is a color-coded map of the world that better shows how these great threats are geographically and culturally spread out:

A few other observations of the data from The Atlantic piece:

  • Other than Japan, the countries that saw nuclear weapons as their biggest danger included Russia (29 percent), Ukraine (36 percent), Brazil (28 percent), and Turkey (34 percent).
  • The U.K.’s greatest concern was religious and ethnic hatred (39 percent), putting it in the same group as India (25 percent), Israel (30 percent), the Palestinian territories (40 percent), Lebanon (58 percent), and Malaysia (32 percent).
  • People in France were equally divided on what they consider the biggest threat, with 32 percent saying inequality and the same percentage saying religious and ethnic hatred.
  • Likewise in Mexico, nuclear weapons and pollution were tied as most menacing, at 26 percent.

It is also important to point out that in many cases, no single fear was dominant: in the U.S., for example, inequality edged out religious and ethnic hatred and nuclear weapons by only a few points. And in almost every region, anywhere from a fifth to a quarter of respondents expressed fear of nuclear weapons (which I feel can be taken to mean war among states where the use of nukes is most likely). The survey observed that in many places, “there is no clear consensus” as to what constitutes the greatest danger to humanity, as this graph of all countries shows:

These results are very telling: as the earlier excerpt noted, you can learn a lot about a country’s circumstances based on what its people fear the most. Reading backwards from the results, it makes sense that what nations find most threatening is whatever has most imperiled them, recently or historically.

It is also interesting to note how societies, like individuals, view the world through their own experiential prism: because we are naturally most familiar with, and most impacted by, what immediately affects us, it makes sense that we would project those experiences beyond our vicinity. Just as our own individual beliefs — be they religious, political, social, etc. — are colored by personal life experiences, so too do entire nations often apply their most familiar concerns and struggles to the world at large.

Of course, this varies by country as well as by the respondents who represent said country; in many cases, participants are more likely to come from higher educational and socioeconomic backgrounds, and thus reflect their class views rather than those of their wider society. (Admittedly, I am not sure if that applies to this particular Pew survey, as the respondents were interviewed by phone or face-to-face, with no indication as to their background.)

For my part, I personally put the most weight behind climate change, especially as it can exacerbate a lot of existing issues over the long-term (clashes among ethnic/religious groups over strained resources, refugees fleeing crop failures and placing strain upon host countries, etc.). What are your thoughts and opinions regarding the world’s greatest threat?

Altruism: It’s In Our DNA

Although, like most people, I have my cynical and misanthropic moments, I broadly consider myself to be an optimist with regards to human nature and our species’ capacity to improve itself and the world (arguably, I would be a poor humanist if I did not believe in the positive potential of humanity). The ability to practice concern for the welfare of others, without any want of reward or gain, represents one of the key virtues that will lead to a better world.

Much of my confidence stems from my own broadly positive experience with my fellow humans: I am fortunate to have experienced and witnessed so much kindness, compassion, and understanding. While my intimate study of, and exposure to, the worst of humanity, past and present, has no doubt tempered my faith, I remain committed to the idea that humans are not in any sense fundamentally evil or violent, as many would believe.

Indeed, whatever moral and cognitive failings seem innate to our species seem offset by an inherent, evolutionary capacity to transcend such faults. Aside from ample anecdotal evidence of humans (as well as other primates) demonstrating selfless behavior, there is a large and growing body of research suggesting that selflessness and conscientiousness are fundamental aspects of being human.

One of the most recent studies to explore the origins of human altruism was conducted by a team from the University of Zurich in Switzerland, which examined groups of primates — including humans — and how each develops concepts of selflessness and cooperation. As reported in IFLScience:

The researchers designed a test in which a food treat was placed on a sliding board. The individual moving the board can bring the treat within reach of others within the group, but will not be able to get the food themselves.

The experiment was carried out in 24 groups across 15 species of primates, including 3 groups of human children who were 5-7 years old. The food selection was tailored for each group, in order to test whether or not the primate would willingly give up a desired treat. The researchers found that species who most often utilized the “it takes a village” style of cooperative breeding were also more likely to help someone else get a treat, even though they didn’t get one themselves.

“Humans and callitrichid monkeys acted highly altruistically and almost always produced the treats for the other group members. Chimpanzees, one of our closest relatives, however, only did so sporadically,” Burkart explained in a press release.

The researchers also examined possible relationships between giving a treat to a friend and other cooperative behaviors, such as group hunting and complex social bonds, as well as relative brain size. Cooperative breeding was the only trait that showed a strong linear correlation and was the best metric for predicting altruistic behavior.

“Spontaneous, altruistic behavior is exclusively found among species where the young are not only cared for by the mother, but also other group members such as siblings, fathers, grandmothers, aunts and uncles,” Burkart continued.

However, cooperative breeding is likely one of many factors that could have influenced the evolution of altruism among humans. Over the evolutionary history of our ancestors, living in cooperative groups may have benefited greatly from high cognitive abilities, especially regarding things like language skills.

Burkart concluded: “When our hominin ancestors began to raise their offspring cooperatively, they laid the foundation for both our altruism and our exceptional cognition.”

In other words, being altruistic comes as naturally to us as any other trait we consider quintessentially human (language, higher thinking, etc.). Not only is it a virtue in itself, but it plays a pivotal role in our survival and flourishing. Working in tandem with the other characteristics of higher sentience, altruism helped grow and solidify social bonds, which in turn facilitated the cooperation and organization so vital to an otherwise defenseless and vulnerable species.

In fact, without our high cognitive capacity — our ability to share and develop new ideas, to invent, to coordinate and work together — we would not have survived the harsh elements and the many physically superior predators of our ancestral environment. In the aggregate, every individual act of welfare and assistance to others helps create a stronger and more robust society that can better survive and prosper.

Shortly after the IFLS piece, NPR also published an article on the subject of altruism and its roots in human biology. It was inspired by the case of Angela Stimpson, a 42-year-old woman who donated a kidney to a complete stranger without any credit or reward. She cited a sense of purpose as her motivation, echoing many other altruists who claim to derive meaning from being kind and doing good deeds.

So what is the psychological basis of this disposition? That is what Abigail Marsh of Georgetown University, a leading researcher on altruism, set out to discover:

Marsh wanted to know more about this type of extraordinary altruism, so she decided to study the brains of people who had donated a kidney to a stranger. Of the 39 people who took part in the study, 19 of them, including Angela Stimpson, were kidney donors.

Marsh took structural images to measure the size of different parts of their brains and then asked the participants to run through a series of computer tests while their brains were being scanned using functional MRI. In one test, they were asked to look at pictures of different facial expressions, including happiness, fear, anger, sadness and surprise.

Most of the tests didn’t find any differences between the brains of the altruistic donors and the people who had not been donors. Except, Marsh says, for a significant difference in a part of the brain called the amygdala, an almond-shaped cluster of nerves that is important in processing emotion.

These findings are the polar opposite to research Marsh conducted on a group of psychopaths. Using the same tests as with the altruists, Marsh found that psychopaths have significantly smaller, less active amygdalas. More evidence that the amygdala may be the brain’s emotional compass, super-sensitive in altruists and blunted in psychopaths, who seem unresponsive to someone else’s distress or fear.

The amygdala is part of the brain’s limbic system, the area that primarily houses our emotional life and plays a large role in forming memories and making decisions. Neither the study nor the articles delve into the causality of the relationship between amygdala size and altruism: does a large amygdala lead one to become more selfless? Or does engaging in enough altruistic acts over time cause the amygdala to grow larger? There is still much to learn about this region of the brain.

But one thing is for certain: for all the negative behaviors and habits we associate with human nature, we must not overlook or understate just how intimately tied our humanity is to acts of kindness and compassion. From our biology to our neurology, humans, for the most part, have an instinct to be kind whenever and however possible. The key is to build upon these foundations, cultivate them in others, and figure out how to correct any natural imbalances that may undermine them. A difficult and long-term goal, but certainly a worthy and ultimately human one.

Five Big Takeaways on Creating Better Cities

In 2007, humanity reached a major, though largely overlooked, milestone: for the first time in history, over half of all humans lived in cities. Only a century before, a mere 15 percent of the world’s population lived in urban areas. The United Nations estimates that by 2050, around 64 percent of the developing world, and 85 percent of the developed world, will be urbanized.

Needless to say, the world’s future lies in its cities, which are increasingly the main drivers of everything from economic growth to cultural development. The science of cities has never been more vital: not only must we create urban areas that better promote human flourishing, but we must also take into account their impact on the environment, which is in an increasingly fragile state.

The City Lab column of The Atlantic reports on some of the findings by researchers in the growing field of “urban theory”, who over the years have gleaned the following key observations and approaches:

Cities generate economic growth through networks of proximity, casual encounters and “economic spillovers.” The phenomenal creativity and prosperity of cities like New York is now understood as a dynamic interaction between web-like networks of individuals who exchange knowledge and information about creative ideas and opportunities. Many of these interactions are casual, and occur in networks of public and semi-public spaces—the urban web of sidewalks, plazas, and cafes. More formal and electronic connections supplement, but do not replace, this primary network of spatial exchange.

Through a similar dynamic, cities generate a remarkably large “green dividend.” It has long been known that cities have dramatically lower energy and resource consumption as well as greenhouse gas emissions per capita, relative to other kinds of settlements. Only some of this efficiency can be explained by more efficient transportation. It now appears that a similar network dynamic provides a synergistic effect for resource use and emissions—what have been called “resource spillovers.” Research is continuing in this promising field.

Cities perform best economically and environmentally when they feature pervasive human-scale connectivity. Like any network, cities benefit geometrically from their number of functional interconnections. To the extent that some urban populations are excluded or isolated, a city will under-perform economically and environmentally. Similarly, to the extent that the city’s urban fabric is fragmented, car-dependent or otherwise restrictive of casual encounters and spillovers, that city will under-perform—or require an unsustainable injection of resources to compensate. As Jacobs said, lowly appearing encounters on sidewalks and in other public spaces are the “small change” by which the wealth of a city grows.

Cities perform best when they adapt to human psychological dynamics and patterns of activity. Urban residents have a basic need to make sense of their environments, and to find meaning and value in them.  But this issue is not as straightforward as it may appear. Research in environmental psychology, public health and other fields suggests that some common attributes promote the capacity to meet these human requirements—among them green vegetation, layering, and coherent grouping. Wayfinding and identity are also promoted by iconic structures, and meaning is enriched by art. But for most people most of the time, evolutionary psychology is a more immediate factor to be accommodated. As Jacobs cautioned, a city is not primarily a work of art. That way of thinking is bad for cities—and probably bad for art too.

Cities perform best when they offer some control of spatial structure to residents. We all need varying degrees of public and private space, and we need to control those variations at different times of the day, and over the span of our lives. In the shortest time frames, we can open or close windows and doors, draw blinds, come out onto porches and informally colonize public spaces, or retreat inside the privacy of our homes. Over longer time frames, we can remodel our spaces, open businesses, build buildings, and make other alterations that gradually form the complex dynamic growth of cities.

Interesting stuff. What do you think?