Reflecting On The Killing Of Three Muslim Students

I rarely post about current events or news stories, but I have a rare bit of time, and this event merits attention and reflection.

Last night, three Muslim students — Deah Barakat, 23; his wife, Yusor Mohammad Abu-Salha, 21; and her sister, Razan Mohammad Abu-Salha, 19 — were shot dead at a housing complex near the University of North Carolina at Chapel Hill. The perpetrator, Craig Stephen Hicks, 46, handed himself over to the police afterward. News is still unfolding as of this post, and the motive remains unclear, though some reports cite a dispute over parking — of all things to kill over.

The natural question that comes to mind (or that should) is whether this incident was motivated by anti-Islam bigotry. This would certainly fit the pattern of post-9/11 attacks and harassment towards Muslims or those perceived to be Muslim (namely Sikhs). Opposition to Islam, ranging from criticism of the religion to out-and-out bigotry, has definitely seen an uptick in recent months following high-profile incidents involving Islamic extremists, such as the Charlie Hebdo shootings and the barbarism of Boko Haram and IS.

Given the present lack of information, it is difficult to determine why Hicks killed these people, although some sources have pointed out his open condemnation and mockery of organized religion on social media, as well as his association with atheist groups (albeit mainstream ones like Atheist for Equality that, to my knowledge, do not advocate violence or discrimination against religious people).

Ultimately, whether or not the perpetrator’s dislike of religion played a role in his decision to escalate a dispute into a murderous assault, it remains true that his atheism did not prevent him from committing such an immoral crime.

This tragic incident reaffirms why I much prefer the label of secular humanist over just plain atheist, precisely because mere disbelief in a deity or the supernatural says nothing about one’s morality or character. Atheism denotes what you do not have — religious beliefs — but not what you have chosen to replace said beliefs or ethical foundations with. Hence why atheists run the gamut from humanists like Albert Einstein to monsters like Joseph Stalin.

It goes without saying that a humanist framework is one that precludes violence against other humans, regardless of their beliefs, religious or otherwise. Of course people will always harm and kill one another regardless of whatever authority or precept they allege to follow or associate with, whether it is secular or religious in nature. But this fact of human nature, whereby bad actions are caused by all sorts of factors outside professed belief, does not preclude the creation of a comprehensive and authoritative moral and ethical framework.

Moreover, it is worth pointing out the distinction between being critical of religion as an idea and institution — all while still recognizing the humanity of its adherents — and hating religiously identifying people on the visceral level the perpetrator allegedly did. I myself am highly critical of religion as a whole, but I certainly do not view religious people as a faceless Other without personality, hopes, dreams, feelings, and humanity. Atheist or not, there is a difference between disliking or criticizing beliefs and ideas and taking the next step of hating or killing innocents who hold such beliefs without harm to anyone else.

That said, it is important to remind fellow atheists to be careful to distinguish themselves (and their atheist leaders) as religious skeptics from religious bigots who incite such attacks or (in thankfully rare cases) directly perpetrate them. I am not trying to make this tragedy about me or the atheist movement, but highlighting the inherent danger of proclaiming moral superiority by virtue of casting off religion while ignoring that one can still be a bad person, morally or behaviorally, regardless of what one believes.

If we are going to promote a skeptical view of religion, and opposition to its more harmful effects (both institutional and ideological), then we must do so alongside the propagation of a humanist ethic. By all means, critique religion and seek to minimize its harm, as I certainly do, but also recognize and fight the harms of non-religious origin, and more importantly see the humanity of the billions of fellow humans who, like it or not, hold religious views of some form or another.

All that said, I do not mean to read into this senseless act the larger issue of bigotry, lack of empathy, and the like; while likely factors, the details once again remain unknown for certain. It is also certainly not my intention to exploit a tragedy as an opportunity to get on a soap box for my own purposes and movement.

Rather, I am just tired of seeing people kill each other in such wanton manners for one reason or another: ideological, religious, anti-religious, opportunistic, etc. While I know this horror is a fact of human existence (at least for the foreseeable future — I cling to a kernel of utopianism), that does not mean I want to be indifferent to the larger psychological, social, and ideological factors underpinning so much of the killing and harming that goes on every day somewhere in the world.

Given what little help I can lend to these unfortunate victims, the very least I can do — and in fact, feel obligated to do — is use the opportunity to reflect upon my own moral foundations and those of my fellow humans, both secular and religious. Maybe it is my way of trying to make sense of the senseless, or trying to derive meaning from sheer tragedy, but it is all I can do. I like to think that if enough of us continuously reflect on why we do the awful things we do, and what we can do about it, such barbarous acts will become rarer if not extinct.

One can still dream. In the meantime, my heart goes out to the victims and their loved ones. From what reports show, these young people were not only bright and talented, but socially conscious and humanitarian. By all accounts, they were, in other words, what humanists should aspire to be.

Altruism: It’s In Our DNA

Although, like most people, I have my cynical and misanthropic moments, I broadly consider myself an optimist with regard to human nature and our species’ capacity to improve itself and the world (arguably, I would be a poor humanist if I did not believe in the positive potential of humanity). The ability to practice concern for the welfare of others, without any want of reward or gain, represents one of the key virtues that will lead to a better world.

Much of my confidence stems from my own broadly beneficial experience with my fellow humans: I am fortunate to have experienced and witnessed so much kindness, compassion, and understanding. While my intimate study and exposure to the worst of humanity, past and present, has no doubt tempered my faith, I remain committed to the idea that humans are not in any sense fundamentally evil or violent, as many would believe.

Indeed, whatever moral and cognitive failings seem innate to our species seem offset by an inherent, evolutionary capacity to transcend such faults. Aside from ample anecdotal evidence of humans (as well as other primates) demonstrating selfless behavior, there is a large and growing body of research showing that selflessness and conscientiousness are fundamental aspects of being human.

One of the most recent studies to explore the origins of human altruism was conducted by a team from the University of Zurich in Switzerland, which examined groups of primates — including humans — and how they each develop concepts of selflessness and cooperation. As reported in IFLScience:

The researchers designed a test in which a food treat was placed on a sliding board. The individual moving the board can bring the treat within reach of others within the group, but will not be able to get the food themselves.

The experiment was carried out in 24 groups across 15 species of primates, including 3 groups of human children who were 5-7 years old. The food selection was tailored for each group, in order to test whether or not the primate would willingly give up a desired treat. The researchers found that species who most often utilized the “it takes a village” style of cooperative breeding were also more likely to help someone else get a treat, even though they didn’t get one themselves.

“Humans and callitrichid monkeys acted highly altruistically and almost always produced the treats for the other group members. Chimpanzees, one of our closest relatives, however, only did so sporadically,” Burkart explained in a press release.

The researchers also examined possible relationships between giving a treat to a friend and other cooperative behaviors, such as group hunting and complex social bonds, as well as relative brain size. Cooperative breeding was the only trait that showed a strong linear correlation and was the best metric for predicting altruistic behavior.

“Spontaneous, altruistic behavior is exclusively found among species where the young are not only cared for by the mother, but also other group members such as siblings, fathers, grandmothers, aunts and uncles,” Burkart continued.

However, cooperative breeding is likely one of many factors that could have influenced the evolution of altruism among humans. Over the evolutionary history of our ancestors, living in cooperative groups may have benefited greatly from high cognitive abilities, especially regarding things like language skills.

Burkart concluded: “When our hominin ancestors began to raise their offspring cooperatively, they laid the foundation for both our altruism and our exceptional cognition.”

In other words, being altruistic comes as naturally to us as any other trait we consider quintessentially human (language, higher thinking, etc.). Not only is it a virtue in itself, but it serves a pivotal role in our survival and flourishing. Working in tandem with the other characteristics of higher sentience, altruism helped grow and solidify social bonds, which in turn facilitated the cooperation and organization that is so vital to an otherwise defenseless and vulnerable species.

In fact, without our high cognitive capacity — our ability to share and develop new ideas, to invent, to coordinate and work together — we would not have survived the harsh elements and the many physically superior predators of our environment. In the aggregate, every individual act of welfare and assistance to others helps create a stronger and more robust society that can better survive and prosper.

Shortly after the IFLS piece, NPR also published an article on the subject of altruism and its roots in human biology. It was inspired by the case of Angela Stimpson, a 42-year-old woman who donated a kidney to a complete stranger without any credit or reward. She cited a sense of purpose as her motivation, echoing many other altruists who claim to derive meaning from being kind and doing good deeds.

So what is the psychological basis of this behavior? That is what Abigail Marsh of Georgetown University, a leading researcher on altruism, set out to discover:

Marsh wanted to know more about this type of extraordinary altruism, so she decided to study the brains of people who had donated a kidney to a stranger. Of the 39 people who took part in the study, 19 of them, including Angela Stimpson, were kidney donors.

Marsh took structural images to measure the size of different parts of their brains and then asked the participants to run through a series of computer tests while their brains were being scanned using functional MRI. In one test, they were asked to look at pictures of different facial expressions, including happiness, fear, anger, sadness and surprise.

Most of the tests didn’t find any differences between the brains of the altruistic donors and the people who had not been donors. Except, Marsh says, for a significant difference in a part of the brain called the amygdala, an almond-shaped cluster of nerves that is important in processing emotion.

These findings are the polar opposite to research Marsh conducted on a group of psychopaths. Using the same tests as with the altruists, Marsh found that psychopaths have significantly smaller, less active amygdalas. More evidence that the amygdala may be the brain’s emotional compass, super-sensitive in altruists and blunted in psychopaths, who seem unresponsive to someone else’s distress or fear.

The amygdala is part of the brain’s limbic system, the area that primarily houses our emotional life and that plays a large role in forming memories and making decisions. Neither the study nor the articles delve into the causality of the relationship between amygdala size and altruism: does a large amygdala lead one to become more selfless? Or does engaging in enough altruistic acts over time cause the amygdala to grow larger? There is still much to learn about this region of the brain.

But one thing is for certain: for all the negative behaviors and habits we associate with human nature, we must not overlook or understate just how intimately our humanity is tied to acts of kindness and compassion. From our biology to our neurology, humans, for the most part, have an instinct to be kind whenever and however possible. The key is to build upon these foundations, cultivate them in others, and figure out how to correct any natural imbalances that may undermine them. A difficult and long-term goal, but certainly a worthy and ultimately human one.

The complete guide to procrastinating at work


The perfect post for starting the post-Labor Day workweek. I can certainly relate to a lot of what is stated here, in both my personal and professional life. It is nice to see more scientific attention centered on what is no doubt an increasingly common issue in the modern world. As it turns out, there is a rhyme and reason to procrastination beyond mere fatigue or laziness — in fact, it is a completely different creature altogether.

Originally posted on Quartz:

Some research says the best way to spark creativity is to walk away and that the best ideas come from those least-expected “aha!” moments. So maybe procrastination isn’t such a bad thing after all. Or is time spent on those cat memes taking its toll? Can procrastinating ever be a source of productivity?

Here’s the complete guide to procrastinating at work:

Clever people procrastinate smartly

The Creativity Research Journal studied the working habits of a particularly intelligent group of people, winners of the Intel Science Talent competition. They found the group procrastinated productively. Some used procrastination as a trigger for a helpful amount of stress needed to ignite positive action. Others saw it as a “thought incubator”: They put off making a decision because they wanted to fully process it before finding a solution.

Procrastinate using your to-do list

The same study also found that the tasks the science competition winners were doing while avoiding work were helping in other areas of their…


My Ambivalence on Human Nature

It is astounding how just a cursory glance at the human condition at any given moment can simultaneously yield so much good and evil. Browsing through one day’s worth of news, I can find such a mixed bag of humanity’s best and worst tendencies. It puts my mood in flux.

I will read an article about an altruistic act, a peace accord, the lifting of millions from poverty, or some other event on either a micro or macro level that demonstrates moral and social progress. Then right after I see some stomach-churning demonstration of cruelty, whether it is a heinous crime, warfare, or the immiseration of millions by a seemingly impervious regime.

After so many weeks, months, and years of taking it all in, both academically and autodidactically, it is difficult for me to be consistently cynical or optimistic about the course of humanity. Perhaps this is to be expected, since our thoughts and worldviews are shaped by our experience and knowledge, both of which are continuously changing.

As it stands, I suppose I am currently cautiously optimistic, because we have nonetheless come a long way as a species, even if innumerable vices and problems remain. I see the potential for progress and prosperity, even amid so many reminders of our proneness to fallibility, apathy, and hatred.

I think it is worth acknowledging that suffering would remain even with the best intentions. There are still the vagaries of natural disasters, disease, and simple misfortune (accidents and what not). The impact of all these factors can be and has been reined in, but I feel only up to a point.

Sorry if my thoughts seem all over the place; this was something of a stream of consciousness. It goes without saying that I am fortunate to have the luxury to pontificate on such things in so much comfort, largely by an accident of birth.


The Awesome Power of Our Divided Brain

The following video from RSA explains how the hemispheric nature of our brains — which is poorly understood by most people — has profoundly affected human behavior, culture, and society. It’s part of a lecture given by renowned psychiatrist and writer Iain McGilchrist, whose full talk can be seen here. I hope you enjoy.

As always, feel free to share your thoughts and feedback.


War and Human Nature

From what I’ve seen, it’s become something of a truism to say that war is intrinsic to human nature: large-scale violence is supposedly not only uniquely human, but inseparably so, such that it’s hard to imagine human existence without it.

But a recent study is casting doubt on this widely accepted and seemingly verified “Deep Roots Theory” of human violence. Scientific American reports on the research published in Science, “Lethal Aggression in Mobile Forager Bands and Implications for the Origins of War.”

Of the 21 societies examined by Fry and Soderberg, three had no observed killings of any kind, and 10 had no killings carried out by more than one perpetrator. In only six societies did ethnographers record killings that involved two or more perpetrators and two or more victims. However, a single society, the Tiwi of Australia, accounted for almost all of these group killings.

Some other points of interest: 96 percent of the killers were male. No surprise there. But some readers may be surprised that only two out of 148 killings stemmed from a fight over “resources,” such as a hunting ground, water hole or fruit tree. Nine episodes of lethal aggression involved husbands killing wives; three involved “execution” of an individual in a group by other members of the group; seven involved execution of “outsiders,” such as colonizers or missionaries.

Most of the killings stemmed from what Fry and Soderberg categorize as “miscellaneous personal disputes,” involving jealousy, theft, insults and so on. The most common specific cause of deadly violence—involving either single or multiple perpetrators–was revenge for a previous attack.

These data corroborate a theory of warfare advanced by Margaret Mead in 1940. Noting that some simple foraging societies, such as Australian aborigines, can be warlike, Mead rejected the idea that war was a consequence of civilization. But she also dismissed the notion that war is innate–a “biological necessity,” as she put it – simply by pointing out (as Fry and Soderberg do) that some societies do not engage in intergroup violence.

Mead (again like Fry and Soderberg) found no evidence for what could be called the Malthusian theory of war, which holds that war is the inevitable consequence of competition for resources.

Instead, Mead proposed that war is a cultural “invention”—in modern lingo, a meme, that can arise in any society, from the simplest to the most complex. Once it arises, war often becomes self-perpetuating, with attacks by one group provoking reprisals and pre-emptive attacks by others.

The war meme also transforms societies, militarizes them, in ways that make war more likely. The Tiwi seem to be a society that has embraced war as a way of life. So is the United States of America.

Needless to say, I’m awaiting more research on the subject. But whatever the case is, I think it’s important not to view mass violence in such a fatalistic way. That mentality would only perpetuate a self-fulfilling prophecy, in which we’re more willing to accept war as an institution — or solution — by virtue of its apparent inevitability. This same approach accounts for many other moral and social evils.

Even if such negative behaviors do have deep roots, that’s hardly an excuse for not trying to mitigate their influence. Most human behavior stems from both nature and nurture, and I’m not aware of any human characteristic that strictly falls under one sphere or the other. Thus, there is always some avenue for improvement, albeit through concerted multidimensional efforts — better material conditions, in combination with quality education (formal and informal), tends to lead to a vast reduction in social ills.

Hormones Influence Ideology

A growing body of research is finding that biological and psychological factors influence the beliefs we otherwise feel are freely decided upon. From social attitudes to religious piety and even voting patterns, it seems that much of our worldview is shaped by deterministic patterns we scarcely notice. An article in MoJo by Chris Mooney offers some pretty interesting details.

Human Nature and Apathy

Many people, myself included, lament the fact that our species is so apathetic to the widespread suffering all around us. However tragic, such indifference is both natural and expected. Our minds did not evolve to absorb the sheer amount of stimulus that exists in the world.

Only very recently have most humans become regularly exposed to the overwhelming amount of people, events, and information that exists and multiplies all around us. There is a limit to how much we can think about or emotionally react to, and that’s why our immediate suffering — our trivial “first world problems” — is felt far more strongly than the more horrible but distant misery that exists out there. Telling someone that others have it worse is admirable but futile, because our brains feel personal circumstances more substantively and intimately than abstract ones.

It’s for this reason that society will obsess more over individual negative events highlighted in the news than over the bigger but nameless and faceless statistics of human poverty. In fact, this is the same reason you’re more likely to donate to an individual suffering person than to broader charitable causes in general — look up Paul Slovic’s “psychic numbing” phenomenon. In some sense, this may even be a merciful defense mechanism — imagine if all the tremendous suffering in the world were equally impactful. We’d likely succumb to severe depression and misanthropy, or become very withdrawn.

Of course, I’m not saying this excuses callousness or apathy. We can still love and care for one another beyond our closest loved ones. We don’t need to be deeply affected by all the human suffering in the world in order to be troubled by it and seek to alleviate it. Empathy and social responsibility are intrinsic to our species. We must simply adapt to the existence of this new global community and expand our circle of compassion and consideration to be far wider. It’s difficult but not impossible, in my opinion.

What are your thoughts?

Beauty and Brains

I find it interesting that whenever a very attractive person — particularly a woman — demonstrates above-average intelligence or skill, it genuinely surprises most people. Similarly, I’ve seen people marvel at how a “nerdy” person can be athletic or charismatic. Needless to say, those peers who are both attractive and intelligent feel endless frustration at being reflexively labeled based solely on their looks and initial impression.

But this is nothing new, as humans evolved to make quick judgements based on little data — a survival mechanism that has remained, often misapplied, in the modern world. In this instance, we seem to unconsciously associate good looks with stupidity or, at most, average intelligence (admittedly, I think even I have been guilty of this visceral stereotyping).

I’ve read a hypothesis suggesting that this correlation reflects a form of evolutionary compensation: if one isn’t attractive, one makes up for it by becoming desirable in other ways; similarly, an unskilled or unintelligent person may harness whatever charisma or physical attractiveness they have to influence others or burnish their image. We see this pattern and therefore apply it in how we judge and analyze people.

In any case, it is interesting to note that traditionally (and for the most part to this day), heroic and virtuous characters in various media have almost always been portrayed as good looking, and intelligence is rarely shown to be mutually exclusive with physical attractiveness. Of course, this too likely reflects our evolutionarily-induced preference for well-rounded, attractive people.

Anyway, has anyone else noticed this? Is there a reason for these correlations? What are your thoughts on this?


The End of History Illusion

The “end of history illusion” describes a nearly universal phenomenon among human beings, in which we have a tendency to see the present as the stopping point for any change in our lives. Once we reach a certain age, we essentially assume that from then on we’ll remain the same. It’s a little difficult to describe, given that the “present” slides along as we age. But the New York Times has a great article on it:

When we remember our past selves, they seem quite different. We know how much our personalities and tastes have changed over the years. But when we look ahead, somehow we expect ourselves to stay the same, a team of psychologists said Thursday, describing research they conducted of people’s self-perceptions.

They called this phenomenon the “end of history illusion,” in which people tend to “underestimate how much they will change in the future.” According to their research, which involved more than 19,000 people ages 18 to 68, the illusion persists from teenage years into retirement.

“Middle-aged people — like me — often look back on our teenage selves with some mixture of amusement and chagrin,” said one of the authors, Daniel T. Gilbert, a psychologist at Harvard. “What we never seem to realize is that our future selves will look back and think the very same thing about us. At every age we think we’re having the last laugh, and at every age we’re wrong.”

Trying to explain this tendency yields even more interesting considerations. After all, if people acknowledge how much they’ve changed over the years, why can’t they seem to realize that such change will continue?

People seemed to be much better at recalling their former selves than at imagining how much they would change in the future.

Why? Dr. Gilbert and his collaborators, Jordi Quoidbach of Harvard and Timothy D. Wilson of the University of Virginia, had a few theories, starting with the well-documented tendency of people to overestimate their own wonderfulness.

“Believing that we just reached the peak of our personal evolution makes us feel good,” Dr. Quoidbach said. “The ‘I wish that I knew then what I know now’ experience might give us a sense of satisfaction and meaning, whereas realizing how transient our preferences and values are might lead us to doubt every decision and generate anxiety.”

Or maybe the explanation has more to do with mental energy: predicting the future requires more work than simply recalling the past. “People may confuse the difficulty of imagining personal change with the unlikelihood of change itself,” the authors wrote in Science.

But it’s false comfort, as this mentality does have its caveats:

The phenomenon does have its downsides, the authors said. For instance, people make decisions in their youth — about getting a tattoo, say, or a choice of spouse — that they sometimes come to regret.

I think it comes down to the nature of the human mind. Our brains are limited in their capacity to look into the future. Our senses and perceptions are shaped by the here and now, not by a hypothetical future that is far off and therefore difficult to grasp, let alone feel concerned about. Try as we might, we’re just too cognitively limited.