The Psychology of Misunderstanding

Misunderstanding someone, and being misunderstood in turn, is an indelible part of the human experience. So it is not surprising that there is a deep psychological basis for this inconvenient — and oftentimes even dangerous — tendency to misinterpret one another.

Business Insider and The Atlantic report on research that is getting to the bottom of why humans seem inherently unable to read one another’s feelings and intentions (or, conversely, to clearly convey their own). The reasons — and solutions — are pretty interesting:

First, most people suffer from what psychologists call “the transparency illusion” — the belief that what they feel, desire, and intend is crystal clear to others, even though they have done very little to communicate clearly what is going on inside their minds.

Because the perceived assume they are transparent, they might not spend the time or effort to be as clear and forthcoming about their intentions or emotional states as they could be, giving the perceiver very little information with which to make an accurate judgment.

“Chances are,” Halvorson writes, “how you look when you are slightly frustrated isn’t all that different from how you look when you are a little concerned, confused, disappointed, or nervous.

“Your ‘I’m kind of hurt by what you just said’ face probably looks an awful lot like your ‘I’m not at all hurt by what you just said’ face. And the majority of times that you’ve said to yourself, ‘I made my intentions clear,’ or ‘He knows what I meant,’ you didn’t and he doesn’t.”

In other words, we have a blind spot with respect to our own behavior and communication. We fail to recognize, let alone correct, that we may be coming across to others differently than we intend. This goes a long way toward explaining another common human failing: hypocrisy.

While many hypocritical acts are no doubt deliberate, a lot of them are accidental — you genuinely do not notice that you are acting contrary to your professed values and intentions. The transparency illusion applies as much to ourselves as to our external communications with others. We think our principles and values are clear, and thus fail to stay vigilant for instances in which we violate them. After all, it is neither instinctive nor feasible to methodically analyze each and every action or statement. Hence we tend to just assume we are as consistent and principled as we think we are.

All this touches on the next conclusion of the study, which looks at our perceptions of one another (and of ourselves):

The perceiver, meanwhile, is dealing with two powerful psychological forces that are warping his ability to read others accurately. First, according to a large body of psychological research, individuals are what psychologists call “cognitive misers.” That is, people are lazy thinkers.

According to the work of the Nobel Prize winner Daniel Kahneman, there are two ways that the mind processes information, including information about others: through cognitive processes that Kahneman calls System 1 and System 2. These “systems,” which Kahneman describes in his book “Thinking, Fast and Slow,” serve as metaphors for two different kinds of reasoning.

System 1 processes information quickly, intuitively, and automatically. System 1 is at work, as Halvorson notes in her book, when individuals engage in effortless thinking, like when they do simple math problems like 3 + 3 = 6, or when they drive on familiar roads as they talk to a friend in the car, or when they see someone smile and immediately know that that person is happy.

When it comes to social perception, System 1 uses shortcuts, or heuristics, to come to conclusions about another person. The mind relies on many shortcuts when it reads others’ facial expressions, body language, and intentions; one of the most powerful is the “primacy effect,” which explains why first impressions are so important.

According to the primacy effect, the information that one person learns about another in his early encounters with that person powerfully determines how he will see that person ever after.

For example, referring to research conducted about the primacy effect, Halvorson points out that children who perform better on the first half of a math test and worse on the second half might be judged to be smarter than those who perform less well on the first part of the test, but better on the second part.

In contrast to the System 1 style of thinking, which is biased and hasty, System 2 processes information in a conscious, rational, and deliberative manner. Whereas System 1 thinking is automatic and effortless, System 2 thinking takes effort.

Thus, System 2 acts as a check on System 1. It helps evaluate and update first impressions, prejudices, and other rash thoughts. It is basically a backup for when your thoughts fail you.

But as I alluded to during my tangent about hypocrisy, this sort of deeper, conscious thinking takes time and mental energy. In fact, it is rarely engaged in without some sort of external trigger or reminder — such as someone pointing out that you misunderstood them or misread a situation (even then, egotism, face-saving, or plain arrogance might leave you resistant to sincere self-analysis).

But as the article points out, humans are otherwise too inclined to be “cognitive misers” to go much further beyond System 1. Hence why misunderstandings and miscommunications alike are so common.

To make matters more complicated, there is more to interpersonal conflict than a shortcoming in our thought processes. A lot of other variables — albeit just as psychologically inherent — are at play, too.

Perception is also clouded by the perceiver’s own experiences, emotions, and biases, which also contributes to misunderstandings between people. As Halvorson puts it, everyone has an agenda when they interact with another person. That agenda is usually trying to determine one of three pieces of information about the perceived: Is this person trustworthy? Is this person useful to me? And does this person threaten my self-esteem?

How a perceiver answers those questions will determine whether she judges the other person in a positive or negative way. Take self-esteem. Researchers have long found that individuals need to maintain a positive sense of themselves to function well.

When someone’s sense of herself is threatened, like when she interacts with someone who she thinks is better than her at a job they both share, she judges that person more harshly. One study found, for example, that attractive job applicants were judged as less qualified by members of the same sex than by members of the opposite sex. The raters who were members of the same sex, the researchers found, felt a threat to their self-esteem by the attractive job applicants while the members of the opposite sex felt no threat to their self-esteem.

In a sense, there is something reassuring about a lot of our misunderstandings being rooted in flaws that are mostly beyond our control. It is not that most people have bad intentions or are purposefully being obtuse, unclear, or inconsiderate — it is that our minds and cognitive capacity make us inherently prone to faulty thinking, nearly always without us realizing it.

Given all these obstacles to accurately perceiving someone (or conveying yourself to them), what do people have to do to come across the way they intend to?

“If you want to solve the problem of perception,” Halvorson says, “it’s much more practical for you to decide to be a good sender of signals than to hope that the perceiver is going to go into phase two of perception. It’s not realistic to expect people to go to that effort.

Can you imagine how exhausting it would be to weigh every possible motivation of another person? Plus, you can’t control what’s going on inside of another person’s mind, but you can control how you come across.”

People who are easy to judge — people who send clear signals to others, as Halvorson suggests people do — are, researchers have found, ultimately happier and more satisfied with their relationships, careers, and lives than those who are more difficult to read.

It’s easy to understand why: Feeling understood is a basic human need. When people satisfy that need, they feel more at peace with themselves and with the people around them, who see them closer to how they see themselves.

In a recent discussion about this article with some friends, the question came up of whether humans should somehow be altered, perhaps with cybernetic implants or the like, so that they can think and communicate more clearly. Setting aside the precise means and mechanics of it, the hypothetical suggests that if we somehow eliminated our tendency to misunderstand and miscommunicate with each other, the world would be a better place overall.

Humans would be less prone to anxiety, less likely to fight with loved ones or make wrong assumptions about strangers, and more likely to refrain from the sort of violence that is so often precipitated by misunderstanding.

But this would raise questions about how fundamentally different human behavior and society as a whole would be without this barrier between us. Our individual and collective psychology is shaped by this constant and fundamentally human inability to communicate or understand clearly. As a species, we have developed all sorts of ideas, rituals, approaches, institutions, and even art forms to get around this problem, or to express ourselves in alternative ways. What would happen to all of that if we removed this inconvenient yet familiar issue?

It is a bit of a tangent, but it touches on the overall point, expressed in this research and much else, about how biological, psychological, and evolutionary limitations shape our existence and our condition. What are your thoughts?

Quote: On The End Of Trends

How about this: these days there are no scenes or genres, only “aesthetics.” A scene implies a physical community in physical architectures, and as such is a fatal slur against the URL everspace and its viral lungs. A genre implies limits, intentions, rules, fixity, and—as every itchy-fingered Facebook commenter knows—is a hateful thing. Nothing exists anyways, not really, only names, only hyperlinks, only patterns that work up to a point and then need an upgrade. Backspace your tearful emojis, hypocrites, it’s always been that way; it’s just more obvious now that code flows through our arteries rather than squeezes of blood and other smells. But it’s not homogenous out there and never will be, the online underground and the cultures tapping its magma are built on a vector field that ripples and clumps together, each blob too quick and continuous for your Dad’s rock collection. An aesthetic is not an object, it’s a way of looking, a way of finding beauty and sifting experiences, originating with process and behavior rather than product, or, indeed, a journalist with a butterfly net.

[…] “Aesthetic,” a word that doesn’t prioritize any one particular medium of art and even suggests them all together, is a much more suitable term than “trend” or “genre,” and highly applicable to previous online-underground-led movements like vaporwave and sea punk for which imagery and multimedia is a hugely significant and probably defining factor.

— Adam Harper, writing in The Fader, as quoted in The Atlantic

I for one welcome the end of rigidly defined, strictly enforced subcultures — assuming such a thing really existed in the first place. One of the most defining and influential aspects of the Information Age is the widespread access to all sorts of aesthetics, ideas, fashions, styles, and other cultural and intellectual outputs. With so much to command our attention, how else could any individual simply stick to one narrative, idea, or aesthetic preference?

Why keep only to rock music, sports fandom, or comic books when you can have all of the above and then some? Why feel that you need to be part of some cohesive and internally conforming subculture — akin to membership in a formal club with strict rules and guidelines — when you can follow the patterns, practices, and preferences you want, based solely on what you genuinely enjoy? Social circles built around particular interests need not be mutually exclusive of other activities and interests. There is no reason why loving sports and fitness puts you at odds with nerdier pursuits like video games and science fiction (or why those things should even be the exclusive purview of nerds to begin with).

For that matter, highbrow and lowbrow pursuits can sit perfectly comfortably with one another: the idea that one must be a high-class connoisseur to enjoy orchestral music and Broadway plays is at odds with observed reality. Yes, there are some correlations between one’s class and identity and what one tends to enjoy — though that has as much to do with economic barriers to certain activities as anything — but that is not always the case when people have freer access to the sorts of trends and interests they would genuinely enjoy if they had the time, resources, exposure, etc.

Of course, as usual, it is more complicated than that. People like categories and labels, however much they try to convince others (and themselves) otherwise. By neatly organizing these things, as well as other people and ourselves, we make all the information and stimuli out there easier to manage and keep track of. This is especially salient in an age where we are bombarded by ideas, concepts, designs, and other data all the time.

It is perhaps understandable, then, that people are threatened by, or even resentful of, perceived outsiders encroaching on their traditional territory: their subculture was fundamental to their identity before the walls began breaking down and the lines blurred, allowing people who once lacked any stake or interest in these activities to take part more easily than before (again, the increasingly mainstream nature of nerd culture is the most recognizable example, but hardly the only one).

Moreover, in the social media context, wherein everyone feels the need to sell or present themselves to a wider network of contacts and friends, listing one’s preferred musical or film genres, political persuasion, or religious adherence is a way to stand out and feel validated. As a social species, we need our peers — from loved ones to even strangers — to have some sort of impression, reaction, or conception of us: as intellectuals, sports fans, artists, blue collar laborers, etc. How will we adjust to the ever-growing circle of social connections to worry about and be accountable to? How will we adapt to the fact that so many previously exclusive and inaccessible things are increasingly available to all?

At this point, I am just expressing a stream of consciousness, so I am sure I missed something. What are your thoughts, guys?

Reflecting On The Killing Of Three Muslim Students

I rarely post about current events or news stories, but I have a rare bit of time and this event merits attention and reflection.

Last night, three Muslim students — Deah Barakat, 23; his wife, Yusor Mohammad Abu-Salha, 21; and her sister, Razan Mohammad Abu-Salha, 19 — were shot dead at a housing complex near the University of North Carolina at Chapel Hill. The perpetrator was Craig Stephen Hicks, 46, who handed himself over to the police afterward. News is still unfolding as of this post, and the motive remains unclear, though some reports cite a dispute over parking — of all things to kill over.

The natural question that comes to mind (or that should) is whether this incident was motivated by anti-Islam bigotry. This would certainly fit the pattern of post-9/11 attacks on and harassment of Muslims, or of those perceived to be Muslim (namely Sikhs). Opposition to Islam, ranging from criticism of the religion to out-and-out bigotry, has definitely seen an uptick in recent months following high-profile incidents involving Islamic extremists, such as the Charlie Hebdo shootings and the barbarism of Boko Haram and IS.

Given the present lack of information, it is difficult to determine why Hicks killed these people, although some sources have pointed out his open condemnation and mockery of organized religion on social media, as well as his association with atheist groups (albeit mainstream ones like Atheist for Equality that, to my knowledge, do not advocate violence or discrimination against religious people).

Ultimately, whether or not the perpetrator’s dislike of religion played a role in his decision to escalate a dispute into a murderous assault, it remains true that his atheism did not prevent him from committing such an immoral crime.

This tragic incident reaffirms why I much prefer the label of secular humanist over just plain atheist, precisely because mere disbelief in a deity or the supernatural says nothing about one’s morality or character. Atheism denotes what you do not have — religious beliefs — but not what you have chosen to replace said beliefs or ethical foundations with. Hence why atheists run the gamut from humanists like Albert Einstein to monsters like Joseph Stalin.

It goes without saying that a humanist framework is one that precludes violence against other humans, regardless of their beliefs, religious or otherwise. Of course, people will always harm and kill one another regardless of whatever authority or precept they claim to follow or associate with, whether it is secular or religious in nature. But this fact of human nature, whereby bad actions are caused by all sorts of factors outside professed belief, does not preclude the creation of a comprehensive and authoritative moral and ethical framework.

Moreover, it is worth pointing out the distinction between being critical of religion as an idea and an institution — all while still recognizing the humanity of its adherents — and hating religiously identifying people on the visceral level that the perpetrator allegedly did. I myself am highly critical of religion as a whole, but I certainly do not view religious people as a faceless Other without personality, hopes, dreams, feelings, and humanity. Atheist or not, there is a difference between disliking or criticizing beliefs and ideas and taking the next step of hating or killing innocent people who hold such beliefs without harming anyone else.

That said, it is important to remind fellow atheists to be careful to distinguish themselves (and their atheist leaders) as religious skeptics from the religious bigots who incite such attacks or (in thankfully rare cases) directly perpetrate them. I am not trying to make this tragedy about me or the atheist movement, but rather to highlight the inherent danger of proclaiming moral superiority by virtue of casting off religion while ignoring that one can still be a bad person, morally or behaviorally, regardless of what one believes.

If we are going to promote a skeptical view of religion, and opposition to its more harmful effects (both institutional and ideological), then we must do so alongside the propagation of a humanist ethic. By all means, critique religion and seek to minimize its harm, as I certainly do, but also recognize and fight the harms of non-religious origin, and more importantly see the humanity of the billions of fellow humans who, like it or not, hold religious views of some form or another.

All that said, I do not mean to read into this senseless act the larger issues of bigotry, lack of empathy, and the like; while those are likely factors, the details once again remain uncertain. It is also certainly not my intention to exploit a tragedy as an opportunity to get on a soapbox for my own purposes and movement.

Rather, I am just tired of seeing people kill each other in such a wanton manner for one reason or another: ideological, religious, anti-religious, opportunistic, etc. While I know this horror is a fact of human existence (at least for the foreseeable future — I cling to a kernel of utopianism), that does not mean I want to be indifferent to the larger psychological, social, and ideological factors underpinning so much of the killing and harming that goes on every day somewhere in the world.

Given what little help I can lend to these unfortunate victims, the very least I can do — and in fact feel obligated to do — is use the opportunity to reflect upon my own moral foundations and those of my fellow humans, both secular and religious. Maybe it is my way of trying to make sense of the senseless, or of trying to derive meaning from sheer tragedy, but it is all I can do. I like to think that if enough of us continuously reflect on why we do the awful things we do, and what we can do about it, such barbarous acts will become more rare, if not extinct.

One can still dream. In the meantime, my heart goes out to the victims and their loved ones. From what reports show, these young people were not only bright and talented, but socially conscious and humanitarian. By all accounts, they were, in other words, what humanists should aspire to be.

Altruism: It’s In Our DNA

Although, like most people, I have my cynical and misanthropic moments, I broadly consider myself an optimist with regard to human nature and our species’ capacity to improve itself and the world (arguably, I would be a poor humanist if I did not believe in the positive potential of humanity). The ability to practice concern for the welfare of others, without any desire for reward or gain, represents one of the key virtues that will lead to a better world.

Much of my confidence stems from my own broadly positive experience with my fellow humans: I am fortunate to have experienced and witnessed so much kindness, compassion, and understanding. While my intimate study of and exposure to the worst of humanity, past and present, have no doubt tempered my faith, I remain committed to the idea that humans are not in any sense fundamentally evil or violent, as many would believe.

Indeed, whatever moral and cognitive failings are innate to our species seem offset by an inherent, evolutionary capacity to transcend such faults. Aside from ample anecdotal evidence of humans (as well as other primates) demonstrating selfless behavior, there is a large and growing body of research showing that selflessness and conscientiousness are fundamental aspects of being human.

One of the most recent studies to explore the origins of human altruism was conducted by a team from the University of Zurich in Switzerland, which examined groups of primates — including humans — and how they each develop concepts of selflessness and cooperation. As reported in IFLScience:

The researchers designed a test in which a food treat was placed on a sliding board. The individual moving the board can bring the treat within reach of others within the group, but will not be able to get the food themselves.

The experiment was carried out in 24 groups across 15 species of primates, including 3 groups of human children who were 5-7 years old. The food selection was tailored for each group, in order to test whether or not the primate would willingly give up a desired treat. The researchers found that species who most often utilized the “it takes a village” style of cooperative breeding were also more likely to help someone else get a treat, even though they didn’t get one themselves.

“Humans and callitrichid monkeys acted highly altruistically and almost always produced the treats for the other group members. Chimpanzees, one of our closest relatives, however, only did so sporadically,” Burkart explained in a press release.

The researchers also examined possible relationships between giving a treat to a friend and other cooperative behaviors, such as group hunting and complex social bonds, as well as relative brain size. Cooperative breeding was the only trait that showed a strong linear correlation and was the best metric for predicting altruistic behavior.

“Spontaneous, altruistic behavior is exclusively found among species where the young are not only cared for by the mother, but also other group members such as siblings, fathers, grandmothers, aunts and uncles,” Burkart continued.

However, cooperative breeding is likely one of many factors that could have influenced the evolution of altruism among humans. Over the evolutionary history of our ancestors, living in cooperative groups may have benefited greatly from high cognitive abilities, especially regarding things like language skills.

Burkart concluded: “When our hominin ancestors began to raise their offspring cooperatively, they laid the foundation for both our altruism and our exceptional cognition.”

In other words, being altruistic comes as naturally to us as any other trait we consider quintessentially human (language, higher thinking, etc.). Not only is it a virtue in itself, but it plays a pivotal role in our survival and flourishing. Working in tandem with the other characteristics of higher sentience, altruism helped grow and solidify social bonds, which in turn facilitated the cooperation and organization that is so vital to an otherwise defenseless and vulnerable species.

In fact, without our high cognitive capacity — our ability to share and develop new ideas, to invent, to coordinate and work together — we would not have survived the harsh elements and the many physically superior predators that shared our environment. In the aggregate, every individual act of welfare and assistance to others helps create a stronger and more robust society that can better survive and prosper.

Shortly after the IFLS piece, NPR also published an article on the subject of altruism and its roots in human biology. It was inspired by the case of Angela Stimpson, a 42-year-old woman who donated a kidney to a complete stranger without any credit or reward. She cited a sense of purpose as her motivation, echoing many other altruists who claim to derive meaning from being kind and doing good deeds.

So what is the psychological basis of this disposition? That is what Abigail Marsh of Georgetown University, a leading researcher on altruism, set out to discover:

Marsh wanted to know more about this type of extraordinary altruism, so she decided to study the brains of people who had donated a kidney to a stranger. Of the 39 people who took part in the study, 19 of them, including Angela Stimpson, were kidney donors.

Marsh took structural images to measure the size of different parts of their brains and then asked the participants to run through a series of computer tests while their brains were being scanned using functional MRI. In one test, they were asked to look at pictures of different facial expressions, including happiness, fear, anger, sadness and surprise.

Most of the tests didn’t find any differences between the brains of the altruistic donors and the people who had not been donors. Except, Marsh says, for a significant difference in a part of the brain called the amygdala, an almond-shaped cluster of nerves that is important in processing emotion.

These findings are the polar opposite to research Marsh conducted on a group of psychopaths. Using the same tests as with the altruists, Marsh found that psychopaths have significantly smaller, less active amygdalas. More evidence that the amygdala may be the brain’s emotional compass, super-sensitive in altruists and blunted in psychopaths, who seem unresponsive to someone else’s distress or fear.

The amygdala is part of the brain’s limbic system, the area that primarily houses our emotional life and plays a large role in forming memories and making decisions. Neither the study nor the articles delve into the causality of the relationship between amygdala size and altruism: does a larger amygdala lead one to become more selfless? Or does engaging in enough altruistic acts over time cause the amygdala to grow larger? There is still much to learn about this part of the brain.

But one thing is for certain: for all the negative behaviors and habits we associate with human nature, we must not overlook or understate just how intimately our humanity is tied to acts of kindness and compassion. From our biology to our neurology, humans, for the most part, have an instinct to be kind whenever and however possible. The key is to build upon these foundations, cultivate them in others, and figure out how to correct any natural imbalances that may undermine them. A difficult and long-term goal, but certainly a worthy and ultimately human one.

The complete guide to procrastinating at work


The perfect post for starting the post-Labor Day workweek. I can certainly relate to a lot of what is stated here, in both my personal and professional life. It is nice to see more scientific attention centered on what is no doubt an increasingly common issue in the modern world. As it turns out, there is a rhyme and reason to procrastination beyond mere fatigue or laziness — in fact, it is a completely different creature altogether.

Originally posted on Quartz:

Some research says the best way to spark creativity is to walk away and that the best ideas come from those least-expected “aha!” moments. So maybe procrastination isn’t such a bad thing after all. Or is time spent on those cat memes taking its toll? Can procrastinating ever be a source of productivity?

Here’s the complete guide to procrastinating at work:

Clever people procrastinate smartly

The Creativity Research Journal studied the working habits of a particularly intelligent group of people, winners of the Intel Science Talent competition. They found the group procrastinated productively. Some used procrastination as a trigger for a helpful amount of stress needed to ignite positive action. Others saw it as a “thought incubator”: They put off making a decision because they wanted to fully process it before finding a solution.

Procrastinate using your to-do list

The same study also found that the tasks the science competition winners were doing while avoiding work were helping in other areas of their…


My Ambivalence on Human Nature

It is astounding how just a cursory glance at the human condition at any given moment can simultaneously yield so much good and evil. Browsing through one day’s worth of news, I can find such a mixed bag of humanity’s best and worst tendencies. It puts my mood in flux.

I will read an article about an altruistic act, a peace accord, the lifting of millions from poverty, or some other event on either a micro or macro level that demonstrates moral and social progress. Then right after I see some stomach-churning demonstration of cruelty, whether it is a heinous crime, warfare, or the immiseration of millions by a seemingly impervious regime.

After so many weeks, months, and years of taking it all in, both academically and autodidactically, it is difficult for me to be consistently cynical or optimistic about the course of humanity. Perhaps this is to be expected, since our thoughts and worldviews are shaped by our experience and knowledge, both of which are continuously changing.

As it stands, I suppose I am currently cautiously optimistic, because we have nonetheless come a long way as a species, even if innumerable vices and problems remain. I see the potential for progress and prosperity, even amid so many reminders of our proneness to fallibility, apathy, and hatred.

I think it is worth acknowledging that suffering would remain even with the best intentions. There are still the vagaries of natural disasters, disease, and simple misfortune (accidents and whatnot). The impact of all these factors can be and has been reined in, but I feel it is only up to a point.

Sorry if my thoughts seem all over the place; this was sort of a stream of consciousness. It goes without saying that I am fortunate to have the luxury to pontificate on such things in so much comfort, largely by an accident of birth.


The Awesome Power of Our Divided Brain

The following video from RSA explains how the hemispheric nature of our brains — which is poorly understood by most people — has profoundly affected human behavior, culture, and society. It’s part of a lecture given by renowned psychiatrist and writer Iain McGilchrist, whose full talk can be seen here. I hope you enjoy.

As always, feel free to share your thoughts and feedback.


War and Human Nature

From what I’ve seen, it’s become something of a canard to say that war is intrinsic to human nature: that large-scale violence is not only uniquely human but inseparably so, such that it’s hard to imagine human existence without it.

But a recent study is casting doubt on this widely accepted and seemingly verified “Deep Roots Theory” of human violence. Scientific American reports on the research published today in Science, “Lethal Aggression in Mobile Forager Bands and Implications for the Origins of War.”

Of the 21 societies examined by Fry and Soderberg, three had no observed killings of any kind, and 10 had no killings carried out by more than one perpetrator. In only six societies did ethnographers record killings that involved two or more perpetrators and two or more victims. However, a single society, the Tiwi of Australia, accounted for almost all of these group killings.

Some other points of interest: 96 percent of the killers were male. No surprise there. But some readers may be surprised that only two out of 148 killings stemmed from a fight over “resources,” such as a hunting ground, water hole or fruit tree. Nine episodes of lethal aggression involved husbands killing wives; three involved “execution” of an individual in a group by other members of the group; seven involved execution of “outsiders,” such as colonizers or missionaries.

Most of the killings stemmed from what Fry and Soderberg categorize as “miscellaneous personal disputes,” involving jealousy, theft, insults and so on. The most common specific cause of deadly violence — involving either single or multiple perpetrators — was revenge for a previous attack.

These data corroborate a theory of warfare advanced by Margaret Mead in 1940. Noting that some simple foraging societies, such as Australian aborigines, can be warlike, Mead rejected the idea that war was a consequence of civilization. But she also dismissed the notion that war is innate — a “biological necessity,” as she put it — simply by pointing out (as Fry and Soderberg do) that some societies do not engage in intergroup violence.

Mead (again like Fry and Soderberg) found no evidence for what could be called the Malthusian theory of war, which holds that war is the inevitable consequence of competition for resources.

Instead, Mead proposed that war is a cultural “invention” — in modern lingo, a meme — that can arise in any society, from the simplest to the most complex. Once it arises, war often becomes self-perpetuating, with attacks by one group provoking reprisals and pre-emptive attacks by others.

The war meme also transforms societies, militarizes them, in ways that make war more likely. The Tiwi seem to be a society that has embraced war as a way of life. So is the United States of America.

Needless to say, I’m awaiting more research on the subject. But whatever the case is, I think it’s important not to view mass violence in such a fatalistic way. That mentality would only perpetuate a self-fulfilling prophecy, in which we’re more willing to accept war as an institution — or solution — by virtue of its apparent inevitability. This same approach accounts for many other moral and social evils.

Even if such negative behaviors do have deep roots, that’s hardly an excuse for not trying to mitigate their influence. Most human behavior stems from both nature and nurture, and I’m not aware of any human characteristic that falls strictly under one sphere or the other. Thus, there is always some avenue for improvement, albeit through concerted, multidimensional efforts — better material conditions, in combination with quality education (formal and informal), tend to lead to a vast reduction in social ills.

Hormones Influence Ideology

A growing body of research is finding that biological and psychological factors influence the beliefs we otherwise feel are freely decided upon. From social attitudes to religious piety and even voting patterns, it seems that much of our worldview is shaped by deterministic patterns we scarcely notice. An article in MoJo by Chris Mooney offers some pretty interesting details.

Human Nature and Apathy

Many people, myself included, lament the fact that our species is so apathetic toward the widespread suffering that surrounds us. However tragic, such indifference is both natural and expected. Our minds did not evolve to absorb the sheer amount of stimuli that exists in the world.

Only very recently have most humans become regularly exposed to the overwhelming flood of people, events, and information that exists and multiplies all around us. There is a limit to how much we can think about or emotionally react to, and that’s why our immediate suffering — our trivial “first world problems” — is felt far more strongly than the more horrible but distant misery that exists out there. Telling someone that others have it worse is admirable but futile, because our brains feel personal circumstances more substantively and intimately than abstract ones.

It’s for this reason that society obsesses more over individual negative events highlighted in the news than over the bigger but nameless and faceless statistics of human poverty. In fact, this is the same reason you’re more likely to donate to a single suffering individual than to broader charitable causes in general — look up Paul Slovic’s “psychic numbing” phenomenon. In some sense, this may even be a merciful defense mechanism — imagine if all the tremendous suffering in the world were equally impactful. We’d likely succumb to severe depression and misanthropy, or become very withdrawn.

Of course, I’m not saying this excuses callousness or apathy. We can still love and care for one another beyond our closest loved ones. We don’t need to be deeply affected by all the human suffering in the world in order to be troubled by it and seek to alleviate it. Empathy and social responsibility are intrinsic to our species. We must simply adapt to the existence of this new global community and expand our circle of compassion and consideration to be far wider. It’s difficult but not impossible, in my opinion.

What are your thoughts?