
The complete guide to procrastinating at work

Eupraxsophy:

The perfect post for starting the post-Labor Day workweek. I can certainly relate to a lot of what is stated here, in both my personal and professional life. It is nice to see more scientific attention centered on what is no doubt an increasingly common issue in the modern world. As it turns out, there is a rhyme and reason to procrastination beyond mere fatigue or laziness — in fact, it is a completely different creature altogether.

Originally posted on Quartz:

Some research says the best way to spark creativity is to walk away and that the best ideas come from those least-expected “aha!” moments. So maybe procrastination isn’t such a bad thing after all. Or is time spent on those cat memes taking its toll? Can procrastinating ever be a source of productivity?

Here’s the complete guide to procrastinating at work:

Clever people procrastinate smartly

A study in the Creativity Research Journal examined the working habits of a particularly intelligent group of people: winners of the Intel Science Talent Search. It found that the group procrastinated productively. Some used procrastination as a trigger for a helpful amount of stress needed to ignite positive action. Others saw it as a “thought incubator”: They put off making a decision because they wanted to fully process it before finding a solution.

Procrastinate using your to-do list

The same study also found that the tasks the science competition winners were doing while avoiding work were helping in other areas of their…


My Ambivalence on Human Nature

It is astounding how just a cursory glance at the human condition at any given moment can simultaneously reveal so much good and evil. Browsing through one day’s worth of news, I can find such a mixed bag of humanity’s best and worst tendencies. It keeps my mood in constant flux.

I will read an article about an altruistic act, a peace accord, the lifting of millions from poverty, or some other event, on either a micro or macro level, that demonstrates moral and social progress. Then, right after, I will see some stomach-churning demonstration of cruelty, whether it is a heinous crime, warfare, or the immiseration of millions by a seemingly impervious regime.

After so many weeks, months, and years of taking it all in, both academically and autodidactically, it is difficult for me to be consistently cynical or optimistic about the course of humanity. Perhaps this is to be expected, since our thoughts and worldviews are shaped by our experience and knowledge, both of which are continuously changing.

As it stands, I suppose I am currently cautiously optimistic, because we have nonetheless come a long way as a species, even if innumerable vices and problems remain. I see the potential for progress and prosperity, even amid so many reminders of our proneness to fallibility, apathy, and hatred.

I think it is worth acknowledging that suffering would remain even with the best intentions. There are still the vagaries of natural disasters, disease, and simple misfortune (accidents and whatnot). The impact of all these factors can be, and has been, reined in, but I feel only up to a point.

Sorry if my thoughts seem all over the place; this was sort of a stream of consciousness. It goes without saying that I am fortunate to have the luxury to pontificate on such things in so much comfort, largely by an accident of birth.

 

The Awesome Power of Our Divided Brain

The following video from RSA explains how the hemispheric nature of our brains — which is poorly understood by most people — has profoundly affected human behavior, culture, and society. It’s part of a lecture given by renowned psychiatrist and writer Iain McGilchrist, whose full talk can be seen here. I hope you enjoy.

As always, feel free to share your thoughts and feedback.


War and Human Nature

From what I’ve seen, it has become something of a truism to say that war is intrinsic to human nature — the idea being that large-scale violence is not only uniquely human, but inseparable from us, such that it’s hard to imagine human existence without it.

But a recent study is casting doubt on this widely accepted and seemingly verified “Deep Roots Theory” of human violence. Scientific American reports on the research published today in Science, “Lethal Aggression in Mobile Forager Bands and Implications for the Origins of War.”

Of the 21 societies examined by Fry and Soderberg, three had no observed killings of any kind, and 10 had no killings carried out by more than one perpetrator. In only six societies did ethnographers record killings that involved two or more perpetrators and two or more victims. However, a single society, the Tiwi of Australia, accounted for almost all of these group killings.

Some other points of interest: 96 percent of the killers were male. No surprise there. But some readers may be surprised that only two out of 148 killings stemmed from a fight over “resources,” such as a hunting ground, water hole or fruit tree. Nine episodes of lethal aggression involved husbands killing wives; three involved “execution” of an individual in a group by other members of the group; seven involved execution of “outsiders,” such as colonizers or missionaries.

Most of the killings stemmed from what Fry and Soderberg categorize as “miscellaneous personal disputes,” involving jealousy, theft, insults and so on. The most common specific cause of deadly violence—involving either single or multiple perpetrators—was revenge for a previous attack.

These data corroborate a theory of warfare advanced by Margaret Mead in 1940. Noting that some simple foraging societies, such as Australian aborigines, can be warlike, Mead rejected the idea that war was a consequence of civilization. But she also dismissed the notion that war is innate—a “biological necessity,” as she put it—simply by pointing out (as Fry and Soderberg do) that some societies do not engage in intergroup violence.

Mead (again like Fry and Soderberg) found no evidence for what could be called the Malthusian theory of war, which holds that war is the inevitable consequence of competition for resources.

Instead, Mead proposed that war is a cultural “invention”—in modern lingo, a meme—that can arise in any society, from the simplest to the most complex. Once it arises, war often becomes self-perpetuating, with attacks by one group provoking reprisals and pre-emptive attacks by others.

The war meme also transforms societies, militarizes them, in ways that make war more likely. The Tiwi seem to be a society that has embraced war as a way of life. So is the United States of America.

Needless to say, I’m awaiting more research on the subject. But whatever the case, I think it’s important not to view mass violence in such a fatalistic way. That mentality would only perpetuate a self-fulfilling prophecy, in which we’re more willing to accept war as an institution — or solution — by virtue of its apparent inevitability. The same fatalism helps sustain many other moral and social evils.

Even if such negative behaviors do have deep roots, that’s hardly an excuse for not trying to mitigate their influence. Most human behavior stems from both nature and nurture, and I’m not aware of any human characteristic that falls strictly under one sphere or the other. Thus, there is always some avenue for improvement, albeit through concerted, multidimensional efforts — better material conditions, combined with quality education (formal and informal), tend to lead to a vast reduction in social ills.

Hormones Influence Ideology

A growing body of research is finding that biological and psychological factors influence the beliefs we otherwise feel are freely decided upon. From social attitudes to religious piety and even voting patterns, it seems that much of our worldview is shaped by deterministic patterns we scarcely notice. An article in Mother Jones by Chris Mooney offers some pretty interesting details.

Human Nature and Apathy

Many people, myself included, lament the fact that our species is so apathetic to the widespread suffering all around us. However tragic, such indifference is both natural and expected. Our minds did not evolve to absorb the sheer amount of stimulus that exists in the world.

Only very recently have most humans become regularly exposed to the overwhelming amount of people, events, and information that exists and multiplies all around us. There is a limit to how much we can think about or emotionally react to, and that’s why our immediate suffering — our trivial “first world problems” — is felt far more strongly than the more horrible but distant misery that exists out there. Telling someone that others have it worse is admirable but futile, because our brains feel personal circumstances more substantively and intimately than abstract ones.

It’s for this reason that society will obsess more over individual negative events highlighted in the news than over the bigger but nameless and faceless statistics of human poverty. In fact, this is the same reason you’re more likely to donate to an individual suffering person than to a broader charitable cause in general — look up Paul Slovic’s “psychic numbing” phenomenon. In some sense, this may even be a merciful defense mechanism — imagine if all the tremendous suffering in the world were equally impactful. We’d likely succumb to severe depression and misanthropy, or become very withdrawn.

Of course, I’m not saying this excuses callousness or apathy. We can still love and care for one another beyond our closest loved ones. We don’t need to be deeply affected by all the human suffering in the world in order to be troubled by it and seek to alleviate it. Empathy and social responsibility are intrinsic to our species. We must simply adapt to the existence of this new global community and expand our circle of compassion and consideration far wider. It’s difficult but not impossible, in my opinion.

What are your thoughts?

Beauty and Brains

I find it interesting that whenever a very attractive person — particularly a woman — demonstrates above-average intelligence or skill, it genuinely surprises most people. Similarly, I’ve seen people marvel at how a “nerdy” person can be athletic or charismatic. Needless to say, those peers who are both attractive and intelligent feel endless frustration at being reflexively labeled based solely on their looks and initial impression.

But this is nothing new, as humans evolved to make quick judgments based on little data — it’s a survival mechanism that has remained, often misapplied, in the modern world. In this instance, we seem to unconsciously associate good looks with stupidity or, at most, average intelligence (admittedly, I think even I have been guilty of this visceral stereotyping).

I’ve read a hypothesis suggesting that this correlation reflects a form of evolutionary compensation: if one isn’t attractive, they make up for it by making themselves desirable in other ways; similarly, an unskilled or unintelligent person may harness whatever charisma or physical attractiveness they have to influence others or burnish their image. We see this pattern and therefore apply it in how we judge and analyze people.

In any case, it is interesting to note that traditionally (and for the most part to this day), heroic and virtuous characters in various media have almost always been portrayed as good looking, and intelligence is rarely shown to be mutually exclusive with physical attractiveness. Of course, this too likely reflects our evolutionarily-induced preference for well-rounded, attractive people.

Anyway, has anyone else noticed this? Is there a reason for these correlations? What are your thoughts on this?


The “end of history illusion” describes an almost universal phenomenon among human beings, in which we have a tendency to see the present as the stopping point for any further change in our lives. Once we reach a certain age, we essentially assume that from then on we’ll remain the same. It’s a little difficult to describe, given that the “present” keeps sliding forward as we age. But the New York Times has a great article on it:

When we remember our past selves, they seem quite different. We know how much our personalities and tastes have changed over the years. But when we look ahead, somehow we expect ourselves to stay the same, a team of psychologists said Thursday, describing research they conducted of people’s self-perceptions.

They called this phenomenon the “end of history illusion,” in which people tend to “underestimate how much they will change in the future.” According to their research, which involved more than 19,000 people ages 18 to 68, the illusion persists from teenage years into retirement.

“Middle-aged people — like me — often look back on our teenage selves with some mixture of amusement and chagrin,” said one of the authors, Daniel T. Gilbert, a psychologist at Harvard. “What we never seem to realize is that our future selves will look back and think the very same thing about us. At every age we think we’re having the last laugh, and at every age we’re wrong.”

Trying to explain this tendency yields even more interesting considerations. After all, if people acknowledge how much they’ve changed over the years, why can’t they seem to realize that such change will continue?

People seemed to be much better at recalling their former selves than at imagining how much they would change in the future.

Why? Dr. Gilbert and his collaborators, Jordi Quoidbach of Harvard and Timothy D. Wilson of the University of Virginia, had a few theories, starting with the well-documented tendency of people to overestimate their own wonderfulness.

“Believing that we just reached the peak of our personal evolution makes us feel good,” Dr. Quoidbach said. “The ‘I wish that I knew then what I know now’ experience might give us a sense of satisfaction and meaning, whereas realizing how transient our preferences and values are might lead us to doubt every decision and generate anxiety.”

Or maybe the explanation has more to do with mental energy: predicting the future requires more work than simply recalling the past. “People may confuse the difficulty of imagining personal change with the unlikelihood of change itself,” the authors wrote in Science.

But it’s false comfort, as this mentality does have its caveats:

The phenomenon does have its downsides, the authors said. For instance, people make decisions in their youth — about getting a tattoo, say, or a choice of spouse — that they sometimes come to regret.

I think it comes down to the nature of the human mind. Our brains are limited in their capacity to look into the future. Our senses and perceptions are shaped by the here and now, not by a hypothetical future that is far off — and therefore difficult to grasp, let alone feel concerned about. Try as we might, we’re just too cognitively limited.


Homeless Man Donates Handouts to Fellow Homeless

This exemplary human being has given away over $9,000 he collected through panhandling to a fellow homeless mother and her child. When many better-off people can’t be bothered to give the less fortunate the time of day, a man who is scarcely getting by still finds the means and the love to give to others. This is a very inspiring story. I especially like the news anchor’s statement towards the end.

Hat tip to my friend Ray for sharing this with me.

Lying, by Sam Harris

The following is an excerpt from a relatively new e-book by neuroscientist Sam Harris titled Lying. It’s an in-depth analysis of the psychology and ethics of deception, and it is one of the most interesting things I’ve ever read on the subject.

At least one study suggests that 10 percent of communication between spouses is deceptive. Another has found that 38 percent of encounters among college students contain lies. However, researchers have discovered that even liars rate their deceptive interactions as less pleasant than truthful ones. This is not terribly surprising: We know that trust is deeply rewarding and that deception and suspicion are two sides of the same coin. Research suggests that all forms of lying—including white lies meant to spare the feelings of others—are associated with poorer-quality relationships.

Once one commits to telling the truth, one begins to notice how unusual it is to meet someone who shares this commitment. Honest people are a refuge: You know they mean what they say; you know they will not say one thing to your face and another behind your back; you know they will tell you when they think you have failed—and for this reason their praise cannot be mistaken for mere flattery.

Honesty is a gift we can give to others. It is also a source of power and an engine of simplicity. Knowing that we will attempt to tell the truth, whatever the circumstances, leaves us with little to prepare for. We can simply be ourselves.

This is one of many sections that stood out to me, and I highly recommend this for everyone, given the ubiquity of the issue. You can download the e-book, which is less than a hundred pages, for a mere three dollars on Harris’s website.

I’ve been reflecting a lot on the implications of this data. Lying is so prevalent, even between confidants, that it makes me wonder whether it’s just a natural part of being human. Every society has a moral prohibition against dishonesty – indeed, the importance of truthfulness is one of the few universal norms across human societies – yet we seem unable to rein in our own fibbing, let alone keep others in line.

If everyone is a hypocrite (albeit to varying degrees), who has the moral high ground with respect to lying? Heck, who can we even trust to be truthful? Even the most seemingly honest person can turn out to be an expert fibber.

And if lies of all kinds factor into our daily interactions, what good would truth-telling be in the long run? It may help your reputation in some respects, but it may also hinder you in others. After all, many people don’t take the truth as well as they claim they would. Honesty is valued in principle, but I’ve long observed (and been guilty of) the ambivalence people have towards being given a truth they don’t want to hear. Many of us have an almost duplicitous attitude towards honesty – we like it so long as it doesn’t inconvenience us or our own neat perception of the world.

In light of all this, is it possible to imagine a world with less lying? Is it possible to go through life with only a minimal amount of deception? Is lying really all that bad if everyone does it, and if society and human psychology seem tacitly structured around it? I don’t mean to sound cynical or misanthropic – I’m far from it – but I think this is something to think about. Please, share your thoughts.