Link

The Market Meltdown We’re All Paying For

Some might not like the tone or source of this article, but I think it’s appropriate given the blood-boiling assessment it makes:

Deregulation of the banks had been key to the 2000s boom. During preceding years, especially under Democratic President Bill Clinton, the federal government’s rules for financial institutions and operations were rewritten to meet Wall Street’s thirst for bigger gambles. For instance, parts of the Glass-Steagall Act of 1933, instituted to put firewalls between commercial banks, insurance companies, securities firms and investment banks in order to stop another 1930s-style market panic, were gutted.

As a result, financial institutions were free to invest in more exotic and riskier financial products–with bigger payoffs all around. The people at the head of firms like Lehman Brothers were raking it in–making the “greed is good” bloodsuckers of Wall Street during the 1980s look “fiscally responsible.”

Take Richard Fuld, who ran Lehman from 1994 until 2008. For his “hard work” at the high-finance equivalent of playing the ponies, the “Gorilla of Wall Street,” as Fuld was known, made the list of America’s 25 highest-paid executives for eight years in a row–until the very year the bank collapsed. He raked in nearly $500 million in compensation during his time as CEO, and he still owns three “homes”–mansions in Greenwich, Conn., and Jupiter Island, Fla., and a ranch in Sun Valley, Idaho.

From the first days of the financial crisis, the “experts” heaped blame for Wall Street’s meltdown on ordinary people–workers who “caused” the housing crash because they “bought homes they couldn’t afford,” for example.

But the real culprits were parasites like Richard Fuld. No one could possibly claim that Fuld or his fellow banksters contributed anything to the good of society as a whole. On the contrary, they sucked billions and billions of dollars into their Wall Street casino for the sole aim of making a tiny group of people rich beyond the wildest dreams of most people.

Furthermore, these profitable institutions and their wealthy bosses are doing better than ever, even while the overwhelming majority of Americans continue to suffer:

In March, the Dow Jones stock market indicator returned to the high point of the 2000s boom almost five years before, erasing a more than 50 percent loss during the crisis–and it’s continued climbing since. The profitability of the banks and corporations had surpassed pre-recession levels long before. Since the crisis times of late 2008, corporate profits have increased at a rate of more than 20 percent every single year, according to the New York Times.

For the elite at the very top of society, things have never been better. According to IRS figures, virtually all the gains from the economic recovery since 2009–95 percent–have gone to the top 1 percent. More than 60 percent of the gains went to the top 0.1 percent–that is, people with annual incomes of more than $1.9 million.

As anyone reading this article almost certainly knows, the working-class majority in the U.S. is living in a very different world.

Some 11.3 million Americans are unemployed, according to the Bureau of Labor Statistics, and tens of millions more have dropped out of the workforce or struggle to make ends meet with part-time jobs. Worker productivity is up, having gained nearly 25 percent from 2000 to 2012, according to an August report from the Economic Policy Institute, while “wages were flat or declined for the entire bottom 60 percent” of the workforce.

The foreclosure crisis that followed the Wall Street crash has largely left the news, but it’s not over. Home prices have begun to rebound generally, but many working-class families still aren’t out from under extreme mortgage debt. Almost 25 percent of homeowners with a mortgage were “underwater”–meaning they owe more on their loans than their homes are worth–as of the second quarter of 2013, according to the Zillow real estate database.

To add insult to injury, we the people, who have already disproportionately suffered from Wall Street’s malfeasance, have also played an unwitting role in helping it prosper:

Working with the support of congressional Democrats, including presidential candidate Barack Obama, the Bush administration–despite its claimed reverence for the free market–put together the $700 billion Troubled Asset Relief Program (TARP), giving the Treasury Department the authority to take over bad debts and pump cash into major financial institutions. In addition to this, the government eventually committed trillions of dollars to various programs to help the banks.

When it took over, the Obama administration–its Treasury Department staffed by the same Federal Reserve officials who presided over the crisis, alongside plenty of former executives from Goldman Sachs and other banks–adopted the Bush proposal almost without alteration.

When anyone questioned why the government was pouring taxpayer dollars into institutions that had gambled their way into crisis, the answer from the establishment was that a financial industry strengthened by the TARP and reined in by new financial “reforms” would be able to lend money to finance new investments.

The exact opposite happened–banks and other institutions tightened up on every form of credit. For example, in the first three months of 2012, JPMorgan Chase, Wells Fargo, Bank of America and Citigroup cut their lending by a collective $24 billion, nearly wiping out the $34 billion increase in lending from the whole of the previous year.

With all the federal funds sloshing around uselessly in the financial system, the banksters instead started banking guaranteed, risk-free profits–by taking money from the Federal Reserve lent to financial institutions at an effective interest rate of 0 percent, and lending it back to the government through the purchase of Treasury bills at 3 percent interest.
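
To make that arithmetic concrete, here is a minimal sketch of the carry trade described above. The $10 billion balance is an illustrative assumption of mine; the 0 and 3 percent rates come from the passage itself:

```python
# Sketch of the risk-free carry trade described above: borrow from the Fed
# at an effective 0% rate and buy Treasuries yielding ~3%.
# The $10 billion principal is an illustrative assumption, not a reported figure.
borrowed = 10_000_000_000      # dollars borrowed at the Fed's ~0% rate
fed_rate = 0.00
treasury_yield = 0.03

annual_profit = borrowed * (treasury_yield - fed_rate)
print(f"Guaranteed annual profit: ${annual_profit:,.0f}")  # -> $300,000,000
```

On a balance that size, every percentage point of spread is pure, riskless profit, which is why the arrangement was so attractive to the banks.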

There is little question that this system is broken, plutocratic, and ever less representative of the people, much less interested in the general welfare. I think there is still hope for reform, both political and economic, but it will take a long time and a lot of hard work and unity on the part of the public. How much longer until this systemic injustice becomes unbearable?

Link

TV Host Gets Plastic Surgery to “Get Ahead”

Julie Chen, who’s hosted several prominent television series, recently admitted to having undergone eyelid surgery many years ago in order to look “less Chinese.”

Chen was working as a local reporter in Dayton, Ohio, almost 20 years ago and wanted a chance to be an anchor. What her news director told her at age 25 is pretty startling.

“He said, ‘You will never be on this anchor desk, because you’re Chinese,’” Chen revealed to her co-hosts. “He said, ‘Let’s face it Julie, how relatable are you to our community? … Because of your Asian eyes, sometimes I’ve noticed that when you’re on camera and you’re interviewing someone, you look disinterested, you look bored because your eyes are so heavy, they are so small.’”

Chen said his speech felt like a dagger to her heart.

“It felt like a weird, grown-up version of racism in the workplace,” she added. “I started developing a complex.”

To make matters worse, she met with agents for career advice, only to hear from a “big-time” agent, “‘I cannot represent you unless you get plastic surgery to make your eyes look bigger,’” Chen recalled. “And I did it.”

Stories like this are why I frankly don’t blame folks for getting cosmetic surgery. Given that so many people are savaged for their looks or pressured (directly or implicitly) to change their appearance, such a decision is sadly understandable (albeit not in every case). There’s plenty of empirical evidence that attractive people enjoy an edge in how they’re treated and judged, even in unrelated areas such as perceived talent (e.g. the halo effect).

Granted, the pressure to look good — and the subsequent benefit of doing so — is nothing new. It’s just that there are now newer and better options for doing so. It’s interesting to note that people who get surgery receive a lot of flak for that decision as well, which drives home the point that unless you win the genetic lottery and happen to already look a certain way, you’re disadvantaged in certain areas or social circles regardless.

Chen’s decision is also indicative of the fact that beauty in our society, as well as in others influenced by our culture, is increasingly defined by looking as close to Caucasian as possible. This is evident in the fact that few non-white women reach prominence in fashion, film, or other public venues, and those who do make it tend to look closer to “white” — hence the growing worldwide popularity of methods like skin bleaching and eyelid surgery.

To be clear, provided they do it safely and within reason, I don’t think anyone should face additional scrutiny or be looked down upon for changing their appearance in this way. It’s yet another innovation in our historical, socially conditioned obsession with beauty, just as makeup, hair dye, and other methods once were. Granted, it’s the most long-lasting and radical means (so far), but the motivation and concept remain the same.

People are entitled to do what they will with their appearance, just as they’re allowed to let themselves go and defy standard conventions of beauty. Now, there are certainly cases where people risk their health, finances, and (ironically) their appearance in order to look a certain way. It’s been argued that such instances point to psychological and personal problems that must be addressed; in those cases, I’d be worried and would want to get involved.

Of course, I’m not saying it’s necessarily a good thing that people feel the need to go to these lengths, just that it’s unfortunately driven by social and (arguably) natural conditioning that’s difficult to resist. If we want to minimize the practice, we need to stop privileging attractiveness and telling people they can’t follow their dreams unless they look a certain way.

As always, however, I could be wrong, at which point I invite you to share your own thoughts. 

Link

The following excerpt is from a post on Brute Reason discussing the problems with using psychiatric terms in a colloquial and metaphorical sense.

These words are used so casually that our conception of their meaning gradually shifts without our even noticing it. It’s like a boy-who-cried-wolf type of situation in that regard. If nine different friends joke to you about how they’re ‘sooooo OCD’ because they like all their books organized just so on their shelf (a situation familiar to just about every bibliophile, honestly), then the tenth friend who comes to you and tells you that they have OCD is probably going to evoke that mental image, rather than one of someone who actually can’t stop obsessing over particular little things and carrying out rituals that interfere with that person’s normal functioning, perhaps to the point of triggering comorbid disorders like depression. This may be a person who washes their hands until they are raw and hurting, someone who has to flick the light switch on and off seven times every time they leave a room, or someone who has recurring, uncontrollable thoughts about hurting someone they love even though they have no actual desire to do that.

Well, that sounds a little different than insisting that your books be categorized by subject and then alphabetized by author, no?

Likewise, if your friends are constantly telling you they’re ‘depressed’ because their team lost or because they got a bad grade, only to return to their normal, cheerful selves within a few hours, the next person who tells you that they are “depressed” might elicit a reaction of, ‘Come on, get over it! You’ll feel better if you go out with us.’

And so the meanings of words change.

We must either change the way these words are used, or at the very least recognize the nuance in their meaning — not everyone who says they have anxiety or depression actually does, in the clinical sense; moreover, those who do make a serious claim to such conditions should be given the benefit of the doubt, and not assumed to be displaying mere personality quirks or the like.

Thoughts?
Link

Happy 194th Birthday Léon Foucault

Two centuries ago, it was difficult for scientists to model intricate planetary orbits. Léon Foucault helped devise a method to make celestial orbits a bit easier to understand. 

Wednesday marks the 194th anniversary of the French physicist’s birth. To celebrate Mr. Foucault and his breakthrough pendulum, let’s take a look at how he was able to model Earth’s rotation.

Jean Bernard Léon Foucault was born in Paris in 1819. While Foucault received a medical education, the profession did not quite suit him. The young doctor is said to have had a distaste for bloody medical dissections. But Foucault was brilliant when it came to making models, tools, and devices.

And Foucault’s craftsmanship came in handy.

Foucault and a series of teachers, bosses, and partners tackled many scientific questions by building contraptions that could make hard-to-grasp phenomena more tangible. Foucault was able to measure the speed of light. He improved the daguerreotype, an early form of photography. He found a way to prove that light is a wave, not a beam of particles. He named the gyroscope, a stabilizing tool found in everything from toys to the International Space Station.

In 1851, Foucault performed one of his best-remembered experiments: the scientist devised the first model to demonstrate the rotation of the earth on its axis.

People had tried many different ways to explain Earth’s rotation before Foucault. One group had even launched cannon balls up into the air in the hope that the world would spin enough that they could measure the deviation once the ball plummeted back to earth. Compared to that loud, inaccurate (and dangerous) plan, Foucault’s solution was remarkably elegant. He strung up a brass weight at the end of a six-foot wire. The metal ball hung over a pile of damp sand, just close enough that the brass brushed against the sand as it swung slowly back and forth. At first, the pendulum simply carved a straight line in the sand. But over the course of several hours, the line turned into a bow-tie shape.

Newton’s laws of motion state that an object will not change direction unless another force acts on it. This means that while Foucault’s pendulum kept swinging in the same direction, the earth (and the sand on the ground) turned underneath it. It’s as if you drew a line back and forth repeatedly on a piece of paper, but then slowly rotated the sheet as you kept drawing – eventually the lines would form a circle.
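
The geometry behind this is compact enough to compute: the swing plane appears to rotate 360 degrees per sidereal day, scaled by the sine of the latitude. Here is a minimal sketch of that relationship (the Paris latitude figure is approximate):

```python
import math

# Foucault pendulum precession: at latitude phi, the swing plane appears to
# rotate 360 degrees per sidereal day, scaled by sin(phi).
SIDEREAL_DAY_HOURS = 23.934

def precession_period_hours(latitude_deg: float) -> float:
    """Hours for the swing plane to trace a full circle at a given latitude."""
    return SIDEREAL_DAY_HOURS / math.sin(math.radians(latitude_deg))

for place, lat in [("Paris (Pantheon)", 48.85), ("North Pole", 90.0)]:
    print(f"{place}: ~{precession_period_hours(lat):.1f} hours per full circle")
# Paris: ~31.8 hours; North Pole: exactly one sidereal day. At the equator,
# sin(0) = 0 and the swing plane never precesses at all.
```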

Foucault’s experiment became a sensation. The French government even ordered a large-scale version that would hang inside the Pantheon in Paris, with a 219-foot, 61-pound pendulum suspended from the building’s dome. Modern-day pendulums hang in the United Nations headquarters in New York, the California Academy of Sciences in San Francisco, the Boston Museum of Science, and many other locations.

Link

Have Young Americans Lost Their Moral Compass?

In recent months there has been a visible struggle in the media to come to grips with the leaking, whistle-blowing and hacktivism that has vexed the United States military and the private and government intelligence communities. This response has run the gamut. It has involved attempts to condemn, support, demonize, psychoanalyze and in some cases canonize figures like Aaron Swartz, Jeremy Hammond, Chelsea Manning and Edward Snowden.

In broad terms, commentators in the mainstream and corporate media have tended to assume that all of these actors needed to be brought to justice, while independent players on the Internet and elsewhere have been much more supportive. Tellingly, a recent Time magazine cover story has pointed out a marked generational difference in how people view these matters: 70 percent of those age 18 to 34 sampled in a poll said they believed that Snowden “did a good thing” in leaking the news of the National Security Agency’s surveillance program.

So has the younger generation lost its moral compass?

No. In my view, just the opposite.

The article is a pretty engaging read; I recommend checking it out and deciding for yourselves.

Link

A Detailed Report of US Mass Shootings From 1982 to 2013

This sobering Google spreadsheet provides a record of all mass and spree shootings that have occurred in the US over the last 30 years. Aside from the usual stats — such as the name of the perpetrator, location, number of fatalities — it also includes a summary of the crime, any known motive of the killer, and whether or not they had a confirmed history of mental illness.

It’s well-sourced and regularly updated, having unfortunately grown quite a bit over the last couple of years (coinciding with other reports that have found an increase in mass shootings despite an overall drop in crime). Needless to say, it’s a somber read, and an awful reminder of the unusually high incidence of gun massacres in this country — the reasons for which will be explored another day.

Link

A Woman of Indian Descent Wins Miss America — And the Bigots Come Out of the Woodwork

Nina Davuluri is the first Indian-American (and only the second woman of Asian descent) to win the pageant. Despite her achievements, the poor woman has had to contend with widely reported racism and denunciation.

She was referred to as “the Arab” by some, while other commenters noted, “This is America, not India.” One called her “Miss 7-11.”

There were those who huffed about it being inappropriately close to 9/11.

Some mentioned ties to al-Qaeda and at least one flat-out called her a terrorist.

Feel free to click the link and see some stomach-churning examples for yourself.

My personal favorite is the quip, “This is America, not India.” Setting aside the fact that — shockingly — most Americans are not originally from America, I’m sure our indigenous people had a similar line of thinking throughout our history. 

This is yet another disturbing example of the cultural, historical, and geographical ignorance of many Americans, to say nothing of their narrow definition of what constitutes “American” — white-looking, through and through. Such a mentality is anathema to the founding principles of this nation, which, although marred by the realities of slavery and indigenous exclusion, nonetheless reflect an inclusive and transcendent ideal of nationhood.

There is nothing about this woman, or any nonwhite and/or foreign-born person, that makes them any less American, or any less trustworthy, valuable, and dignified. Heaven forbid that we reflect the fundamental American values of equality and fairness by having someone who isn’t a WASP reach social, political, or public prominence — even in a venue as tepid as Miss America.

I’m happy that she seems to be taking it all in stride (maybe she saw it coming). While I don’t care much for these pageants, I certainly don’t think their participants deserve to be savaged for their appearance or ethnicity, especially when it’s a positive step to see more people of color represented in prominent venues. I’m also pleased to see far more backlash against this bigotry than support for it, which is a nice reminder of the better side of our nation.

Interestingly, the runner-up was of Chinese ancestry; I’m sure she would have been subject to stupid comments as well (albeit perhaps not as vitriolic, since Asians are often perceived to be “better foreigners” than their “Arab-looking” counterparts).

Link

The 30th Anniversary of the Sabra and Shatila Massacres

On the night of Sept. 16, 1982, the Israeli military allowed a right-wing Lebanese militia to enter two Palestinian refugee camps in Beirut. In the ensuing three-day rampage, the militia, linked to the Maronite Christian Phalange Party, raped, killed and dismembered at least 800 civilians, while Israeli flares illuminated the camps’ narrow and darkened alleyways. Nearly all of the dead were women, children and elderly men.

Thirty years later, the massacre at the Sabra and Shatila camps is remembered as a notorious chapter in modern Middle Eastern history, clouding the tortured relationships among Israel, the United States, Lebanon and the Palestinians. In 1983, an Israeli investigative commission concluded that Israeli leaders were “indirectly responsible” for the killings and that Ariel Sharon, then the defense minister and later prime minister, bore “personal responsibility” for failing to prevent them.

While Israel’s role in the massacre has been closely examined, America’s actions have never been fully understood. This summer, at the Israel State Archives, I found recently declassified documents that chronicle key conversations between American and Israeli officials before and during the 1982 massacre. The verbatim transcripts reveal that the Israelis misled American diplomats about events in Beirut and bullied them into accepting the spurious claim that thousands of “terrorists” were in the camps. Most troubling, when the United States was in a position to exert strong diplomatic pressure on Israel that could have ended the atrocities, it failed to do so. As a result, Phalange militiamen were able to murder Palestinian civilians, whom America had pledged to protect just weeks earlier.

Link

Why Even the Smartest People Fail at Reason

Being reasonable isn’t easy. Heck, for all our intelligence, it doesn’t even come naturally, as more and more studies are demonstrating:

One provocative hypothesis is that the bias blind spot arises because of a mismatch between how we evaluate others and how we evaluate ourselves. When considering the irrational choices of a stranger, for instance, we are forced to rely on behavioral information; we see their biases from the outside, which allows us to glimpse their systematic thinking errors. However, when assessing our own bad choices, we tend to engage in elaborate introspection. We scrutinize our motivations and search for relevant reasons; we lament our mistakes to therapists and ruminate on the beliefs that led us astray.

The problem with this introspective approach is that the driving forces behind biases—the root causes of our irrationality—are largely unconscious, which means they remain invisible to self-analysis and impermeable to intelligence. In fact, introspection can actually compound the error, blinding us to those primal processes responsible for many of our everyday failings. We spin eloquent stories, but these stories miss the point. The more we attempt to know ourselves, the less we actually understand.

I think this ultimately (and obviously) validates the importance of engaging in dialogue with others and leaving yourself open to criticism.

But then again, our own biases make it hard to accept criticism of our deeply held beliefs, which is where science, reason, and other methodologies come into play. Yet even these can be misused or misunderstood.

So basically, trying to figure out the world and what is true is very, very hard and requires constant vigilance… go figure.

Link

Evolution Doesn’t Favor Selfishness

When most people think of evolution, they imagine a coldly efficient process whereby only the strongest and most ruthless survive, e.g. “survival of the fittest” and social Darwinism. People often use this mischaracterization to paint naturalists — namely atheists — as immoral, or to justify a cynical and misanthropic attitude towards human nature.

But setting aside the folly of deriving so many vast implications from a natural mechanistic process, new research suggests that evolution isn’t as selfish as widely assumed. It’s a somewhat long but interesting read.

A team from Michigan State University, US, used a model of the prisoner’s dilemma game, where two suspects who are interrogated in separate prison cells must decide whether or not to inform on each other.

In the model, each person is offered a deal for freedom if they inform on the other, putting their opponent in jail for six months. However, this scenario will only be played out if the opponent chooses not to inform.

If both “prisoners” choose to inform (defection) they will both get three months in prison, but if they both stay silent (cooperation) they will both only get a jail term of one month.

The eminent mathematician John Nash showed that the optimum strategy was not to cooperate in the prisoner’s dilemma game.
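
Using the jail terms quoted above, Nash’s point can be verified in a few lines of code: whatever the other prisoner does, informing always yields a shorter sentence, so mutual defection is the equilibrium. A minimal sketch (the strategy labels are my own):

```python
# Jail terms (months) from the setup above, indexed as JAIL[(mine, theirs)].
# "D" = defect (inform), "C" = cooperate (stay silent); labels are my own.
JAIL = {
    ("C", "C"): 1,  # both stay silent: one month each
    ("C", "D"): 6,  # I stay silent, partner informs: I serve six months
    ("D", "C"): 0,  # I inform, partner stays silent: I go free
    ("D", "D"): 3,  # both inform: three months each
}

for theirs in ("C", "D"):
    best = min(("C", "D"), key=lambda mine: JAIL[(mine, theirs)])
    print(f"If my partner plays {theirs}, my best reply is {best}")
# Defecting is the best reply either way, so two rational prisoners serve
# three months each, even though mutual silence would cost only one month.
```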

Co-operating is key for evolution

“For many years, people have asked that if he [Nash] is right, then why do we see cooperation in the animal kingdom, in the microbial world and in humans,” said lead author Christoph Adami of Michigan State University.

The answer, he explained, was that communication was not previously taken into account.

In 1976, Richard Dawkins published a gene-centred view of Charles Darwin’s theory of natural selection.

He argued that it was not groups or organisms that adapt and evolve, but individual genes and each living organism’s body was a survival machine for its genes.

Prof Andrew Coleman from Leicester University explains that this new work suggests that co-operation helps a group evolve, but does not argue against the selfish gene theory of evolution.

Rather, he adds, it helps selfish genes survive as they reap the rewards of inhabiting co-operative groups.

“The two prisoners that are interrogated are not allowed to talk to each other. If they did they would make a pact and be free within a month. But if they were not talking to each other, the temptation would be to rat the other out.

“Being mean can give you an advantage on a short timescale but certainly not in the long run – you would go extinct.”

These latest findings contradict a 2012 study that found selfish people could get ahead of more co-operative partners, which would create a world full of selfish beings.

This was dubbed a “mean and selfish” strategy and depended on a participant knowing their opponent’s previous decision and adapting their strategy accordingly.

Crucially, in an evolutionary environment, knowing your opponent’s decision would not be advantageous for long because your opponent would evolve the same recognition mechanism to also know you, Dr Adami explained.

This is exactly what his team found, that any advantage from defecting was short-lived. They used a powerful computer model to run hundreds of thousands of games, simulating a simple exchange of actions that took previous communication into account.
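
The article doesn’t include the team’s actual code, but the flavor of such simulations is easy to sketch. Below is an Axelrod-style toy model (my own illustration, not the MSU model): replicator dynamics over three strategies in the repeated prisoner’s dilemma, using approximate long-run per-round payoffs. It reproduces the pattern described here, where defection pays off briefly and then collapses:

```python
# Toy replicator dynamics for the repeated prisoner's dilemma; an illustrative
# sketch, not the MSU team's actual model. "TFT" is tit-for-tat.
STRATEGIES = ["TFT", "AllC", "AllD"]
# PAYOFF[i][j]: approximate average per-round payoff of strategy i vs strategy j.
PAYOFF = [
    [3.0, 3.0, 1.0],  # TFT: cooperates with cooperators, punishes defectors
    [3.0, 3.0, 0.0],  # AllC: fully exploited by defectors
    [1.0, 5.0, 1.0],  # AllD: feasts on AllC, otherwise earns only the punishment payoff
]

shares = [0.10, 0.45, 0.45]  # assumed initial population mix
for generation in range(201):
    fitness = [sum(PAYOFF[i][j] * shares[j] for j in range(3)) for i in range(3)]
    mean = sum(f * s for f, s in zip(fitness, shares))
    shares = [s * f / mean for s, f in zip(shares, fitness)]
    if generation % 50 == 0:
        print(generation, {name: round(s, 3) for name, s in zip(STRATEGIES, shares)})
# Defection pays only while unconditional cooperators remain to be exploited;
# once they are gone, tit-for-tat outscores AllD and takes over the population.
```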

“What we modelled in the computer were very general things, namely decisions between two different behaviours. We call them co-operation and defection. But in the animal world there are all kinds of behaviours that are binary, for example to flee or to fight,” Dr Adami told BBC News.

“It’s almost like what we had in the cold war, an arms race — but these arms races occur all the time in evolutionary biology.”

Social insects

Prof Andrew Coleman of Leicester University, UK, said this new work “put a brake on over-zealous interpretations” of the previous strategy, which proposed that manipulative, selfish strategies would evolve.

“Darwin himself was puzzled about the co-operation you observe in nature. He was particularly struck by social insects,” he explained.

“You might think that natural selection should favour individuals that are exploitative and selfish, but in fact we now know after decades of research that this is an oversimplified view of things, particularly if you take into account the selfish gene feature of evolution.

“It’s not individuals that have to survive, it’s genes, and genes just use individual organisms – animals or humans – as vehicles to propagate themselves.”

“Selfish genes” therefore benefit from having co-operative organisms.

Human nature needn’t be so ruthless or competitive. For a social species like ours, working together not only makes ethical sense; it’s simply more practical.