Today’s Google Doodle honors Grace Murray Hopper (December 9, 1906 – January 1, 1992), an American computer scientist and Rear Admiral in the US Navy who pioneered computer programming.
Hopper was one of the first programmers of the Harvard Mark I, among the most advanced computers of its time. She also developed the first compiler, a program that translates human-written source code into instructions a machine can execute. She was among the first to champion the idea of machine-independent programming languages, a vision that contributed to the development of COBOL, one of the earliest programming languages for modern computers.
She also popularized the term “debugging” for fixing computer glitches, thanks to an incident in which an actual moth got into a computer and caused it to malfunction; upon removing it, she remarked that they were “debugging” the system.
Hopper was awarded 40 honorary degrees from universities worldwide during her lifetime. Her remarkable accomplishments and her naval rank earned her the nickname “Amazing Grace.”
Contrary to popular belief, the much-maligned use of the word “ax” in place of “ask” is neither new nor technically an error. The pronunciation can be traced back to the eighth century and derives from the Old English verb “acsian”, which meant “to demand”, “inquire”, or “seek”.
Everyone from Geoffrey Chaucer to the writers of the first English-language Bible utilized it. Nowadays, it’s associated mostly with people from the American South or the Caribbean, particularly those of African and Indian descent. Unfortunately, it also has a negative connotation, with many perceiving it as low-class or unprofessional.
NPR had a fascinating interview with Pulitzer Prize-winning journalist Paul Salopek, who has embarked on a journey I’ve long dreamed about: traveling the world on foot.
“There’s something about moving across the surface of the earth at 3 miles per hour that feels really good,” he tells NPR’s Steve Inskeep.
Salopek plans to walk 21,000 miles total — from Africa to the Middle East, across Asia, down through Alaska and all the way to Tierra del Fuego. He calls it the “Out of Eden Walk” because the idea is to follow the path of human migration.
Along the way, he’s documenting the journey for National Geographic magazine. In fact, his journey is the cover story in this month’s issue, with photos by John Stanmeyer.
Salopek is currently 10 months into the journey, and just crossed the border into Jordan from Saudi Arabia. He has faced numerous natural obstacles, he says, like extreme temperatures and dust devils, as well as manmade obstacles that are vastly different from what early Homo sapiens would have encountered.
It’s remarkable to imagine just how much our early ancestors went through. If we think traveling is difficult and expensive now, imagine being among the very first to do so, without the benefit of accumulated knowledge or technology. I hope to engage in this life-changing pursuit myself some day. If the fifty-one-year-old Salopek can do it, I’m sure I can pull it off — that leaves me plenty of time to save money.
Read more about his journey, and see gorgeous photos, on the official website.
Although long overshadowed by the far more destructive Second World War, it was World War I (then known as the Great War) that first gave humanity a bloody taste of large-scale, industrialized warfare. Indeed, it was largely the unresolved conflicts of the First World War that gave rise to the second.
All that aside, the amount of death wrought by this so-called “War to End All Wars” is staggering, as the following chart from The Economist soberingly displays. I’ll let the numbers speak for themselves.
An entire generation was ground up in a senseless war that everyone thought would be over in no time. Every single one of those men was a distinct human being with his own identity, personality, dreams, fears, and loved ones. It’s hard to believe tens of millions more would join them just two decades or so later — and many more have since, albeit in far less visible conflicts.
Contrary to popular belief, the expression “The enemy of my enemy is my friend” (also known as enemy mine) is not an Arab proverb, nor does it have any origins in the Middle East. Instead it comes from the Indian philosopher and teacher Chanakya (also known as Kautilya or Vishnu Gupta), who was a royal advisor to the Mauryan dynasty in the fourth century BC.
Known as the “Indian Machiavelli”, he is considered a pioneer in political science and economics, making contributions to both areas long before they were formal fields of study. His seminal work was the Arthashastra, one of the first books in history to discuss statecraft, diplomacy, economic policy, ethics, and military strategy. The original wording of the phrase went as follows:
The king who is situated anywhere immediately on the circumference of the conqueror’s territory is termed the enemy.
The king who is likewise situated close to the enemy, but separated from the conqueror only by the enemy, is termed the friend (of the conqueror).
Chanakya also held some progressive views, advocating for the fair treatment of women and peasants, land reform, environmental protection, and disaster relief. His main advice on governing could be summed up in the following statement:
In the happiness of his subjects lies the king’s happiness, in their welfare his welfare. He shall not consider as good only that which pleases him but treat as beneficial to him whatever pleases his subjects.
Cracked, which does a great job of unloading fun facts in a humorous manner, has taken on some prominent misconceptions and myths regarding history’s greatest war. The one I’ll highlight is perhaps the most pervasive:
World War II wasn’t just a clever name. It was a global conflict that included epic acts of heroism by non-Americans like the storming of Madagascar, the Battle of Westerplatte, the Battle of Moscow, the Battle of Kursk, the epically badass Kokoda Track, the pilots of the Polish Underground State, the details of El Alamein or the HMS Bulldog.
However, there is one Zangief-sized elephant in the room that America loved to leave out of conversation until the end of the Cold War: the Soviet Union. The “Great Patriotic War” as they called it was the single largest military operation in history, and home to perhaps the biggest turning-point of the war: the Battle of Stalingrad.
Understand, the Russia versus Germany part of the war wasn’t just a little more important than the part the USA was involved in. It was “four times the scale” of the whole Western front, larger than all other phases of the war put together. The Soviet military suffered eight million soldiers dead, more than 20 freaking times the number of U.S. casualties. Sounds pretty brutal for a John Wayne movie? Try figuring in another 13.7 million dead civilians.
Of course, the U.S. nonetheless played a pivotal role in the conflict, which included providing the Soviet Union with vital supplies during the initial stages of Germany’s invasion. Americans also took the lead in the post-war recovery efforts. But as with anything else in history, there’s a lot of nuance and narrative that’s left out, especially when it comes from other or less savory perspectives. Just wait until you see the other myths in the article, particularly the one about Churchill.
Though he recently passed away, the esteemed Yale political scientist Juan Linz has left us with prescient and concerning observations about the nature of America’s current political problems. Linz specialized in comparative government, determining why some nations do better at democracy than others. He saw the contrast between stable long-term democracies and dysfunctional coup-ridden ones as being driven more by structural problems than by culture or economic conditions.
As an article in Slate observes, his insight is both informative and relevant, beginning with his views as to why presidential systems seem more flawed than others:
The problem, according to Linz, is right there in the title: too much reliance on presidents. In Linz’s telling, successful democracies are governed by prime ministers who have the support of a majority coalition in parliament. Sometimes, as in the British Commonwealth or Sweden or post-Franco Spain, these prime ministers are formally subordinate to a monarch. Other times, as in Germany or Israel or Ireland, there is a largely ceremonial, nonhereditary president who serves as head of state. But in either case, governing authority vests in a prime minister and a cabinet whose authority derives directly from majority support in parliament.
When such a prime minister loses his parliamentary majority, a crisis ensues. Either the parties in parliament must negotiate a new governing coalition and a new cabinet, or else a new election is held. If necessary, the new election will lead to a new parliament and a new coalition. These parliamentary systems are sometimes very stable (see the United Kingdom or Germany) and sometimes quite chaotic (see Israel or Italy), but in either case, persistent legislative disagreement leads directly to new voting.
In a presidential system, by contrast, the president and the congress are elected separately and yet must govern concurrently. If they disagree, they simply disagree. They can point fingers and wave poll results and stomp their feet and talk about “mandates,” but the fact remains that both parties to the dispute won office fair and square. As Linz wrote in his 1990 paper “The Perils of Presidentialism,” when conflict breaks out in such a system, “there is no democratic principle on the basis of which it can be resolved, and the mechanisms the constitution might provide are likely to prove too complicated and aridly legalistic to be of much force in the eyes of the electorate.” That’s when the military comes out of the barracks, to resolve the conflict on the basis of something—nationalism, security, pure force—other than democracy.
But what about the United States, the world’s first and oldest continuous democracy, which has retained a presidential system from the start? It would seem to poke holes in Linz’s analysis, and indeed he argued that America was something of an outlier for unique reasons:
The success of American democracy seemed to show that institutions were not the key. Old-fashioned Anglophone pluck and liberal values triumphed under both presidential and parliamentary systems. If something was going wrong south of the border, blame some aspect of Latin culture or economic development. But Linz always did have an answer to this objection. In the 1990 paper, he said that a full explanation of America’s success was complicated, but that “it is worth noting that the uniquely diffuse character of American political parties—which, ironically, exasperates many American political scientists and leads them to call for responsible, ideologically disciplined parties—has something to do with it.”
That was 23 years ago. Today, of course, we have ideologically disciplined parties that are “responsible” in the sense that they make a serious effort to deliver on their stated policy agendas. We also have a government shutdown, a looming debt ceiling breach, and a country in which regular order budgeting is an increasingly distant memory.
Indeed, it’s worth remembering that the Civil War was precipitated largely by the inability of political parties to compromise on the issue of slavery — it took an all-out war to resolve a problem that our institutions could not. This is partly because our representative system lends disproportionate weight to smaller states, which is also why the expansion of civil rights was such a difficult and piecemeal process (since a minority of Southern senators could filibuster any such effort).
While it’s very unlikely that the US will succumb to another civil war, let alone a military coup, this vulnerability to deadlock does seem likely to continue haunting us.
…Linz’s work raises the deeper question not of what will happen next week or next month, but next year or next decade. In a world with well-sorted parties and little ticket-splitting, the geography-driven differences in voting results for the House, Senate, and president are going to lead to persistent conflicts, in which both sides feel they have an electoral mandate to stand firm and there’s no systematic way to resolve the issue. That’s very bad news for America, and nobody knows how to stop it.
Indeed, our constitutional system was predicated on the idea that the virtue and integrity of our politicians would make effective governance possible, whatever the hurdles. The checks and balances written into our government were designed to be challenging, forcing representatives to be pragmatic and cooperative — or else. Thus it depended — and still depends — on the character of each individual public servant. As with any political system, without virtue, there is bad government. The problem is that our system depends far more on it than others, which is why we find ourselves in this troubling predicament.
But that’s just my interpretation. What are your thoughts?
There is a common misconception that the Ancient Romans created rooms called vomitoria for the sole purpose of vomiting food, which they would do regularly as part of a decadent binge and purge cycle.
In actuality, such a gluttonous practice was never common in Rome, and although vomitoria did exist, they were not used for actual vomiting. Rather, the vomitorium was an entranceway through which crowds entered and exited a stadium.
Behold the decadent vomitorium.
The word vomitorium comes from the Latin verb vomere, which means “to spew forth” — thus a vomitorium was designed to rapidly spew forth a large number of people. Given the loose application of the word, and the widespread stereotype of Rome as a center of moral decay and debauchery, it’s an understandable misconception.
Popular culture is full of references to the famous French emperor being short. Heck, his name is even used to describe a presumed psychological phenomenon in which men of short stature compensate for their size by being more aggressive and domineering. (How insecure do you have to be to conquer all of Europe just to make up for your height?)
But contrary to popular belief, Napoleon Bonaparte was not short — in fact, he was actually slightly taller than the average French male of his time.
After his death in 1821, the French emperor’s height was recorded as 5 feet 2 inches — in French units, which translates to roughly 5 feet 7 inches in the English equivalent. Many people overlooked the distinction, since there was no standardized international system of measurement yet. (Sure enough, it was the French who would come up with a universal metric system in 1799.)
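For the curious, the conversion is easy to check. The sketch below assumes the pre-metric French foot (the pied du roi) was about 0.3248 meters, with the French inch (pouce) one-twelfth of that; exact historical values vary slightly by source, which is why estimates land around 5 ft 6–7 in — well above the 5 ft 2 in that gets misread as English units.

```python
# Convert Napoleon's recorded height (5 French feet, 2 French inches)
# into English units, assuming 1 pied du roi = 0.3248 m.
PIED_M = 0.3248          # French foot in meters (approximate)
POUCE_M = PIED_M / 12    # French inch in meters

height_m = 5 * PIED_M + 2 * POUCE_M   # recorded height in meters
height_in = height_m / 0.0254         # meters -> English inches

feet, inches = divmod(height_in, 12)
print(f"{height_m:.2f} m ≈ {int(feet)} ft {inches:.0f} in (English)")
```

Run as written, this prints about 1.68 m, or just under 5 ft 7 in — average or slightly above for a Frenchman of his era.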
Other factors contributed to this widespread belief. It didn’t help that the rival English had an obvious bias against Napoleon, and thus caricatured him as short in comparison to their own mighty troops. He was often seen in the company of his Imperial Guard, who were themselves taller than average, as such elite troops tended to be. His nickname of “Le Petit Caporal” or “The Little Corporal” is also cited as evidence that he was short, but is now believed to have simply been a term of affection used by his troops.
Although Napoleon’s actual height still seems unremarkable by modern standards, the average male at his time was shorter than men nowadays. The audacious emperor had many reasons for changing the face of Europe — compensating for a diminutive stature was certainly not one of them.
As we all know, one of the most iconic features of a pirate is the eyepatch. This ubiquitous attribute is allegedly inspired by the Arab pirate Rahmah ibn Jabir al-Jalahimah, an 18th-century pirate who wore one after losing an eye in battle. He is considered to be one of the most ruthless, fearless, and successful pirates in the Persian Gulf, if not the world.
Rahmah was born in 1760 in Qurain (modern-day Kuwait) and began life as a mere horse dealer. Aspiring to greater and more daring things, he used the money he saved to buy his first ship and then set out with ten companions to begin a career of buccaneering. He was so successful that he quickly acquired a new craft: a 300-ton boat manned by 350 men. He would later have as many as 2,000 followers, many of whom were African slaves. His most prominent flagship was the Al-Manowar, whose name he derived from “man-of-war”, the British term for large and powerful warships.
Rahmah became a major political force in the region, forming alliances with many regional rulers and opposing the powerful Al-Khalifa family that ruled the strategic island of Bahrain. He formed an alliance with the first Saudi dynasty when it conquered Bahrain, thereafter founding the fort of Dammam on Saudi soil in 1809. But after the Saudis were expelled, the opportunistic Rahmah allied himself with the rulers of Muscat in 1816, joining them in their own failed invasion of Bahrain.
He died on his ship, Al-Ghatroushah, in 1826 during a sea battle against the vengeful Al-Khalifa. As the battle was being lost and his ship was being boarded, Rahmah, with his eight-year-old son by his side, lit the gunpowder kegs onboard, killing all of his men and the enemy raiders, preferring to die by his own hand than by the hands of Al-Khalifa.