The American Cities With The Most (And Fewest) L.G.B.T. People

The following chart comes from the New York Times, based on Gallup’s latest survey of where L.G.B.T. people live. (Click the image to make it larger.)

Areas With Largest and Smallest LGBT Populations

A summary of the results:

The Gallup analysis finds the largest concentrations in the West — and not just in the expected places like San Francisco and Portland, Ore. Among the nation’s 50 largest metropolitan areas, Denver and Salt Lake City are also in the top 10. How could Salt Lake be there, given its well-known social conservatism? It seems to be a kind of regional capital of gay life, attracting people from other parts of Utah and the Mormon West.

On the other hand, some of the East Coast places with famous gay neighborhoods, including in New York, Miami and Washington, have a smaller percentage of their population who identify as gay — roughly average for a big metropolitan area. The least gay urban areas are in the Midwest and South.

Significant as these differences are, the similarities are just as notable. Gay America, rather than being confined to a few places, spreads across every major region of the country. Nationwide, Gallup says, 3.6 percent of adults consider themselves gay, lesbian, bisexual or transgender. And even the parts of the country outside the 50 biggest metropolitan areas have a gay population (about 3 percent) not so different from some big metropolitan areas. It’s a reflection in part of increasing tolerance and of social connections made possible by the Internet.

Frank Newport, the editor in chief of Gallup, notes that the regional variation in sexual orientation and identity is much smaller than the variation in many other categories. The share of San Francisco’s population that’s gay is only two and a half times larger than the share outside major metro areas. The regional gaps in political attitudes, religion and ethnic makeup are often much wider.

“For a generation, they all remember the moment they walked through their first gay bar,” said Paul Boneberg, executive director of the G.L.B.T. Historical Society in San Francisco. “But now they come out for the first time online, and that changes, for some people, the need to leave.”

As with any such research, there are also some caveats to keep in mind:

Before this Gallup analysis, the most detailed portrait of gay demography was the Census Bureau estimates of same-sex couples, including an analysis by the Williams Institute at U.C.L.A. Those estimates and Gallup’s new data show broadly similar patterns: Salt Lake City ranks high on both, and San Jose ranks low, for instance. But couples are clearly an imperfect proxy for a total population, which makes these Gallup numbers the most detailed yet to be released.

Gallup previously released estimates for the country as a whole and for each state. The estimates are based on the survey question, “Do you, personally, identify as lesbian, gay, bisexual or transgender?”

As with any survey, the data comes with limitations. Respondents are asked to place themselves in a single category — L.G.B.T. or not — even though some people consider sexuality to be more of a spectrum. The data also does not distinguish between center cities and outlying areas. Manhattan most likely has a larger percentage of gay and lesbian residents than the New York region as a whole.

And the data is affected by the federal government’s definition of metropolitan areas. Earlier, we mentioned that Raleigh’s percentage is low in part because its area does not include Durham and Chapel Hill. Boston’s percentage may be higher because its metropolitan area is relatively small, with fewer outlying areas. On the whole, however, there is no clear relationship between a metropolitan area’s size and the share of its population that’s gay.

What are your thoughts?

The Greedy Hospitals That Drive Up Healthcare Costs

There is no shortage of culprits behind America's expensive yet, at best, average healthcare outcomes. But chief among them, despite receiving far less attention than insurers, are hospitals. As Slate reports:

The health sector employs more than a tenth of all U.S. workers, most of whom are working- and middle-class people who serve as human shields for those who profit most from America’s obscenely high medical prices and an epidemic of overtreatment. If you aim for the crooks responsible for bleeding us dry, you risk hitting the nurses, technicians, and orderlies they employ. This is why politicians are so quick to bash insurers while catering to the powerful hospital systems, which dictate terms to insurers and have mastered the art of gaming Medicare and Medicaid to their advantage. Whether you’re for Obamacare or against it, you can’t afford to ignore the fact that America’s hospitals have become predatory monopolies. We have to break them before they break us.

What do I mean by that? Last fall, Mark Warshawsky and Andrew Biggs made a striking observation: From 1999 to 2013, the cost to employers of an average family health policy increased from $4,200 to $12,000 per year. In an alternative universe in which employer premiums had remained flat, salaries would have been $7,800 higher, a life-changing difference for most low- and middle-income families. To protect these families, many people want the government to pick up a bigger share of our hospital bills. But this just shifts the burden from employers to taxpayers. The Congressional Budget Office expects federal health spending to almost double as a share of GDP between now and 2039. With the exception of interest on the debt, all other federal spending will shrink. What this means in practice is that high medical prices charged by hospitals will gobble up taxpayer dollars that might otherwise have gone to giving poor people more cash assistance, welfare-to-work programs, and Pell grants; fixing potholes; sending missions to Mars; and who knows what else.

When you survey the health systems of other rich countries, you’ll find some that rely a bit more on private insurance markets than ours (like Switzerland) and others that rely a bit more on centralized bureaucracies (like Britain), but what you won’t find is a country where hospitals dare to charge such obscenely high prices. Avik Roy, a senior fellow at the Manhattan Institute and a conservative health reform guru, has observed that although the average hospital stay in the world’s rich countries is $6,222, it costs $18,142 in the U.S. Guess what? Spending three times as much doesn’t appear to yield three times the benefit.

And while both private and public insurance schemes are far from flawless, their inefficiencies and improprieties are also, at least in part, driven by the power of hospitals:

When insurers have tried to play hardball with the hospitals that gouge them, as in the 1990s, when managed-care organizations kept rising healthcare costs in check for a few short years, hospitals pressured state legislatures to enact “selective contracting” and “any willing provider” laws that impeded MCOs from steering patients to facilities where they could negotiate good rates. Moreover, MCOs can’t do much if a local hospital buys up all of the nearby medical providers.

But wait a second. How is it that hospitals are also gouging Medicare? Medicare alone accounts for 20 percent of all national health expenditures, a number that, if anything, understates the extent of its influence. Shouldn’t Medicare be able to use its pricing power to get hospitals to play ball? Medicare offers standardized reimbursement rates for different services, which hospitals always insist are far too low. Yet for some routine medical procedures, the reimbursement rate is higher than the cost of performing the procedure (which, once you already have the equipment and the personnel, can be pretty low), meaning the hospital makes money off of the procedures. For other services, like giving a patient personal attention, the reimbursement rate is lower than the cost of providing the service, so this is where hospitals skimp. The unsurprising result is that we have a health system that is increasingly devoid of personal attention while at the same time generating an ever-higher volume of the medical procedures for which Medicare is willing to overcompensate.

As hospitals continue to merge and acquire competitors (including the less expensive office-based practices), this problem is likely to only get worse in the coming years. The solution? Well, here are two offered by a law professor cited in the Slate piece:

Our government can simply accept that the market power of hospitals will continue to increase while making more of an effort to force them to accept low reimbursement rates. This approach is certainly worth trying, yet it ignores the fact that because hospitals are big employers, they wield a great deal of political influence. Whenever bureaucrats try to tame hospitals, lawmakers ride to the rescue of the big medical providers.

The second path is to rely on antitrust enforcement to crack down on hospital mergers and acquisitions and, more importantly in the long run, to make it easier for new medical providers to enter the business and to compete with hospitals. Naturally, hospitals hate this kind of competition, particularly from specialized providers that focus exclusively on providing one or two medical services inexpensively. To the hospitals, these providers “cherry-pick” and “cannibalize” their most profitable business lines without ever having to take on the larger burdens of running hospitals. There’s some truth to these complaints, which is why governments should compensate hospitals directly for care they wish to subsidize. But we need smaller, more efficient competitors to keep the big hospitals in check and to drive down medical costs for society as a whole.

Curbing the power of the big hospitals isn’t a left-wing or a right-wing issue. Getting this right will make solving all of our health care woes much easier, regardless of where you fall on the wisdom of Obamacare. Let’s get to it.

Indeed, everyone should have an interest in reining in these oligopolistic and predatory practices, whether to create a freer and more cost-effective market for medical care, or to expand access to such care among the less wealthy. Granted, hospitals are only one of several factors, but judging from the data cited in this article, they are a major one.

Thoughts?

Carnivores of the World

It turns out that one country has famously carnivorous America beat: the small European nation of Luxembourg (which hosts many transients and expatriate workers from around the world, possibly inflating its per-person consumption figures).

The following graph, courtesy of The Economist, lists the countries where meat is most popular.

Carnivores of the World

It is interesting to see how some types of meat prevail in certain countries: Argentina, perhaps unsurprisingly, leads the way in beef consumption; people in Kuwait, Israel, and the Caribbean nation of St. Lucia love poultry; and Austrians, Danes, and Spaniards favor pork. While developing countries like China, India, and Brazil are driving the overall demand for meat, people in the developed world eat far more per person.

Here is how meat consumption has changed over the years, according to The Economist:

Cow (beef and veal) was top of the menu in the early 1960s, accounting for 40% of meat consumption, but by 2007 its share had fallen to 23%. Pig is now the animal of choice, with around 99m tonnes consumed. Meanwhile advances in battery farming and health-related changes in Western diets have helped propel poultry from 12% to 31% of the global total.

One wonders how much longer we can sustain such increasingly meat-dominated diets. Raising livestock is a drain on finite resources like land, water, and grain (which could all be put to better, human-centered use). It also produces a lot of pollution, including the kind that contributes to climate change. While China and other rising countries are routinely blamed for driving up demand — which is indeed the case — it is still the richer world that consumes far more resources per person.

How Screens Negatively Impact Health

Thanks to the boom in mobile technology — particularly smartphones and tablets — screens have become ubiquitous in modern society. It is almost impossible for most people to avoid exposing their eyes to some sort of screen for hours at a time, whether texting on a phone, bingeing shows and movies on Netflix, or playing video games.

In fact, the introduction of electricity is what first began the disruption of 3 billion years of cyclical sunlight governing the functions of life. What has been the effect of increasingly undermining this cycle, which humans have long been shaped by?

Wired explores some of the troubling research emerging on whether and how ever-greater light exposure is negatively impacting us:

Researchers now know that increased nighttime light exposure tracks with increased rates of breast cancer, obesity and depression. Correlation isn’t causation, of course, and it’s easy to imagine all the ways researchers might mistake those findings. The easy availability of electric lighting almost certainly tracks with various disease-causing factors: bad diets, sedentary lifestyles, exposure to the array of chemicals that come along with modernity. Oil refineries and aluminum smelters, to be hyperbolic, also blaze with light at night.

Yet biology at least supports some of the correlations. The circadian system synchronizes physiological function—from digestion to body temperature, cell repair and immune system activity—with a 24-hour cycle of light and dark. Even photosynthetic bacteria thought to resemble Earth’s earliest life forms have circadian rhythms. Despite its ubiquity, though, scientists discovered only in the last decade what triggers circadian activity in mammals: specialized cells in the retina, the light-sensing part of the eye, rather than conveying visual detail from eye to brain, simply signal the presence or absence of light. Activity in these cells sets off a reaction that calibrates clocks in every cell and tissue in a body. Now, these cells are especially sensitive to blue wavelengths—like those in a daytime sky.

But artificial lights, particularly LCDs, some LEDs, and fluorescent bulbs, also favor the blue side of the spectrum. So even a brief exposure to dim artificial light can trick a night-subdued circadian system into behaving as though day has arrived. Circadian disruption in turn produces a wealth of downstream effects, including dysregulation of key hormones. “Circadian rhythm is being tied to so many important functions”, says Joseph Takahashi, a neurobiologist at the University of Texas Southwestern. “We’re just beginning to discover all the molecular pathways that this gene network regulates. It’s not just the sleep-wake cycle. There are system-wide, drastic changes”. His lab has found that tweaking a key circadian clock gene in mice gives them diabetes. And a tour-de-force 2009 study put human volunteers on a 28-hour day-night cycle, then measured what happened to their endocrine, metabolic and cardiovascular systems.

As the article later notes, it will take a lot more research to confirm a causal link between disrupting the circadian rhythm and suffering a range of mental and physical problems. Anecdotal evidence suggests that in the long term, for many (though not all) people, too much exposure to screen light can cause problems. But given the many other features of modern society that are just as culpable — long hours of work, constant overstimulation, sedentary living — identifying which aspects of the 21st-century lifestyle are responsible can be difficult, let alone resolving them.

How Secular Is Your City?

The religiously unaffiliated — an identity that broadly encompasses everyone from strong atheists and agnostics, to New Agers, deists, and “unchurched” Christians — make up almost a quarter of the U.S. population (22 percent to be exact). Unsurprisingly, some regions, states, and cities are more likely to be irreligious than others. The Public Religion Research Institute (PRRI) lists the major U.S. cities that have the most (and the fewest) people without formal religion.

Note that the data come from the results of over 50,000 interviews across these metropolitan areas. Perhaps it is little surprise that the northeastern and western parts of the country are where most of the least religious cities are located; these regions as a whole tend to be pretty secular, especially when compared to the “Bible Belt” of the south (where the least secular cities are situated).

With 42 percent of its residents identifying as religiously unaffiliated, Portland occupies a space all its own. “Portlandia”, an urban mecca for eco-conscious free spirits, has substantially more unaffiliated residents than the next three most religiously unaffiliated cities, Seattle (33 percent), San Francisco (33 percent) and Denver (32 percent).

The least unaffiliated city in the U.S.? Nashville, with only 15 percent of its residents identifying as religiously unaffiliated. A plurality (38 percent) of Nashville is white evangelical Protestant.

Granted, by the standards of the historically devout South, 15-18 percent nonreligious is pretty high. A large part of this may have to do with the migration of people from the less religious Northeast, a trend that began in the 1960s and ’70s and continues to this day. Aside from the secularizing effect of these transplants, the results may also reflect the tendency of cities in general to be less religious than rural or smaller urban areas.

Given the overall growth in “Nones” — those who claim no religious affiliation in Census surveys — it is likely that the percentage of irreligious people in cities across the country will continue to grow. Again, this hardly reflects the growth of atheists or agnostics per se, just growth in the number of people unwilling to identify or associate with any formal religious label or institution.

As for my hometown and current residence of Miami, I guess I am not too surprised that we are just around the national average. The city has a large youth population buttressed by many international and northern migrants. While Hispanics tend to be fairly religious, their children and grandchildren — like younger generations of most other demographic groups — are often less so.

The Problem With a Terrifying and Loving God

One of the first things that caused my religious faith to waver was the paradoxical way in which the Christian God was conveyed (at least by my particular Catholic church): infinitely loving yet presiding over a cosmic system whereby sinners and nonbelievers suffer for eternity without pardon (a punishment that is literally unsurpassable in its harshness).

Now of course, there were always caveats, namely that God does not want anyone to end up in hell (despite first creating and still maintaining such a system), hence Jesus, the work of the church and its missionaries, etc.

Setting aside the ethical and theological scruples, I also took issue (and still do) with the way that Christians themselves use this contradictory nature as some sort of carrot and stick to cajole their opponents (be they nonbelievers, adherents of other religions, or even more liberal Christians).

Captain Cassidy over at Patheos captures this approach perfectly:

When a Christian says something like “You should convert because Jesus loved you so much he died for you, but if you don’t then you’ll suffer unspeakable torment forever and ever and ever”, I’m left wondering just what is being said here. Am I supposed to convert out of awe for this supposed act of love? Or am I supposed to convert out of sheer terror and a desire to avoid torment? Because I honestly can’t tell which tactic the Christian is going for. It doesn’t seem loving to torment people.

And the really bad news for Christian zealots is, you can’t really mix and match when it comes to love and terror. I’m not sure it’s even possible to love that which terrorizes us, or (to be more accurate) that which is used to terrorize us. If you want to go with the lovey-dovey stuff, then terror destroys it; if you go with terror, then it’s hard to squeak about lovey-dovey stuff after threatening someone with lurid torture and pain. That so many Christians seem perfectly content to do exactly this mincing dance seems downright grotesque to me. If they described a real person that way, as a man who would physically hurt me if I refused to do what he wanted but who loved me and wanted my love in return, then I’d tell them to stuff it and keep their abusive asshole of a buddy far away from me. The split-second that violence enters the equation, love leaves it–unless of course someone has internalized violence so effectively that it no longer disqualifies a being from slavish devotion.

When Ken Ham ominously threatens people with “God’s judgment” and says, regarding the possible destruction of Earth by a meteor strike, that “unbelievers should be afraid of Jesus Christ’s judgment instead”, it’s hard not to wonder if he’s saying that people should convert because of their terror of this “judgment”–in other words, out of fear of going to Hell. But which is it? Is his god loving, or is he a sociopathic monster? Which gear is he picking here?

Now obviously, many Christians reject both this tactic and its theological underpinnings. Many religious people are genuinely loving and either downplay or outright repudiate the terrifying nature of God.

But in the United States especially, many people subscribe to this notion and utilize it in their preaching, proselytizing, or apologetics. It represents a cynical and totalitarian mentality that seems less concerned with others’ salvation and more focused on manipulating people: to use my earlier analogy, if the carrot of God’s love does not work, then the stick of His wrath just might.

Now that I’m out of Christianity and have been for a while, I can see these fearmongering, terroristic tactics for what they are: attempts to strong-arm compliance and force obedience. If you want to see what a Christian really thinks is persuasive, wait to see what that person’s big guns look like. Look for what follows the “but” in their proselytizing. If you let people do it, they’ll tell you exactly what’s really important to them. “He loves you, but if you don’t obey him then you’ll suffer mightily” is the message of way too many Christians.

Violence is the last refuge of the incompetent, as Isaac Asimov put it long ago. Threats are what bullies use when they can’t get their way any other way. When someone can’t win by reason or logic or facts, and that person lacks a moral compass and has no empathy or compassion for others, then such a person will use force to try to win by any means possible. If Christians actually had a good reason to fear the threats they make, they’d already have given us the goods.

Once you’ve identified the threat being made, then you can ask for evidence that it’s a threat you really need to fear. If Ken Ham really thinks that his god’s judgment would be scarier and worse for humanity than a meteor hitting the Earth, but can’t come up with anything solid and credible to explain why his threat is something anybody needs to fear, then I’m safe in dismissing what he blusters as the bombast of a bully angry that he can’t get his way any other way. And I call shenanigans on him claiming that Christians aren’t scared at all of catastrophes; I was a Christian myself for many years and can absolutely tell him that why yes, a great many Christians are downright terrified of the end of the world. He’s talking out of his ass, but what else is new? His followers will eat it up with a spoon and parrot it, many hoping that their own fears will be allayed if they do.

For me, this strain of Christianity says more about the psychology and personality of its adherents than about the religion as a whole (though insofar as Christian doctrine gives fuel to such a common approach, it definitely has its problems).

Just as I have met many friendly and compassionate people who subscribe to a more friendly and compassionate form of Christianity (which in some forms seems more Deistic or New Agey than anything), so too have I met less than kind people, often with an aggressive and domineering streak, who apply their Christian faith in just that way.

Quite a few non-believers and even many Christians have already abandoned threats and the very idea of Hell as incompatible with the idea of a loving god. But to those Christians who use their religion as a way of expressing aggression and dominance, those threats are their primary tools, and they’ve got all kinds of rationalizations already made up in their minds about why they can’t possibly stop threatening people. Phrases like “for their own good” figure prominently here.

The funny thing is that all we’d need is one single credible piece of evidence supporting their threats. Just one. That’s all. But they can’t do that. Instead, they are content to keep issuing threats. And if someone vulnerable happens to fall for the threats and converts on the basis of them, then these Christian bullies will feel 100% justified in continuing to use threats and bullying to get their way. But even if the threats don’t work, they’ll keep using them because threats are what they, personally, think are compelling–as I’ve mentioned before, these threats overshadow even the very best intentions for many Christians.

If the fear of God’s wrath and punishment is the strongest incentive you have, or think others should have, for believing in your religion, you need to reevaluate the basis and sincerity of your faith. Most of these individuals would never accept fear as a legitimate reason to trust or follow political leaders, or any human being. Does God’s divine nature and/or status as our alleged Creator make Him immune to such reasonable considerations? Are we supposed to cower in fear of a loving, fatherly creator and use that terror — in some bizarre combination with love and awe — as a basis to believe in Him? It sounds like an abusive relationship more than anything. How can genuine love be compelled by the threat of violence of the worst kind imaginable?

What are your thoughts?

America’s Troubling Firebombing of Japan

Prior to the better-known atomic bombings of Hiroshima and Nagasaki (which have also been subject to controversy and ethical debate), the United States carried out a series of firebombing raids against Japanese cities, the worst of which claimed more lives in a single night — over 100,000 civilians, mostly women, children, and the elderly — than the more infamous atomic strikes that followed months later.

Jacobin examines the various problems with the campaign, on both a strategic and an ethical level (e.g., there were few to no military or economic targets, and virulent anti-Japanese racism may have motivated the attacks).

In January 1945 — two days before Franklin Roosevelt was to meet with British Prime Minister Winston Churchill and Soviet leader Joseph Stalin in Yalta — the Japanese were offering surrender terms almost identical to what was accepted by the Americans on the USS Missouri in Tokyo Bay on September 2, 1945.

The Japanese population was famished, the country’s war machine was out of gas, and the government had capitulated. The Americans were unmoved. The firebombing and the nuclear attacks were heartlessly carried out. If anyone is guilty of disregarding the “context” of the firebombing of Tokyo, it’s the sycophantic and biased American historians who deride these critical facts.

What little criticism that exists of the firebombing is attacked for failing to put the bombing in proper context and not providing alternate solutions for ending the war. These attacks are also riddled with “they did it too” justifications.

World War II was carried out with brutality on all fronts. The Japanese military murdered nearly six million Chinese, Korean, and Filipino civilians by the end of it. However, to argue that Japanese civilians deserved to die — that children deserved to die — at the hands of the U.S. military because their government killed civilians in other Asian countries is an indefensible position, in any moral or ethical framework.

One can see parallels with the equally controversial Allied bombings of Dresden, which killed 22,000-25,000 civilians for little strategic merit.

What are your thoughts on these largely undiscussed (at least in popular discourse) actions? I recommend reading the whole article to get a wider picture of the positions for and against this decision, and whether the usual justifications have any merit.

What Does It Take To Be Middle-Class In Your U.S. City?

From NPR’s Planet Money column comes a useful guide to seeing where you stand in the socioeconomic spectrum of your city. It is no secret that the cost of living varies wildly from region to region; even cities within the same state can have huge disparities in what constitutes a livable or comfortable income. (Click through to the original article if you need a larger version.)

Middle-Class Around The World

The article notes some details about the data and methodology:

We used the family income data from the 2013 American Community Survey. This counts only families, which the government defines as households with two or more people related by birth, marriage or adoption.

The graph focuses on families living in the country’s 30 most populous cities. For the most part, it doesn’t include those living in suburbs and rural areas. That’s why the national median is higher than the median incomes in almost all of the cities on the graph.

One final note: In the area around San Jose (which includes the heart of Silicon Valley), 13 percent of families have annual incomes of $250,000 or more.

So where does your city stand in this chart?


Why I treat my 7-year-old like a person instead of a tragedy waiting to happen

Eupraxsophy:

An interesting critique of prevailing 21st-century attitudes toward children in the U.S. and other Western countries. Thoughts?

Originally posted on Quartz:

At age 10, my grandfather was herding cattle 20 miles through the mountains of rural Utah. It might take him all day, but he did it, and he did it by himself.

I was delivering newspapers through my suburban neighborhood when I was 8 years old. Sometimes in the dark of morning, sometimes with my older sister, but mostly on my own.

My oldest son—a 7-year-old kid growing up in Brooklyn—is hardly even allowed to go outside by himself. Not yet. And not because I don’t trust him or think he couldn’t handle himself. It’s simply because I don’t trust the neighbors not to call Child Protective Services (CPS) on me.

You could say that Brooklyn in 2014 is such a different time and place from rural Utah in the 1930s, and you would, in some ways, be right. Urban jungles pose different dangers than rural mountains and even suburban…


Today’s Google Doodle Honors Emmy Noether

Google’s iconic doodles have a great track record of highlighting important but often obscure figures in science, social justice, and other human endeavors. Today’s colorful doodle casts a much-needed spotlight on Emmy Noether, an influential German mathematician who made groundbreaking contributions to abstract algebra and theoretical physics.

Some of the greatest minds of the time, including Albert Einstein himself, owed a debt of gratitude to her pioneering work. As the Washington Post notes:

“In the judgment of the most competent living mathematicians”, penned [Einstein], “Fräulein Noether was the most significant creative mathematical genius thus far produced since the higher education of women began.”

After a lifetime of being discouraged and disallowed, underpaid and unpaid, doubted and ousted, Emmy Noether had reached the pinnacle of peer respect among her fellow giants of mathematical science.

“In the realm of algebra, in which the most gifted mathematicians have been busy for centuries”, Einstein continued in his letter, “she discovered methods which have proved of enormous importance in the development of the present-day younger generation of mathematicians”.

Read more about her delightful doodle, as well as the accomplishments it highlights, here.