Over the years, sociologists and historians have devised various means of measuring the progress of human civilization. You might look at how the average diet has improved. Or take a look at general health and life expectancy. Measure the decreasing size of circuit boards, or take note of the amount of scientific information produced in a given year. Compare the godlike abs of today’s superhero actors to their flabby forerunners.
A less frequently used metric is the ease with which you may stick your head in an active volcano.
As it happens, you can do that right now. Live webcam footage of the mouth of Icelandic volcano Bardarbunga brings hell-on-Earth to your iPad.
What you’ll see, as you wipe Cheerio-flavored milk from your lip, is the glowing inferno of Earth’s molten belly spewing forth into the very air we mortals breathe. AKA, you will see the march of human progress. To our forefathers, such a view likely meant either imminent death or hallucination.
Now, it is mild entertainment to fill that small gap between re-checking your Gmail and drafting fantasy football players.
But to linger a bit more, here is what is actually happening in that haze of thick smoke and redly glowing globs of light. Bardarbunga is Iceland’s second-tallest mountain, a volcano that reaches more than 6,500 feet above sea level. After several weeks of seismic activity (read: earthquakes), the volcano began erupting this week, spewing red lava some 160 feet (a good half football field) into the sky.
If you’re looking at liquid lava, that means it’s 1,300 to 2,200 degrees Fahrenheit as it first erupts, otherwise known as extremely damn hot. That heat comes from geothermal energy (roughly 80% from radioactive decay, 20% left over from Earth’s original formation), and it makes lava hot enough to ignite most human possessions it touches, if it does not bury them first.
Bardarbunga is a “stratovolcano,” meaning its profile comes from the progressive buildup of layers of cooled lava. That puts it in the same class as Vesuvius, the volcano that buried Pompeii — a city famously known for not being able to peer at the mouth of an active volcano in its pajamas. Think on those mummified losers, and marvel at how far we’ve come.
When I think about “religion and politics” in the United States, my instinct is to oversimplify: I imagine a be-suited Evangelical Republican, praising his God by voting for small government.
You may do the same. This is an unfortunate instinct, as things are rarely that simple. And, ironically, it plays into Republican strategy. Conservatives have waged a very successful campaign to tie being Christian to being Republican.
Millions of individual cases violate this “rule” — enough, in fact, that it should probably not really count as a rule. There are plenty of Christian Democrats. There are plenty of Christian progressives. And, not to be lost in all of this, there are plenty of “religious” people in the United States who are not Christian. And they cover a range of political identities.
Still, conservatives have succeeded in linking religion in the U.S. culturally with Christianity, and Christianity with Republicanism. That the association is even stronger among those who disagree with both philosophies is a testament to the effort’s success. More importantly, I’m sure it weighs heavily on those who subscribe to the teachings of Jesus and His Merry Men. A good Christian is a good Republican. This is in the culture. Even their opponents (pricks like me) may sling this as a stereotype and/or insult.
So it must be true.
Perhaps this has lessened a bit since what I remember as its heyday, when George W. Bush was the Evangelical in the Oval Office. I know, anecdotally, of people with definite Christian beliefs who were put off by the war on terror and even the tax cuts for the rich. “The Christian Left” Facebook group boasts over 180,000 “likers,” and posts things like this:
“If the USA can’t afford to provide basic medical care, feed the poor, protect the environment, maintain our infrastructure, or teach our children anymore, then what exactly is our bloated military budget defending?”
Good question, right? But, anyway, that group states as its goal, “To follow Jesus by taking actions on behalf of the oppressed, the sick, the hungry, the poor, the incarcerated…” and other greatest hits of alleged Christian concern.
The group also argues vociferously that it exists — i.e., that progressive Christians are real, live people. “We can’t let the right-wing dominate Christianity like they do. They’ve twisted it into something that has nothing to do with Christ,” they say in this post. This is how successful the linkage of conservatism and Christianity has been: to be a progressive follower of Christ, you need to work hard to convince people that you are real.
But here’s a more nuanced look at religion and the U.S. ballot box: Tobin Grant of the Religion News Service mapped voting and religious persuasion using Pew data. He looked at how religious groups voted along two axes: 1) Government size (big, with many services vs. small, with few services) and 2) Morality (government that enforces morality vs. one that does not).
These are interesting choices in their own right, but they also seem to break out along the traditional Democrat/Left vs. Republican/Right divide in U.S. politics. Generally speaking, the Dems like big government and the attendant social programs, while the GOP favors a smaller government that enforces, for example, “traditional” definitions of marriage.
So, unless every Christian denomination appeared in the upper right corner (smaller government, greater protection of morality), the popular association of Republicanism and Christianity would falter. And of course it faltered. Evangelicals, unsurprisingly, are up there. But check out Catholics: pretty much a circle around the center of the graph. One of the biggest denominations in the country, therefore, comes in all flavors: traditionally Democrat to traditionally Republican — and an equal number of adherents who combine beliefs across party lines.
That may be, to me, the most interesting part of this graph: It not only breaks apart the Republican-Christian identification, it also cuts across the two-party continuum. Check out the Anglicans and Presbyterians, who want a smaller government that also stays out of morality. See the Baptists, who want more morality, but also more services. And, of course, the very diverse Catholics. Not to mention the non-Christian groups: Buddhists and Atheists like governments that offer services and stay out of morality (ok, so they’re likely Democrats/Progressives). Hindus, however, tend toward more morality paired with greater services. Neither pure Democrat nor pure Republican ideology would serve their needs.
It’s a big mash, in other words. Religious identity does seem to be associated with politics, but in more-complicated ways than the popular prejudice would suggest.
ThinkProgress links this at least partially to economics. Churches with poorer flocks generally like more government services. Catholics include a relatively even distribution of economic groups, so those folks cluster around the center.
But there are almost as many economic outliers. Hindus tend to make good bank, but they vote for services. Many Evangelicals make modest incomes, and often rely on social services themselves, but oppose big government. So if religious groups do cluster into clear political quadrants, money doesn’t explain why — not entirely. Neither does political party. Faith is part of a complicated network of identities — economics, race, ethnicity, region, immigration history, and I’m sure many more — that could affect political ideology.
But there doesn’t seem to be much inherent in any particular religion’s teachings that leads people to choose a particular political identity. As ThinkProgress writes:
“Regarding the two issues discussed above, the data hints that a voter’s religious affiliation is a strong indicator of their political beliefs, but it’s not totally clear whether religious teachings are the main force shaping those political beliefs. A longer analysis of history, theology, and actual voting patterns of parishioners would be required to get a more accurate picture of what’s going on here.”
Economics is part of the greater identity matrix that shapes political beliefs. So is religion. And so are the ways that political parties themselves attempt to define your religion for you.
Assuming the United States adheres to international conventions it has signed (not always the case), we can’t use tear gas in war. The Chemical Weapons Convention, which went into force in 1997, banned the substance’s use in warfare. But we’re cool to use it on our own citizens, as this quite effective meme from OurTime.org has pointed out.
That little shareable quote is effective because it immediately raises questions. First of all, it raises the question of “What the fuck?” Follow-up questions include: “Wait, is that true?” And “How does that work?” There are two main points to look at here: What does the treaty say, and how “bad” is tear gas? In other words — is it banned in warfare? SHOULD it be banned in warfare and/or anywhere else?
Trick or Treaty?
First, the fun stuff: treaty stipulations!! *hysterical cheering* Is the meme right that the United States has pledged not to use tear gas in warfare? Politifact checked this, and ruled that it “is close to being accurate.” But close only counts in horseshoes and chemical warfare, so how close are we talking? Basically, yes, the Chemical Weapons Convention, or CWC, broadly bans “the development, production, acquisition, stockpiling, retention, transfer or use of chemical weapons by States Parties.” And the convention defines tear gas as a chemical weapon. Specifically, tear gas is included under the umbrella of “riot control agents” that cause sensory irritation and other unpleasant things.
The meme is a little iffy on the year (the treaty went into force in 1997, and was only drafted in 1993), but is otherwise correct. Politifact dings them for eliding some of the context, however. The treaty makes a special provision for using tear gas as domestic riot control. Politifact says:
“[The meme] tries to leverage the Chemical Weapons Convention’s decision to ban tear gas as evidence of why the technique should be illegal for policing, yet that very same convention explicitly allows its use for domestic law enforcement purposes.”
Ok, but that depends on what you mean by “should.” Should as in, “mandated by international law”? Then, no. The Ferguson police are not explicitly violating a binding treaty. Should as in, “the right thing to do”? The meme makes a stronger case on that front. I think the point with this meme was that the banning of tear gas in warfare implies that this is a terrible substance to use on people. So it is also terrible for police to use it. Especially terrible, actually, since these are their fellow citizens.
Here’s how OurTime co-founder Jarrett Moreno characterized the motivation behind the meme when challenged by PolitiFact: “The focus of our post was raising an ethical and moral question: If we can’t use tear gas on our enemies, why is it acceptable to use on our own citizens?”
Yeah, I think that’s a point the meme actually makes pretty clearly. That the convention makes an exception for use by police forces is interesting — and a bit troubling. As Politifact found during its fact-check, it’s a bit odd for a treaty to make such a domestic-use exception. But the meme’s core point still stands: An international treaty has declared this stuff off-limits for war. You can do a lot of bad stuff in war. You can, to name a few, fire machine guns at people and drop bombs. So, tear gas must be pretty bad. And police are using it against U.S. citizens angry because one of their own was executed.
How awful is awful?
So, I think the meme is effective and honest in what it is trying to do. It effectively suggests that tear gas is a terrible thing. But here’s where my second question comes in: Is tear gas as awful as its inclusion in the CWC ban implies? Is it, perhaps, grouped among far-worse agents as a sort of overreach or excessive caution? Does it stand beside sarin gas in the “chemical weapons” lineup the same way that a wiffle bat and a Tommy gun are both weapons?
Well, sort of. Sarin gas will kill you. Tear gas, in most cases, will not. Its use is not without casualties. Some people “controlled” with high levels of tear gas have suffered heart failure and death. At least one person died because the exploding canister hit him in the head.
But tear gas intends to make you feel unpleasant. Sarin intends to make you dead. There is a huge difference there, and that at least partially explains the treaty’s two-faced approach to tear gas. Riot control gas should be kept away from the battlefield, in part, because it could be mistaken for something more deadly. In other words, tear gas is dangerous because it looks like actually dangerous stuff. Politifact quotes political scientist Richard Price: “Part of the thinking is that soldiers in the field don’t have the ability to readily distinguish in the heat of battle if a gas being used is tear gas or something more lethal.”
Signers of the CWC treaty, however, argued that tear gas is crucial for riot control. Once a riot starts, few things are as effective in stopping it without casualties, these parties said. And, thus, the bifurcated mandate was negotiated.
Let’s Get Gassy
All this sounds like minimizing. So, let’s finally answer the question: How bad is tear gas? No, it won’t (in most cases) kill you, and is (in almost every case) not intended to do so. But it is not benign. “Unpleasant” is a sanitizing word, so let’s actually imagine our sensitive eyeballs and nerves invited to a tear-gas party:
If you get tear-gassed, it means you got hit with one of three chemicals. One of those is pepper spray, of the kind used to casually Weed-Be-Gone some of the Occupy protestors. The others are Mace (chloroacetophenone, or CN) and CS (chlorobenzylidenemalononitrile). The Ferguson cops are probably using CS.
Both CS and CN work by irritating mucous membranes. These are the awesome slimy things that let your eyelids slide over your sight-orbs and keep sex from turning into dry, joyless friction (ideally). This means they give you that burny feeling in your eyes, mouth, nose and lungs. Eyes will burn and tear up. The gas makes it hard to breathe and can give you chest pains. If you get really super “controlled,” stuff may come out of both ends, as they say. And, I’m not sure about this, but given that your membranes will be burning, I imagine that this will be some painful barf/squirts.
Here, a (self-alleged) soldier on Yahoo! Answers says that, “It sucks. Your eyes start running and it feels like you’re breathing in fire.” Some have said that the sensation is like drowning. Your body produces mucus, filling up your airways with fluid. That’s why it feels like asphyxiation.
Yes: unpleasant. In another context, we’ve debated whether “drowning sensations” qualify as torture. Remember waterboarding? The international community was pretty clear on that: Yes, it’s torture. I’m not claiming that getting tear gassed is the same as getting George W. Bushed. But, as with the battlefield ban, the association of tear gas with a bigger, badder cousin does point out its own awfulness. It’s not sarin. It’s not torture. But it’s on the continuum.
If nothing else, let’s use the unsanitized words: It’s a “crowd-management agent that causes unpleasant sensations,” yes. But it’s also a “chemical weapon that makes you feel like you’re drowning.” Just because it’s legal to use it against civilians doesn’t mean it’s benign — or that it deserves only benign descriptions when spoken of in that context.
In the year Two-Thousand-and-Whatever-Year-You-Are-Reading-This-In, Dr. Peter MacMuffin — mad scientist extraordinaire, super fan, ComicCon never-misser, and fully funded emeritus professor at the University of Wisconsin-Madison — realized the dream of perpetual, real-life fan fiction: He created the Fantasy Drive. And he just about ruined his pants when he realized what he’d done. With this device, Dr. MacMuffin could travel anywhere the minds of geeks and nerds had dreamed. This is his first adventure.
Dr. Peter MacMuffin created a Fantasy Drive, and stepped through it into the past.
Not the real past, mind you, but a fantasy version. Brooklyn: 1940. A street where a young Steve Rogers was getting bumrushed in an alley. Peter MacMuffin knew exactly where to find Steve Rogers, you see, because that is how the Fantasy Drive worked. It took you where you wanted to go.
And this is where Dr. Peter MacMuffin wanted to go first. To save, meet, and become super-best-buds with the future Captain America. They would drink old timey beers together. They would catch a Dodgers game. Dr. Peter MacMuffin would probably get Steve Rogers all kinds of laid.
Here was the alley where Steve Rogers — at this point, still a scrawny, 90-pound, hilariously rat-faced little fink of a guy — would vainly try to defend himself against some ’40s neighborhood toughs. This was it! This was the alley. Peter MacMuffin recognized it from the movie stills and the comics he had pored over. The Fantasy Drive had worked. It had worked! He was in Earth-199999, in December of 1940, on the corner of Hicks St. and Leaman Place, Brooklyn, New York, United States. Across the seas, the war against the Nazis and Emperor Hirohito raged. And, good Lord, Dr. MacMuffin thought — the Red Skull. The Red Skull himself was creating superweapons, in the flesh. (And bone. Red bone. Boner. Dr. MacMuffin had a boner.)
Peter MacMuffin flipped the collar of his long, thick trench coat (he had dressed himself in the style of the time before stepping through the Fantasy Drive, naturally), and strode down the alley. He could hear their voices already, harsh and Brooklyn-y and careening around within the stone and brick walls of the narrow passageway.
“Myeah, stay down, Rogers,” one of the toughs said. “Myeah.”
But puny little Steve Rogers would not stay down. Bullshit, he would fucking stay down! This was the future Captain America! Peter MacMuffin thought. Like fuck he’d stay down! The little guy staggered up, grabbing a trash can lid. (Like in the movie! Peter MacMuffin thought.) “I can do this all day,” Rogers said. (Like in the movie! Peter MacMuffin thought.)
Dr. MacMuffin had come just in time. A second later, and Bucky (movie-Bucky, mind you) would have swaggered in and saved the day. All very nice and good, great, yeah, we all loved it, but this was Dr. MacMuffin’s show. Fuck some Bucky shit.
Peter MacMuffin threw back the tails of his trench coat and stood grandly, hands on hips. “Ahoy there, young neighborhood toughs,” he intoned. “Unhand Steve Rogers.”
All eyes turned to look. Scrawny Steve Rogers let the trash can shield (FORESHADOWING!) descend on his scrawny arm. The two neighborhood toughs turned slowly around, ready to spit or punch or yell, or whatever the situation required. “Who the gosh-darn are you?” one of them said.
“Myeah!?” the other one said.
“Dr. Peter MacMuffin,” Dr. Peter MacMuffin said. “Remember that name.”
As the two Brooklyn-y neighborhood toughs were kicking the shit out of him, Dr. Peter MacMuffin remembered that he did not have any fighting skills or any significant physical strength or any real plan beyond showing up and being at least bigger than puny, pre-Captain America Steve Rogers.
Here was the problem in his adventure.
In the end, Bucky (movie-Bucky), came and saved the day again (or still, or…whatever). Only this time, he saved the pretty goddamn-awful dinged up Dr. Peter MacMuffin, while puny Steve Rogers watched from the side, trash-can shield still in hand, wondering just who the hell this dude was.
On his way to the hospital in the neato 1940s ambulance, Dr. Peter MacMuffin realized he would need a different way to get close to Steve Rogers. Maybe he should learn some fighting skills? Step back through the Fantasy Drive and bulk up in the real world for a while before teleporting back for another go?
But that seemed like a lot of work. And, like, physical work. He might have to do warm-up stretches. No, Dr. Peter MacMuffin thought, as he faded in and out of unconsciousness due to the blunt-force trauma he had just endured: he would use his kick-ass science mind to do this right. He’d made the Fantasy Drive. He could make something for Captain Steve Rogers, too.
Of course. That was it. He’d goose the Super Soldier Serum.
“I’ll goose the Super Soldier Serum!” he shouted. But the EMTs did not understand what he was talking about. Because they did not know what a Super Soldier Serum was, and because Dr. Peter MacMuffin’s lips were nearly swollen shut and he’d lost five teeth. So, he basically talked like a man eating 15 marshmallows.
But in his mind, it was triumphant.
But the EMTs put him down as potentially mentally disabled.
Tune in next time, when Dr. Peter MacMuffin returns for ‘Fantasy Drive: Peter MacMuffin Gooses Captain America.’ Same Bat-place. And you can read it at whatever time is most convenient for you.
Yes, I know — the United States owes its balls to China. But I don’t mean that debt.
Yes, I know — your college education turned you into a lifelong indentured servant to the bank. But I don’t mean that debt, either.
Yes, I know — predatory loans have hung mortgage-albatrosses around the necks of the un-bailed-out. But I don’t mean that debt either.
We’re in ecological debt, too. This is much more unfortunate, because here we’re not just dealing with numbers sliding around on some investment banker’s screen. This is not the fiction of currency, in other words — it’s real-world natural resources. We’re using the Earth itself, earlier and earlier each year.
As a planet, as of today, we have already spent all of the Earth’s resources for the year. And, in case you have forgotten, it’s still summer. Called “Earth Overshoot Day,” Aug. 19 marks the day on which we’ve used all the resources the Earth can produce in a year.
If you’re wondering why all commerce and industry have not ground to a halt, it is because we do not simply use what the Earth produces in a year for that year. (If only.) We’ve been mining the Earth’s historically produced resources for generations. Mother Nature had billions of years to churn out biological and mineral goodies before we hairless apes came up with first agriculture and then industry. So, we’ve largely treated the Earth as an infinite resource. You can understand the mistake: when you “arrive” on the scene with resource needs/desires and start chipping away at billions of years of production, it feels like that shit’s just going to keep on keepin’ on.
Somewhere along the line, however, modern humans learned about scarcity. Even industrialized humans figured this out. But the United States still has a problem, collectively, wrapping its Mountain Dew-addled grey matter around the idea of ecological debt.
Again, unsurprising. As a nation, we’re clearly not good on debt or long-term-planning-type things. We tend to treat plastic like a bottomless cup of hot steaming free money, with average U.S. credit card debt at over $15K per household (!). More broadly, people in the U.S. are more likely to throw shade at climate-change claims, with one of the two major political parties routinely pushing for more drilling — because there’s always more to get.
That makes sense. At a smaller scale, the colonization of the American landmass recapitulated the appearance of agricultural and industrial societies on a 4.5-billion-year-furnished planet. For people in the pre-United States and early United States, here was a wonderland of forests, minerals, farmable land and other resources that had barely been exploited.
I recently interviewed a West Virginia agronomist for an article, and he talked about how early colonizers and U.S. pioneers dined on hundreds of years of forest growth. The U.S. land mass, from sea to shining sea, had pumped centuries of photosynthetic industry into miles of free wood. It seemed like a free lunch. An Olive Garden never-ending-pasta-bowl of old-growth forest, if you will.
Nowadays, by contrast, U.S. loggers nibble at a mere 70 years or so of arboreal growth.
But that feeling, that the United States is a land of Olive Gardenian plenitude, remains with us. It is the, ironically, “conservative” position, to believe that you can just keep drilling. Conservative, because it harkens back to America’s early glory, when we were young and the topsoil went down for miles and the forests were infinite and we could throw a perfect spiral right into the end zone from 50 yards out.
But not, you know, “conservative,” in the actual dictionary meaning of the word. Quite the opposite. In the ecological equivalent of racking up a ledger with China to finance wars and tax cuts, we are spending through the Earth’s yearly resources in 8 months — and that window has steadily shrunk: in 2000, Earth Overshoot Day didn’t arrive until October.
Such numbers should maybe shock us into some sort of truly conservative actions. But so should the data on rising average temperatures. In a country in which college students will be paying off credit card debt into their 70s, I imagine we’ll just slide that eco-debt onto the ledger and buy something nice for ourselves.
You know what Dungeons & Dragons fans probably get tired of? Just, the persistent air of “cool” that hangs over the game.
You know, when everyone hears D&D and assumes “bad-ass, good-looking loner.” That must get old.
Every Dungeon Master tires of being asked, “So, how many varsity sports did you letter in?” Or, “Do you ever get tired of all the sexual intercourse you are having?” You walk into a bar, hoping for a quiet night sharing drinks with friends, then some stranger overhears you discussing mage armor, and now you’re awash in free drinks and romantic propositions.
Listen, people. Just because you imagine adventures in fantasy, swords-and-sorcery realms while rolling many-sided dice, that does not automatically mean you are a sex god who can’t make it to tonight’s party because you have too many others to go to and because of all your dates.
But finally, someone has done something to nerd up fantasy role-playing games a bit. It’s about time. Blogging for the World Science Festival, Roxanne Palmer writes that casting water-breathing spells can come with the side effect of a basic statistics lesson. That is, to play the game, it helps to understand probability: What are your chances of rolling the 4 you need so that your sword-hit lands on the owlbear, Palmer asks? You find that out fairly simply: count the outcomes that would produce your owlbear-slaying numeral and divide by the total number of possible dice outcomes.
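For the curious, that recipe — favorable outcomes divided by total outcomes — fits in a few lines of Python (a hypothetical sketch; the function name, the d20 default, and the example targets are mine, not from Palmer’s post):

```python
from fractions import Fraction


def hit_chance(target: int, sides: int = 20) -> Fraction:
    """Probability of rolling `target` or higher on one fair die:
    count the favorable faces, divide by the total faces."""
    favorable = sum(1 for face in range(1, sides + 1) if face >= target)
    return Fraction(favorable, sides)


print(hit_chance(4))      # 17/20 -- needing a 4 or better on a 20-sided die
print(hit_chance(4, 6))   # 1/2  -- the same roll on a six-sided die
```

The same arithmetic covers any die: swap in the number of sides and the target, and the fraction of favorable faces is your chance of landing the hit.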
Basic probability. And so, a good thing. Few sources of authority are so frequently misused and abused as statistics. Plenty of people say, “It’s true because God said it!” but statistics is a god that can even fool the atheistic scientist, if she’s not careful. It is a malleable and confusing god, with impressive intonations like “standard deviation” and “confidence intervals.”
After all, Mark Twain didn’t say “Lies, damned lies and logical fallacies.” (He probably didn’t say the other thing, either, or at least, did not originate it, but that’s beside the point.)
Playing Dungeons & Dragons may not, by itself, keep you from getting hoodwinked by unscrupulous political pollsters or poorly written pop-science. But it’s a start. And maybe next time a tragically handsome D&D-er saunters into the bar, you’ll mention bell curves and significance instead of asking to polish his rad motorcycle.
I can be uncharitable. Suspicious and cynical. So sometimes I think the worst of people’s motivations. That at least partly explains my reaction to social media when celebrities die.
Instinctively, I don’t like what happens. A big-name celebrity passes, and everyone (or so it feels) needs to make a statement on Facebook or Twitter. And here is the (uncharitable) reason why it bothers me: in many cases, these proclamations seem to be more about the person making them than about the person lost. At their worst, they aim to make this event, this death, into an advertisement for the individual speaking.
“I am a person who cares about an actor/writer/singer like this. That is part of my personality, I want you to know. I am connected to this person in this way, and I want everyone to know that.”
The death of a human being, a stranger, comes to serve the same purpose as listing a particular band in your About Me section.
I, of course, can’t prove that this motivates any of the statements people make. But, let’s be honest, a lot of social media is about this sort of personal branding. When that’s applied to a person’s death, I feel gross about it.
No need to tiptoe around it: The obvious prompt for these thoughts was Robin Williams’ recent death by apparent suicide. I don’t mean to paint every social media posting about this extremely troubling death as such a shallow personal brag — or even to say that any of them were entirely all about that. But when the news of a death like this hits Twitter (because that’s where it hits first these days), and people rush to Facebook-post “I remember watching ‘Mrs. Doubtfire'” or to repost a meme someone made with Robin Williams and a quote — at least part of it feels self-serving. And that grosses me out.
I’ve felt that about other celebrity deaths, as well — and have refrained from adding my own post for that reason (also, just because I did not feel impelled to post anything). But Robin Williams is a special case, I think. He was a huge star. A big Hollywood name for decades who starred in films that were huge parts of many people’s childhoods: “Hook,” “Mrs. Doubtfire.” He could exemplify the archetype of the sad clown — veering in his roles and even within his very expressive eyes from manic, comedic joy to what appeared to be, at least, deep sensitivity, if not deep inner sadness.
So, I get why people were affected. I was affected. And I’ve seen several social media posts and articles (and even some memes) with mature, intelligent and touching reflections on the actor’s life and death. Like I said, Robin Williams was a special case. The social media and journalistic response to his death seems to have continued past that initial outpouring of reaction posts into something more meaningful — even what could be a beneficial discussion of depression, and the mystery of what goes on inside another person’s mind, especially the mind of a suicide.
But those initial Facebook posts, by so many people, still bug me. They will probably bug me again the next time a well-known figure dies. But, now that I’ve plumbed the uncharitable side of me that bristles at such posts, let me be more understanding — or, at least, neutrally inquisitive.
Because, I am curious: Why must everyone (or so many, anyway) RUSH to Facebook and share their proclamation on a celebrity death? Why do people feel the need to make a social media statement?
Perhaps it is just because that is how we communicate now. We don’t transmit, one-to-one, we broadcast. In the past, we would still, of course, have discussed Robin Williams’ death. But we would have done it person-to-person. Talking about it on Facebook may happen simply because that is where we talk about everything.
Particularly the big things that everyone hears about.
Social media has so taken over the public sphere that anything that happens in the public sphere — a big event, a political result, a celebrity death — seems, instinctively to us, to happen within social media. It will be reported and discussed on social media; that is where the conversation will take place. So, to not comment on it in the social-media sphere would be to act as if it had not happened. Facebook is the giant room in which we all stand; who are you to ignore the elephant in the middle of it?
I do get that. And I’m no stranger to responding to that new sense of the world. Lord knows, I overuse Facebook and Twitter, too. Lord knows, I’m prone to make some snarky comment about national news because I feel like Twitter is the place to talk about it. Lord knows, I post jokey statuses to Facebook in an effort to garner “likes,” because it feels good to get likes.
But, again. It’s also how we communicate today. This is, frequently, how I interact with old and current friends: I post something I hope they’ll like. I comment on what they’ve said. And this all happens out in the social media courtyard, everyone shouting their conversations for the world to hear.
There was a time I felt weird about that. Writing on people’s Facebook walls, instead of simply emailing them privately, seemed bizarre at first, back in 2006 (!). I didn’t see why people did it. I suspected it was because the wall-message wasn’t really about the person being messaged, but about how the messenger wanted to appear to everyone else. I thought that was a bit icky.
But now I do it. Because it is what people do. And I’ve come to accept that posting on someone’s wall is a way to both communicate with that person and, yes, get some attention from the crowd. Maybe garner some likes. That both are happening at once doesn’t necessarily diminish either. Besides, it’s what everyone does.
I’m sure when people post their reactions to the deaths of Robin Williams and other beloved celebrities, they are similarly multitasking. As I said above, I doubt anyone’s Facebook post about the news was entirely about personal branding. But social media is a weird form of communication. It does double-duty in that way: every post goes out to everyone. You are broadcasting. And so, it is both about whatever you’re saying to one individual and about projecting yourself in front of the millions. To do so, even if only in part, over the death of a fellow human being, and a stranger — I think I’ll continue to feel strange about that.
Something coconut flavored in a big, bowl-shaped glass. The sweet smell of suntan oil and the sweat of attractive young people. A sunset. Coronas with lime. Jimmy Buffett.
A 2-foot-long stick insect crawling up your thigh.
The thing about tropical paradise is that the insects can be insane. Sky-blue water and lush vegetation typically coincide with sci-fi spiders and beetles that dine on small children (I am slightly exaggerating.) “Heaven-on-Earth” is disgusting, people, iswhatimsaying. (I like to imagine the real Heaven, if it exists, as a place where smiling angels and cherubs are constantly swatting at baseball-sized flies and fleeing Mothra.)
I’ve always lived in a cold-weather state (Iowa, New York, Wisconsin). And it’s funny: in the depths of winter, nothing seems as desirable as summer. Going to the lake. Picnics. The Fourth of July. You never imagine the insects. You just don’t think of them. “It will be warm!” your soul breathes. The permafrost will melt, seep back into the cracks in the soil, disappear down the storm drains. You will jog. There will be iced coffee and bike rides. People will get tans on their shins.
And you spend the Fourth of July slapping the back of your neck. “Oh, right. That.” You’d forgotten about the streams of ants into your kitchen, the mosquitoes buzzing in your ear just as you doze off, those clouds of little gnats above the sidewalks.
A corresponding thing happens when we frigid Northerners contemplate the tropics, I think. You envision beaches, but you will not be entirely comfortable there. Instead, you will be sticky, and never quite sure if that is sweat dribbling down your leg or a huge bug, like the one you saw in the corner of your cabin and crushed, screaming, with a loafer. Bugs, bugs everywhere. Your week in paradise will turn into a mild meth withdrawal.
But it is another level. The lifelessness of a Wisconsin winter is to the insect-world of a Wisconsin summer as the insect-world of a Wisconsin summer is to the bugapocalypse of the tropics.
These thoughts on the big-bugged tropics were spurred by the annual Bloomin’ Butterflies exhibit at Madison’s Olbrich Botanical Gardens, where I volunteered this weekend. Dan Capps, a Madison-based amateur entomologist who’s been collecting bugs since 1958, had brought some of his mounted butterflies (and moths). One moth must have had a 6-inch wingspan.
This Mothra came from Mexico. I asked Jeff Capps (the collector’s son, who had dropped by to talk butterfly with visitors) if the big ones usually come from warmer, tropical regions. Because, I mean, that seems right. I know the insects in India were, aside from being ever-present, often big. Huge beetles, that just might show up in your rice (ok, it happened once, but it was scarring.) I’ve heard tales of tropical rainforest spiders, etc.
Some light Internet research confirms that it’s generally the case, for at least a couple reasons. Interestingly, mammals tend to grow bigger in colder climates, because the extra body volume helps them retain heat (and, vice versa, lower surface-area-to-volume helps mammals dissipate heat in surfing country). This is called Bergmann’s rule, after the German biologist who thought it up.
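To make the surface-area-to-volume point concrete, here is a toy calculation of my own (not anything from Bergmann): model an animal as a simple sphere, and the ratio of heat-losing surface to heat-generating volume shrinks as the animal gets bigger.

```python
import math

def surface_to_volume(radius):
    """Surface-area-to-volume ratio of a sphere; algebraically this is 3 / radius."""
    surface = 4 * math.pi * radius ** 2
    volume = (4 / 3) * math.pi * radius ** 3
    return surface / volume

# A mouse-sized sphere (radius 0.1 m) vs. a bear-sized one (radius 1.0 m):
print(surface_to_volume(0.1))  # 30.0 -- lots of surface per unit of volume
print(surface_to_volume(1.0))  #  3.0 -- a tenth the surface per unit of volume
```

The big sphere has a tenth the relative surface area to bleed heat through, which is the whole logic behind bigger mammals in colder climates.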
Insects, broadly speaking, violate this rule, however. Since the bugs don’t rely on internal heat, the other advantages of warm-weather climates can help them get big. Cold snaps freeze or kill off the bugs, so insects that celebrate a white Christmas have limited growing periods. Warm-weather bugs can keep pushing right on through the holidays, and they have more abundant food, to keep pumping up those thoraxes.
Another “biological rule” also plays a role: the “island rule,” named after Benjamin Island, founder of Island Records. Just kidding, it’s named for the actual masses of land. The isolated island gives small animals more food, with less competition and fewer predators. Since tropical islands tend to, you know, be islands, this rule can give insects a boost. As this post notes, New Zealand, for example, boasts huge-ass beetles, the weta and other big, gross things.
Question: If you’re going to freeze yourself for an eventual shot at eternal life, are you gonna go with just the head, or the deluxe package that puts your entire mortal coil on ice?
Real people are facing this choice today, as David Casarett reveals in this interview on the developing science of death and revival with the World Science Festival. Cryonics, the, uh, pretty speculative science and technology of cooling people off to preserve them for future medical miracles, has conferences, apparently. And, at one of these conferences, Casarett learned that freezings come in two varieties: full-body or head-only.
The difference between those choices? $130,000. That is, it’ll cost you $200,000 to freeze all your parts and a (relatively) scant $70,000 to simply ice your dome.
I wonder: who’s choosing the bargain deal here? It’s a long shot that it will work, either way. You’re betting your $70K to $200K on a couple of out-there hypotheticals: A) that future science will be able to revive frozen people and B) in many cases, that future science will be able to heal a deadly disease or disorder that it now can’t. (Many people paying for cryonics do so because current medicine cannot save their lives.)
But cryonics also has a place in the “singularity” crowd, folks who believe that immortality will become a technological possibility at some point in the future. If you die now, of whatever causes, you’ll miss your shot. So, freeze immediately after death. Save for later. Thaw at room temperature.
Going for the brain-freeze-only strikes me as pretty odd. As of last year, 270 people have had themselves chilled. Two-thirds of the cold folks at cryonics company Alcor are head (or brain) only, as are half of the American Cryonics Society’s patients (though the group no longer offers the neck-up option).
So there are people — people who think there’s a chance future science can revive them and make them immortal — who said, “Yeah, I’m all for that. But just the head for me.” A good number of the people choosing post-life (and, hopefully, pre-immortality) deep-freeze elect to skim a few bucks off the bill by trashing their appendages, torso and genitals. (People! Do you remember what genitals are used for! It’s awesome!)
I don’t know, I think when you’re aiming at immortality, go for the Venti. You know, get the full-featured package. It’s like when you buy a home or get your first adult apartment — ditch the Ikea stuff that only saves you money in the short-term, and spring for a real wooden cabinet.
I could keep going with the purchasing metaphors. But you get the point. I’m clearly fixating on the cost differential, but I think it’s fascinating. There is a not insignificant difference in money. You could very understandably decide that $70K is your upper-limit on many purchases. And if someone were to offer you an upgrade for nearly three times that amount, it would make a ton of sense to turn it down.
But we’re not talking a normal purchase here. It’s, quite literally and emphatically, not something with a limited lifespan. If you believe that the singularity, technology-based immortality, and all that are a possibility, then we’re talking about purchasing eternity. People had to make economic decisions about eternity. And some chose the discount. That is amazing.
That Wikipedia entry on neuropreservation lists some reasons people have chosen to dump their non-head portions. For one, some say that focusing on preserving the brain is better, because that’s where memory, personality, etc. are stored. Fair enough — you gotta prioritize. (Though, I’m not clear why freezing the whole thing makes the brain-icing of poorer quality.)
But cost is also a big consideration. I understand that $130,000 is nothing to scoff at. I doubt many, if any, of these purchases come from middle-class folks, much less from the poor. I assume they are mostly well-off. I just like to imagine them seeking out this crazy long-shot for immortality, deciding to do it, and then asking for the daily special.
“I will live forever! I will be immortal!” Adjusts glasses, checks out the bill. “Ummm…”
I wonder if there are any people who saved money by freezing only their brains, and then put the difference into a trust fund for their future selves. What if they wake up, it’s the year 3014, and science has discovered how to revive these ancient person-cicles. Then, the heads get put in jars, “Futurama”-style. That trust fund could be worth a lot by then. So heads’ll be rollin’. But, when they look over at their old golfing partner, who’s gallivanting around on real-life legs and pinching future-nurses on the butt with his revived fingers — Mr. Head is going to need to buy a reeeaallly nice jar to make himself feel better.
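For the curious, the trust-fund daydream actually pencils out. This little sketch is mine, and the 2% annual real return is purely an assumption (nobody knows what a thousand years of markets looks like), but compounding over ten centuries does absurd things to $130,000:

```python
def future_value(principal, annual_rate, years):
    """Compound interest: principal grows by (1 + rate) each year."""
    return principal * (1 + annual_rate) ** years

# The $130,000 saved by going head-only, left to compound from
# 2014 until a hypothetical 3014 revival. The 2% real annual
# return is my assumption, not anything from the article.
nest_egg = future_value(130_000, 0.02, 3014 - 2014)
print(f"${nest_egg:,.0f}")  # tens of trillions of dollars
```

Even at a modest rate, a millennium of compounding turns the discount into a fortune that makes today’s full-body price tag look like a rounding error.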
Dick Nixon was before my time, given I came into this world about five months before Ronald Reagan convinced 44 states he wasn’t on the precipice of senility. I was born in 1980, is what I’m trying to say.
So, I don’t have a lot of familiarity with how Tricky Dick spoke, outside of mostly cartoon (or, at least, cartoonish) parodies. I didn’t see him on TV. I didn’t listen to any radio addresses by the Dickster (Secret Service nickname, I’m pretty sure). Still, this Twitter account that brought Richard Nixon into the modern world always read as so…authentic to me.
I was not sure why. The account captures the paranoia, ruthlessness and propensity for cursing that I vaguely knew characterized the man. Here, for example, is something the fake, living Nixon wrote today (Aug. 7):
“He can’t and should not do this, attack our integrity, and by God I’m going to fight the little bastard.”
But more so than that, it’s the turns of phrase. They are unexpected, dripping with individual voice, and poetic in a fascinatingly brutal sort of way. Check out this other one, also from today:
“The press is the enemy. The press is the enemy. Write that on a blackboard a hundred times and never forget it.”
It’s just so particular. “Write that on a blackboard a hundred times and never forget it.” Why a blackboard? Why 100 times? Why the need to punctuate the 100 writings with an admonition never to forget? They are parallel sayings; either would have done. But he hits you with both fists. And with that, “hits you with both fists,” I’m aiming at what Fake Dick does best, I think: Choose something colorful, illustrative — an image or a punchy word instead of just limply saying the thing you’re talking about.
The account reads so clearly like the sayings of a particular person — even if I am not all that familiar with the real-life person being parodied, the particularity rang out clearly. And it’s really a joy to read. It must be a joy to write, too, to speak in the voice of someone who used words so brutally.
I was happy to read, then, this profile today of the account and its author — happy to see, first, that others found the account as impressive as I did. Happy, second, to see that it actually does sound like Nixon. Unsurprisingly, the man behind the account is a writer, a playwright named Justin Sherin. Somewhat surprisingly, at 33, he’s younger than me. I guess he paid closer attention to historical speeches than I did. Or is just a better writer. Both are likely true. That fellow can write, the little bastard.