
Where’d You Get Those Genes?

via deviantart.net

Where’d you get your genes?

Oh, on clearance at Penney’s.

Right!? Probably a pretty good/bad first response if a science communicator ever asked you this. Especially a TED presenter. (For some reason, those things bug me — so artificial and smug in their self-importance. “Where do your genes come from?” asks the fame-hungry scientist, stalking the stage with a headset microphone. “PENNEY’S ON CLEARANCE!!” I yell from the back row, before being escorted off the premises.)

Anyway, your GENES, of course, do not come from Penney’s. Human genetic material is one of the few things department stores do not (yet) sell. But here is an actual TED-Ed presentation on a question you may not have considered: how’d you get those genes that turn matter into you? A question so basic/fundamental that it is an accomplishment just to ask it.

The answer is: from just three basic sources.

Most casual laypersons know that our DNA consists of genes — packets of genetic material that convey traits. But why does DNA contain these little phenotypic missives? How did this unzippable, replicable molecule come to be segmented into the chemical equivalent of chapters (or sentences, or words)?

Here’s how the TED-Ed talk tells it: First, well, “it depends on the gene,” they say. “It depends” is hardly ever a satisfying answer, so let’s try to boil those “depends” down to a few (hopefully) interesting sources. Your genes come from:

1) Legacy Genes: The earliest forms of life first developed genes in order to replicate/survive, and passed them on down to you, me and Frank over the millennia. For example, genes for DNA copying.

2) Copy Errors: Speaking of DNA copying, new genes have arisen when DNA accidentally created multiple copies of a gene. The new copies could then mutate into new genes. Presto! Your genome now has both Gene Classic and New Gene. Plus, maybe Crystal Gene and Lemon-Lime Gene down the road.

3) Random Employment: Long stretches of noncoding DNA, ‘genetic gibberish,’ sit in the genome just sort of hanging out. Sometimes, mutations make one of those stretches actually do something — i.e., code for a protein. If further mutations make that protein useful — new gene! (A toy sketch of sources 2 and 3 follows this list.)
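For the bioinformatically curious, here is that toy sketch of sources 2 and 3: pure string-shuffling with made-up sequences, nothing like real genomics, but it gets the flavor across.

```python
import random

# Toy illustration only: a cartoon of "copy errors" and "random employment,"
# not a model of real molecular evolution. The sequences are made up.
BASES = "ACGT"

def mutate(seq, n_mutations=3):
    """Return a copy of seq with a few random point mutations."""
    seq = list(seq)
    for _ in range(n_mutations):
        i = random.randrange(len(seq))
        seq[i] = random.choice(BASES.replace(seq[i], ""))  # swap in a different base
    return "".join(seq)

random.seed(42)  # reproducible toy run

# Source 2: a copy error duplicates Gene Classic, and the spare copy drifts into New Gene.
gene_classic = "ATGGCCATTGTAATGGGCCGC"
new_gene = mutate(gene_classic)

# Source 3: a stretch of genetic gibberish mutates until, maybe, it does something.
gibberish = "".join(random.choice(BASES) for _ in range(21))
maybe_a_gene = mutate(gibberish)

print("Gene Classic:", gene_classic)
print("New Gene:    ", new_gene)
print("Gibberish:   ", gibberish)
print("Maybe-a-gene:", maybe_a_gene)
```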

And from those three sources comes the whole bewildering array of functions the human body (and other bodies) perform. One of the more interesting examples from the video: One snake’s venom originated as a chemical made in the pancreas. That gene got copied, mutated, and took a trip, ending up expressing in the fangs. Pancreatic juice did bad things to snake victims, so it turned out to be a useful change. So the snake got a venom gene.

It amounts to a lot of reshuffling. Billions and billions of years of reshuffling of text, and it seeds the planet with an incredibly rich vocabulary of genes. Including mouse-paralyzing pancreatic fang-juice.

If you’re paying attention, you’ll notice that the first answer sort of begs the original question. How were those first genes created? How did the first replicable packets of genetic material — genes — develop? It’s a lot easier to answer the question of how, once there are a few genes, new versions are formed. Once you have the basic machinery going, new widgets can come along. But that first segmenting of DNA into genes would have to arise as the genetic code mutated and evolved, and started doing discrete things on discrete stretches of itself.

There’s plenty out there on the origin of life from nonliving matter. A crucial first step is the development of replicating molecules, RNA and/or DNA. These replicating molecules would be subject to evolution, eventually. Then, you get molecules of lesser or greater fitness. And, I suppose, this could involve the kind of information-segregation that you see with genes. But it seems to me like a still somewhat mysterious step.

Holes in the Table

via yatzer.com

Physics can make the world seem weird, and that’s pretty fun. Some notable popular science writers (see Hawking, Stephen and Kaku, Michio) have concocted some pretty thrilling science-lite confections out of relativity- and quantum-related weirdness.

It provides a good avenue for developing a somewhat superficial appreciation for science, does this physics weirdness. And I should know. I’ve flitted around the edges of actual science for much of my life — intermittently overwhelmed, bored and even depressed by it, but never able to completely let it go. So, I’ve gone after a writing and editing career, but I’ve mostly worked in several forms of science communication. I dropped biology for English, but kept gravitating (so to speak) toward literary intersections with science. All the way up until my MA thesis, which was a look at technology and religion in Rushdie. It was a probably pretty terrible look at technology and religion in Rushdie, but they let me have the degree.

But I still remember those early encounters with the weirdness of physics, and how they made the science seem like something worth devoting your life to. Reality is like nothing you suspected, these theories born of squiggly maths said. The everyday world is a fascinating realm of ghosts and apparitions, and what’s even better, it is on good authority that the world is this way.

Suddenly, the authority figures, professors and scientists, are slipping you drugs.

I remember clearly one such experience of the weirding of the world, and it didn’t even come from Hawking or from any of his brilliant ilk. Nor did it come from the far-out fields of advanced physics. Just basic particle physics in a high school textbook.

In physics, we covered the structure of the atom, of course. You remember: that solar-system image of an electron doing its 1950s swing around the central cherry of the nucleus. Here we came upon the factoid that an atom is mostly empty space. And I had a holy-shit moment.

It’s possible I imported that ‘neato science factoid’ from a pop-sci book. It sounds more like something they would say. But, nevertheless, it was in Mr. Dick Winder’s physics class. I looked across at the black surface of the science class tabletop, and I imagined an illusion — a ghost, tricking us with its reflection of light beams, but a nearly empty network of mist and cobwebs behind that.

Sure, it would cut your forehead, and concuss your brain should you slip on a sheet of notebook paper and fall onto a corner of that table — but that was tantamount to a mirage. Just billions of electrons, spewing their force vectors forth into the aether. There was no THERE there. Or nearly so. These tables, these teenage limbs — mine scrawny, other kids’ muscular and capable of hurling footballs — just blobs of misty space.
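If you want to redo that holy-shit arithmetic yourself, the back-of-the-envelope version looks something like this. The radii are ballpark, order-of-magnitude numbers, not values for any particular element, and it dodges the question of what volume an electron really “takes up.”

```python
# Rough, order-of-magnitude numbers; not element-specific.
atom_radius_m = 1e-10      # a typical atomic radius, roughly 1 angstrom
nucleus_radius_m = 1e-15   # a typical nuclear radius, roughly 1 femtometer

# Treating both as spheres, volume scales with radius cubed,
# so the nucleus's share of the atom's volume is:
filled_fraction = (nucleus_radius_m / atom_radius_m) ** 3

print(f"Fraction of the atom's volume taken up by the nucleus: {filled_fraction:.0e}")
# -> about 1e-15, i.e. the tabletop is ~99.9999999999999 percent "empty"
```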

We live in a Swiss Cheese universe, and you people are worried about the labels on your jeans?

That’s what the weird views of physics could mean to me as a, you might have guessed, nerdy and isolated teenager.

But I got immune to that mystery, eventually. Make it into college, and science dissolves into a slew of equations and figures. It’s a lot of memorization. I’m giving myself excuses. The truth is, actual science is hard. And I didn’t have the brain-stomach for it.

I’m studying science, real science, again — in an online bioinformatics program. It’s discipline, and sacrifice, and boredom, and tired brainwaves — and wagon-loads of self-doubt. Balanced, hopefully, by the conviction that this stuff matters.

So it can be good to be reminded of the goofy, enthralling, mystical side of pop-sci physics. Here’s an example of that, from the Smithsonian: the universe as a hologram, the universe as a computer simulation. I read books about these flights of fancy when I was a younger nerd. It’s still good stuff.

You Can Look in a Volcano

via The Guardian

We live at a time in which you can stick your head inside an active volcano while you’re in your underwear eating cold cereal.

In fact, I advise you to do just that.

Over the years, sociologists and historians have devised various means of measuring the progress of human civilization. You might look at how the average diet has improved. Or take a look at general health and life expectancy. Measure the decreasing size of circuit boards, or take note of the amount of scientific information produced in a given year. Compare the godlike abs of today’s superhero actors to their flabby forerunners.

A less frequently used metric is the ease with which you may stick your head in an active volcano.

As it happens, you can do that right now. Live webcam footage of the mouth of Icelandic volcano Bardarbunga brings hell-on-Earth to your iPad.

What you’ll see, as you wipe Cheerio-flavored milk from your lip, is the glowing inferno of Earth’s molten belly spewing forth into the very air we mortals breathe. AKA, you will see the march of human progress. To our forefathers, such a view likely meant either imminent death or hallucination.

Now, it is mild entertainment to fill that small gap between re-checking your Gmail and drafting fantasy football players.

But to linger a bit more, here is what is actually happening in that haze of thick smoke and redly glowing globs of light. Bardarbunga is Iceland’s second-tallest mountain, a volcano that reaches more than 6,500 feet above sea level. After several weeks of seismic activity (read: earthquakes), the volcano this week turned to eruption, spewing red lava some 160 feet (a good half football-field) into the sky.

If you’re looking at liquid lava, that means it’s 1,300 to 2,200 degrees Fahrenheit as it first erupts, otherwise known as extremely damn hot. That heat comes from geothermal energy (80% from radioactive decay, 20% left over from Earth’s original formation), and it makes lava hot enough to ignite most human possessions it touches, if it does not bury them first.

Bardarbunga is a “stratovolcano,” meaning its profile comes from the progressive buildup of layers of cooled lava. That makes it the same type as the volcano that buried Pompeii, a city whose people are famously known for not being able to peer at the mouth of an active volcano in their pajamas. Think on those mummified losers, and marvel at how far we’ve come.

Breaking the Conservative-Christian Stereotype

How the denominations vote (via Tobin Grant)

When I think about “religion and politics” in the United States, my instinct is to oversimplify: I imagine a be-suited Evangelical Republican, praising his God by voting for small government.

You may do the same. This is an unfortunate instinct, as things are rarely that simple. And, ironically, it plays into Republican strategy. Conservatives have waged a very successful campaign to tie being Christian with being Republican.

Millions of individual cases violate this “rule” — enough, in fact, that it should probably not really count as a rule. There are plenty of Christian Democrats. There are plenty of Christian progressives. And, not to be lost in all of this, there are plenty of “religious” people in the United States who are not Christian. And they cover a range of political identities.

Still, conservatives have succeeded in linking religion in the U.S. culturally with Christianity, and Christianity with Republicanism. That the association may be even stronger among those who disagree with both philosophies is a testament to the effort’s success. More importantly, I’m sure it weighs heavily on those who subscribe to the teachings of Jesus and His Merry Men. A good Christian is a good Republican. This is in the culture. Even their opponents (pricks like me) may sling this as a stereotype and/or insult.

So it must be true.

Perhaps this has lessened a bit since what I remember as its heyday, when George W. Bush was the Evangelical in the Oval Office. I know, anecdotally, of people with definite Christian beliefs who were put off by the war on terror and even the tax cuts for the rich. “The Christian Left” Facebook group boasts over 180,000 “likers,” and posts things like this:

“If the USA can’t afford to provide basic medical care, feed the poor, protect the environment, maintain our infrastructure, or teach our children anymore, then what exactly is our bloated military budget defending?”

Good question, right? But, anyway, that group states as its goal, “To follow Jesus by taking actions on behalf of the oppressed, the sick, the hungry, the poor, the incarcerated…” and other greatest hits of alleged Christian concern.

The group also argues vociferously that it exists — i.e., that progressive Christians are real, live people. “We can’t let the right-wing dominate Christianity like they do. They’ve twisted it into something that has nothing to do with Christ,” they say in this post. This is how successful the linkage of conservatism and Christianity has been: to be a progressive follower of Christ, you need to work hard to convince people that you are real.

But here’s a more nuanced look at religion and the U.S. ballot box: Tobin Grant of the Religion News Service mapped voting and religious persuasion using Pew data. He looked at how religious groups voted along two axes: 1) Government size (big, with many services vs. small, with few services) and 2) Morality (government that enforces morality vs. one that does not).

These are interesting choices in their own right, but they also seem to break out along the traditional Democrat/Left vs. Republican/Right divide in U.S. politics. Generally speaking, the Dems like big government and the attendant social programs, while the GOP favors a smaller government that enforces, for example, “traditional” definitions of marriage.

So, unless every Christian denomination appeared in the upper right corner (smaller government, greater protection of morality), the popular association of Republicanism and Christianity would falter. And of course it did. Evangelicals, unsurprisingly, are up there. But check out Catholics: pretty much a circle around the center of the graph. One of the biggest denominations in the country, therefore, comes in all flavors, from traditionally Democratic to traditionally Republican — plus plenty of adherents who combine beliefs across party lines.

That may be, to me, the most interesting part of this graph: It not only breaks apart the Republican-Christian identification, it also cuts across the two-party continuum. Check out the Anglicans and Presbyterians, who want a smaller government that also stays out of morality. See the Baptists, who want more morality, but also more services. And, of course, the very diverse Catholics. Not to mention the non-Christian groups: Buddhists and Atheists like governments that offer services and stay out of morality (ok, so they’re likely Democrats/Progressives). Hindus, however, tend toward more morality paired with greater services. Neither pure Democrat nor pure Republican ideology would serve their needs.

It’s a big mash, in other words. Religious identity does seem to be associated with politics, but in more-complicated ways than the popular prejudice would suggest.

ThinkProgress links this at least partially to economics. Churches with poorer flocks generally like more government services. Catholics include a relatively even distribution of economic groups, so those folks cluster around the center.

But there are almost as many economic outliers. Hindus tend to make good bank, but they vote for services. Many Evangelicals make modest incomes, and often rely on social services themselves, but oppose big government. So if religious groups do cluster into clear political quadrants, money doesn’t explain why — not entirely. Neither does political party. Faith is part of a complicated network of identities — economics, race, ethnicity, region, immigration history, and I’m sure many more — that could affect political ideology.

But there doesn’t seem to be much inherent in any particular religion’s teachings that leads people to choose a particular political identity. As ThinkProgress writes:

“Regarding the two issues discussed above, the data hints that a voter’s religious affiliation is a strong indicator of their political beliefs, but it’s not totally clear whether religious teachings are the main force shaping those political beliefs. A longer analysis of history, theology, and actual voting patterns of parishioners would be required to get a more accurate picture of what’s going on here.”

Economics is part of the greater identity matrix that shapes political beliefs. So is religion. And so are the ways that political parties themselves attempt to define your religion for you.

Banned in War, Why Is Tear Gas OK Against Civilians?

 

via boiseweekly.com

Assuming the United States adheres to international conventions it has signed (not always the case), we can’t use tear gas in war. The Chemical Weapons Convention treaty, which went into force in 1997, banned the substance’s use in warfare. But we’re cool to use it on our own citizens, as this quite effective meme from OurTime.org has pointed out.

That little shareable quote is effective because it immediately raises questions. First of all, it raises the question of “What the fuck?” Follow-up questions include: “Wait, is that true?” And “How does that work?” There are two main points to look at here: What does the treaty say, and how “bad” is tear gas? In other words — is it banned in warfare? SHOULD it be banned in warfare and/or anywhere else?

Trick or Treaty?

First, the fun stuff: treaty stipulations!! *hysterical cheering* Is the meme right that the United States has pledged not to use tear gas in warfare? PolitiFact checked this, and ruled that it “is close to being accurate.” But close only counts in horseshoes and chemical warfare, so how close are we talking? Basically, yes, the Chemical Weapons Convention, or CWC, broadly bans “the development, production, acquisition, stockpiling, retention, transfer or use of chemical weapons by States Parties.” And the convention defines tear gas as a chemical weapon. Specifically, tear gas is included under the umbrella of “riot control agents” that cause sensory irritation and other unpleasant things.

The meme is a little iffy on the year (the treaty went into force in 1997, and was only drafted in 1993), but is otherwise correct. PolitiFact dings them for eliding some of the context, however. The treaty makes a special provision for using tear gas as domestic riot control. PolitiFact says:

“[The meme] tries to leverage the Chemical Weapons Convention’s decision to ban tear gas as evidence of why the technique should be illegal for policing, yet that very same convention explicitly allows its use for domestic law enforcement purposes.”

Ok, but that depends on what you mean by “should.” Should as in, “mandated by international law”? Then, no. The Ferguson police are not explicitly violating a binding treaty. Should as in, “the right thing to do”? The meme makes a stronger case on that front. I think the point with this meme was that the banning of tear gas in warfare implies that this is a terrible substance to use on people. So it is also terrible for police to use it. Especially terrible, actually, since these are their fellow citizens.

Here’s how OurTime co-founder Jarrett Moreno characterized the motivation behind the meme when challenged by PolitiFact: “The focus of our post was raising an ethical and moral question: If we can’t use tear gas on our enemies, why is it acceptable to use on our own citizens?”

Yeah, I think that’s a point the meme actually makes pretty clearly. That the convention makes an exception for use by police forces is interesting — and a bit troubling. As PolitiFact found during its fact-check, it’s a bit odd for a treaty to make such a domestic-use exception. But the meme’s core point still stands: An international treaty has declared this stuff off-limits for war. You can do a lot of bad stuff in war. You can, to name a few, fire machine guns at people and drop bombs. So, tear gas must be pretty bad. And police are using it against U.S. citizens angry because one of their own was executed.

How awful is awful?

So, I think the meme is effective and honest in what it is trying to do. It effectively suggests that tear gas is a terrible thing. But here’s where my second question comes in: Is tear gas as awful as its inclusion in the CWC ban implies? Is it, perhaps, grouped among far-worse agents as a sort of overreach or excessive caution? Does it stand beside sarin gas in the “chemical weapons” lineup the same way that a wiffle bat and a Tommy gun are both weapons?

Well, sort of. Sarin gas will kill you. Tear gas, in most cases, will not. But its use is not without casualties. Some people “controlled” with high levels of tear gas have suffered heart failure and death. At least one person died after an exploding canister hit him in the head.

But tear gas intends to make you feel unpleasant. Sarin intends to make you dead. There is a huge difference there, and that at least partially explains the treaty’s two-faced approach to tear gas. Riot control gas should be kept away from the battlefield, in part, because it could be mistaken for something more deadly. In other words, tear gas is dangerous because it looks like actually dangerous stuff. PolitiFact quotes political scientist Richard Price: “Part of the thinking is that soldiers in the field don’t have the ability to readily distinguish in the heat of battle if a gas being used is tear gas or something more lethal.”

Signers of the CWC treaty, however, argued that tear gas is crucial for riot control. Once a riot starts, few things are as effective in stopping it without casualties, these parties said. And, thus, the bifurcated mandate was negotiated.

Let’s Get Gassy

All this sounds like minimizing. So, let’s finally answer the question: How bad is tear gas? No, it won’t (in most cases) kill you, and is (in almost every case) not intended to do so. But it is not benign. “Unpleasant” is a sanitizing word, so let’s actually imagine our sensitive eyeballs and nerves invited to a tear-gas party:

If you get tear-gassed, it means you got hit with one of three chemicals. One of those is pepper spray, of the kind used to casually Weed-Be-Gone some of the Occupy protestors. The others are Mace (chloroacetophenone, or CN) and CS (chlorobenzylidenemalononitrile). The Ferguson cops are probably using CS.

Both CS and CN work by irritating mucous membranes. These are the awesome slimy things that let your eyelids slide over your sight-orbs and keep sex from turning into dry, joyless friction (ideally). In practice, that means the gas gives you that burny feeling in your eyes, mouth, nose and lungs. Eyes will burn and tear up. The gas makes it hard to breathe and can give you chest pains. If you get really super “controlled,” stuff may come out of both ends, as they say. And, I’m not sure about this, but given that your membranes will be burning, I imagine that this will be some painful barf/squirts.

Here, a (self-alleged) soldier on Yahoo! Answers says that, “It sucks. Your eyes start running and it feels like you’re breathing in fire.” Some have said that the sensation is like drowning. Your body produces mucus, filling up your airways with fluid. That’s why it feels like asphyxiation.

Yes: unpleasant. In another context, we’ve debated whether “drowning sensations” qualify as torture. Remember waterboarding? The international community was pretty clear on that: Yes, it’s torture. I’m not claiming that getting tear gassed is the same as getting George W. Bushed. But, as with the battlefield ban, the association of tear gas with a bigger, badder cousin does point out its own awfulness. It’s not sarin. It’s not torture. But it’s on the continuum.

If nothing else, let’s use the unsanitized words: It’s a “crowd-management agent that causes unpleasant sensations,” yes. But it’s also a “chemical weapon that makes you feel like you’re drowning.” Just because it’s legal to use it against civilians doesn’t mean it’s benign — or that it deserves only benign descriptions when spoken of in that context.

Dr. Peter MacMuffin’s Fantasy Drive

via wikia.net

In the year Two-Thousand-and-Whatever-Year-You-Are-Reading-This-In, Dr. Peter MacMuffin — mad scientist extraordinaire, superfan, Comic-Con never-misser, and fully funded emeritus professor at the University of Wisconsin-Madison — realized the dream of perpetual, real-life fan fiction: He created the Fantasy Drive. And he just about ruined his pants when he realized what he’d done. With this device, Dr. MacMuffin could travel anywhere the minds of geeks and nerds had dreamed. This is his first adventure.

Dr. Peter MacMuffin created a Fantasy Drive, and stepped through it into the past.

Not the real past, mind you, but a fantasy version. Brooklyn: 1940. A street where a young Steve Rogers was getting bumrushed in an alley. Peter MacMuffin knew exactly where to find Steve Rogers, you see, because that is how the Fantasy Drive worked. It took you where you wanted to go.

And this is where Dr. Peter MacMuffin wanted to go first. To save, meet, and become super-best-buds with the future Captain America. They would drink old timey beers together. They would catch a Dodgers game. Dr. Peter MacMuffin would probably get Steve Rogers all kinds of laid.

Here was the alley where Steve Rogers — at this point, still a scrawny, 90-pound, hilariously rat-faced little fink of a guy — would vainly try to defend himself against some ’40s neighborhood toughs. This was it! This was the alley. Peter MacMuffin recognized it from the movie stills and the comics he had pored over. The Fantasy Drive had worked. It had worked! He was in Earth-199999, in December of 1940, on the corner of Hicks St. and Leaman Place, Brooklyn, New York, United States. Across the seas, the war against the Nazis and Emperor Hirohito raged. And, good Lord, Dr. MacMuffin thought — the Red Skull. The Red Skull himself was creating superweapons, in the flesh. (And bone. Red bone. Boner. Dr. MacMuffin had a boner.)

Peter MacMuffin flipped the collar of his long, thick trench coat (he had dressed himself in the style of the time before stepping through the Fantasy Drive, naturally), and strode down the alley. He could hear their voices already, harsh and Brooklyn-y and careening around within the stone and brick walls of the narrow passageway.

“Myeah, stay down, Rogers,” one of the toughs said. “Myeah.”

But puny little Steve Rogers would not stay down. Bullshit he would fucking stay down! This was the future Captain America! Peter MacMuffin thought. Like fuck he’d stay down! The little guy staggered up, grabbing a trash can lid. (Like in the movie! Peter MacMuffin thought.) “I can do this all day,” Rogers said. (Like in the movie! Peter MacMuffin thought.)

Dr. MacMuffin had come just in time. A second later, and Bucky (movie-Bucky, mind you) would have swaggered in and saved the day. All very nice and good, great, yeah, we all loved it, but this was Dr. MacMuffin’s show. Fuck some Bucky shit.

Peter MacMuffin threw back the tails of his trench coat and stood grandly, hands on hips. “Ahoy there, young neighborhood toughs,” he intoned. “Unhand Steve Rogers.”

All eyes turned to look. Scrawny Steve Rogers let the trash can shield (FORESHADOWING!) descend on his scrawny arm. The two neighborhood toughs turned slowly around, ready to spit or punch or yell, or whatever the situation required. “Who the gosh-darn are you?” one of them said.

“Myeah!?” the other one said.

“Dr. Peter MacMuffin,” Dr. Peter MacMuffin said. “Remember that name.”

As the two Brooklyn-y neighborhood toughs were kicking the shit out of him, Dr. Peter MacMuffin remembered that he did not have any fighting skills or any significant physical strength or any real plan beyond showing up and being at least bigger than puny, pre-Captain America Steve Rogers.

Here was the problem with his adventure.

In the end, Bucky (movie-Bucky) came and saved the day again (or still, or…whatever). Only this time, he saved the pretty goddamn awfully dinged-up Dr. Peter MacMuffin, while puny Steve Rogers watched from the side, trash-can shield still in hand, wondering just who the hell this dude was.

On his way to the hospital in the neato 1940s ambulance, Dr. Peter MacMuffin realized he would need a different way to get close to Steve Rogers. Maybe he should learn some fighting skills? Step back through the Fantasy Drive and bulk up in the real world for a while before teleporting back for another go?

But that seemed like a lot of work. And, like, physical work. He might have to do warm-up stretches. No, Dr. Peter MacMuffin thought, as he faded in and out of unconsciousness due to the blunt-force trauma he had just endured: he would use his kick-ass science mind to do this right. He’d made the Fantasy Drive. He could make something for Captain Steve Rogers, too.

Of course. That was it. He’d goose the Super Soldier Serum.

“I’ll goose the Super Soldier Serum!” he shouted. But the EMTs did not understand what he was talking about. Because they did not know what a Super Soldier Serum was, and because Dr. Peter MacMuffin’s lips were nearly swollen shut and he’d lost five teeth. So, he basically talked like a man eating 15 marshmallows.

But in his mind, it was triumphant.

But the EMTs put him down as potentially mentally disabled.

Tune in next time, when Dr. Peter MacMuffin returns for ‘Fantasy Drive: Peter MacMuffin Gooses Captain America.’ Same Bat-place. And you can read it at whatever time is most convenient for you.

Also, We’re in Earth-Debt

via postcarbon.org

We’re in debt.

Yes, I know — the United States owes its balls to China. But I don’t mean that debt.

Yes, I know — your college education turned you into a lifelong indentured servant to the bank. But I don’t mean that debt, either.

Yes, I know — predatory loans have hung mortgage-albatrosses around the necks of the un-bailed-out. But I don’t mean that debt either.

We’re in ecological debt, too. This is much more unfortunate, because here we’re not just dealing with numbers sliding around on some investment banker’s screen. This is not the fiction of currency, in other words — it’s real-world natural resources. We’re using the Earth itself, earlier and earlier each year.

As a planet, as of today, we have already spent all of the Earth’s resources for the year. And, in case you have forgotten, it’s still summer. Called “Earth Overshoot Day,” Aug. 19 marks the day on which we’ve used all the resources the Earth can produce in a year.

If you’re wondering why all commerce and industry have not ground to a halt, it is because we do not simply use what the Earth produces in a year for that year. (If only.) We’ve been mining the Earth’s historically produced resources for generations. Mother Nature had billions of years to churn out biological and mineral goodies before we hairless apes came up with first agriculture and then industry. So, we’ve largely treated the Earth as an infinite resource. You can understand the mistake: When you “arrive” on the scene with resource needs/desires and start chipping away at billions of years of production, it feels like that shit’s just going to keep on keepin’ on.

Somewhere along the line, however, modern humans learned about scarcity. Even industrialized humans figured this out. But the United States still has a problem, collectively, wrapping its Mountain Dew-addled grey matter around the idea of ecological debt.

Again, unsurprising. As a nation, we’re clearly not good on debt or long-term-planning-type things. We tend to treat plastic like a bottomless cup of hot steaming free money, with average U.S. credit card debt at over $15K per household (!). More broadly, people in the U.S. are more likely to cast doubt on climate-change claims, with one of the two major political parties routinely pushing for more drilling — because there’s always more to get.

That makes sense. At a smaller scale, the colonization of the American landmass recapitulated the appearance of agricultural and industrial societies on a 4.5-billion-year-furnished planet. For people in the pre-United States and early United States, here was a wonderland of forests, minerals, farmable land and other resources that had barely been exploited.

I recently interviewed a West Virginia agronomist for an article, and he talked about how early colonizers and U.S. pioneers dined on hundreds of years of forest growth. The U.S. land mass, from sea to shining sea, had pumped centuries of photosynthetic industry into miles of free wood. It seemed like a free lunch. An Olive Garden never-ending-pasta-bowl of old-growth forest, if you will.

Nowadays, by contrast, U.S. loggers nibble at a mere 70 years or so of arboreal growth.

But that feeling, that the United States is a land of Olive Gardenian plenitude, remains with us. It is, ironically, the “conservative” position to believe that you can just keep drilling. Conservative, because it harkens back to America’s early glory, when we were young and the topsoil went down for miles and the forests were infinite and we could throw a perfect spiral right into the end zone from 50 yards out.

But not, you know, “conservative,” in the actual dictionary meaning of the word. Quite the opposite. In the ecological equivalent of racking up a ledger with China to finance wars and tax cuts, we are spending through the Earth’s yearly resources in 8 months — and that date has crept steadily earlier, with Earth Overshoot Day not falling until October in 2000.
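To put a rough number on “a year’s resources in eight months,” you can just invert the date into a how-many-Earths ratio. The real accounting behind Earth Overshoot Day is more involved than this, and the October date below is only a stand-in for “sometime in October 2000.”

```python
from datetime import date

# Back-of-the-envelope only: this just turns an overshoot date into a ratio
# of "Earths' worth of yearly output used" (and ignores leap-year bookkeeping).
def earths_used(overshoot_day: date) -> float:
    day_of_year = overshoot_day.timetuple().tm_yday
    return 365 / day_of_year

print(round(earths_used(date(2014, 8, 19)), 2))   # ~1.58 Earths' worth per year
print(round(earths_used(date(2000, 10, 1)), 2))   # ~1.33, using a stand-in October date
```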

Such numbers should maybe shock us into some sort of truly conservative actions. But so should the data on rising average temperatures. In a country in which college students will be paying off credit card debt into their 70s, I imagine we’ll just slide that eco-debt onto the ledger and buy something nice for ourselves.

Someone Finally Made D&D Nerdy

check out this pair (via technobab.com)

You know what Dungeons & Dragons fans probably get tired of? Just, the persistent air of “cool” that hangs over the game.

You know, when everyone hears D&D and assumes “bad-ass, good-looking loner.” That must get old.

Every Dungeon master tires of being asked, “So, how many varsity sports did you letter in?” Or, “Do you ever get tired of all the sexual intercourse you are having?” You walk into a bar, hoping for a quiet night sharing drinks with friends, then some stranger overhears you discussing mage armor, and now you’re awash in free drinks and romantic propositions.

Listen, people. Just because you imagine adventures in fantasy, swords-and-sorcery realms while rolling 6-sided dice, that does not automatically mean you are a sex god who can’t make it to tonight’s party because you have too many others to go to and because of all your dates.

But finally, someone has done something to nerd up fantasy role-playing games a bit. It’s about time. Blogging for the World Science Festival, Roxanne Palmer writes that casting water-breathing spells can come with the side effect of a basic statistics lesson. That is, to play the game, it helps to understand probability: What are your chances of rolling the 4 you need so that your sword-hit lands on the owlbear, Palmer asks? You find that out fairly simply by counting the outcomes that would produce your owlbear-slaying numeral and dividing by the total number of possible dice outcomes.
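Here’s a minimal sketch of that count-and-divide approach; the targets are ones I made up for illustration, not examples from Palmer’s post.

```python
from itertools import product

# Count-and-divide, exactly as described above: favorable outcomes / total outcomes.
def chance_of_at_least(target: int, dice: int = 1, sides: int = 6) -> float:
    rolls = list(product(range(1, sides + 1), repeat=dice))
    favorable = sum(1 for roll in rolls if sum(roll) >= target)
    return favorable / len(rolls)

print(chance_of_at_least(4))              # need a 4 or better on one d6 -> 0.5
print(chance_of_at_least(20, sides=20))   # natural 20 on a d20 -> 0.05
print(chance_of_at_least(7, dice=2))      # 7 or better on 2d6 -> about 0.58
```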

Basic probability. And so, a good thing. Few sources of authority are so frequently misused and abused as statistics. Plenty of people say, “It’s true because God said it!” but statistics is a god that can even fool the atheistic scientist, if she’s not careful. It is a malleable and confusing god, with impressive intonations like “standard deviation” and “confidence intervals.”

After all, Mark Twain didn’t say “Lies, damned lies and logical fallacies.” (He probably didn’t say the other thing, either, or at least, did not originate it, but that’s beside the point.)

Playing Dungeons & Dragons may not, by itself, keep you from getting hoodwinked by unscrupulous political pollsters or poorly written pop-science. But it’s a start. And maybe next time a tragically handsome D&D-er saunters into the bar, you’ll mention Bell Curves and significance instead of asking to polish his rad motorcycle.

 

Facebook and Celebrity Deaths

 

via dawn.com

I can be uncharitable. Suspicious and cynical. So sometimes I think the worst of people’s motivations. That at least partly explains my reaction to social media when celebrities die.

Instinctively, I don’t like what happens. A big-name celebrity passes, and everyone (or so it feels) needs to make a statement on Facebook or Twitter. And here is the (uncharitable) reason why it bothers me: in many cases, these proclamations seem to be more about the person making them than about the person lost. At their worst, they aim to make this event, this death, into an advertisement for the individual speaking.

“I am a person who cares about an actor/writer/singer like this. That is part of my personality, I want you to know. I am connected to this person in this way, and I want everyone to know that.”

The death of a human being, a stranger, comes to serve the same purpose as listing a particular band in your About Me section.

I, of course, can’t prove that this motivates any of the statements people make. But, let’s be honest, a lot of social media is about this sort of personal branding. When that’s applied to a person’s death, I feel gross about it.

No need to tiptoe around it: The obvious prompt for these thoughts was Robin Williams’ recent death by apparent suicide. I don’t mean to paint every social media posting about this extremely troubling death as such a shallow personal brag — or even to say that any of them were entirely all about that. But when the news of a death like this hits Twitter (because that’s where it hits first these days), and people rush to Facebook-post “I remember watching ‘Mrs. Doubtfire’” or to repost a meme someone made with Robin Williams and a quote — at least part of it feels self-serving. And that grosses me out.

I’ve felt that about other celebrity deaths, as well — and have refrained from adding my own post for that reason (also, just because I did not feel impelled to post anything). But Robin Williams is a special case, I think. He was a huge star. A big Hollywood name for decades who starred in films that were huge parts of many people’s childhoods: “Hook,” “Mrs. Doubtfire.” He could exemplify the archetype of the sad clown — veering in his roles and even within his very expressive eyes from manic, comedic joy to what appeared to be, at least, deep sensitivity, if not deep inner sadness.

So, I get why people were affected. I was affected. And I’ve seen several social media posts and articles (and even some memes) with mature, intelligent and touching reflections on the actor’s life and death. Like I said, Robin Williams was a special case. The social media and journalistic response to his death seems to have continued past that initial outpouring of reaction posts into something more meaningful — even what could be a beneficial discussion of depression, and the mystery of what goes on inside another person’s mind, especially the mind of a suicide.

But those initial Facebook posts, by so many people, still bug me. They will probably bug me again the next time a well-known figure dies. But, now that I’ve plumbed the uncharitable side of me that bristles at such posts, let me be more understanding — or, at least, neutrally inquisitive.

Because, I am curious: Why must everyone (or so many, anyway) RUSH to Facebook and share their proclamation on a celebrity death? Why do people feel the need to make a social media statement?

Perhaps it is just because that is how we communicate now. We don’t transmit, one-to-one, we broadcast. In the past, we would still, of course, have discussed Robin Williams’ death. But we would have done it person-to-person. Talking about it on Facebook may happen simply because that is where we talk about everything.

Particularly the big things that everyone hears about.

Social media has so taken over the public sphere that anything that happens in the public sphere — a big event, a political result, a celebrity death — seems, instinctively to us, to happen within social media. It will be reported and discussed on social media; that is where the conversation will take place. So, to not comment on it in the social-media sphere would be to act as if it had not happened. Facebook is the giant room in which we all stand; who are you to ignore the elephant in the middle of it?

I do get that. And I’m no stranger to responding to that new sense of the world. Lord knows, I overuse Facebook and Twitter, too. Lord knows, I’m prone to make some snarky comment about national news because I feel like Twitter is the place to talk about it. Lord knows, I post jokey statuses to Facebook in an effort to garner “likes,” because it feels good to get likes.

But, again. It’s also how we communicate today. This is, frequently, how I interact with old and current friends: I post something I hope they’ll like. I comment on what they’ve said. And this all happens out in the social media courtyard, everyone shouting their conversations for the world to hear.

There was a time I felt weird about that. Writing on people’s Facebook walls, instead of simply emailing them privately, seemed bizarre at first, back in 2006 (!). I didn’t see why people did it. I suspected it was because the wall-message wasn’t really about the person being messaged, but about how the messenger wanted to appear to everyone else. I thought that was a bit icky.

But now I do it. Because it is what people do. And I’ve come to accept that posting on someone’s wall is a way to both communicate with that person and, yes, get some attention from the crowd. Maybe garner some likes. That both are happening at once doesn’t necessarily diminish either. Besides, it’s what everyone does.

I’m sure when people post their reactions to the deaths of Robin Williams and other beloved celebrities, they are similarly multitasking. As I said above, I doubt anyone’s Facebook post about the news was entirely about personal branding. But social media is a weird form of communication. It does double-duty in that way: every post goes out to everyone. You are broadcasting. And so, it is both about whatever you’re saying to one individual and about projecting yourself in front of the millions. To do so, even if only in part, over the death of a fellow human being, and a stranger — I think I’ll continue to feel strange about that.

Island Paradise Insect Monsters

via wikipedia

Something coconut flavored in a big, bowl-shaped glass. The sweet smell of suntan oil and the sweat of attractive young people. A sunset. Coronas with lime. Jimmy Buffett.

A 2-foot-long stick insect crawling up your thigh.

The thing about tropical paradise is that the insects can be insane. Sky-blue water and lush vegetation typically coincide with sci-fi spiders and beetles that dine on small children (I am slightly exaggerating.) “Heaven-on-Earth” is disgusting, people, iswhatimsaying. (I like to imagine the real Heaven, if it exists, as a place where smiling angels and cherubs are constantly swatting at baseball-sized flies and fleeing Mothra.)

I’ve always lived in cold-weather states (Iowa, New York, Wisconsin). And it’s funny: in the depths of winter, nothing seems as desirable as summer. Going to the lake. Picnics. The Fourth of July. You never imagine the insects. You just don’t think of them. “It will be warm!” your soul breathes. The permafrost will melt, seep back into the cracks in the soil, disappear down the storm drains. You will jog. There will be iced coffee and bike rides. People will get tans on their shins.

And you spend the Fourth of July slapping the back of your neck. “Oh, right. That.” You forget about the streams of ants into your kitchen, the mosquitoes buzzing in your ear just as you doze off, those clouds of little gnats above the sidewalks.

A corresponding thing happens when we frigid Northerners contemplate the tropics, I think. You envision beaches, but you will not be entirely comfortable there. Instead, you will be sticky, and never quite sure if that is sweat dribbling down your leg or a huge bug, like the one you saw in the corner of your cabin and crushed, screaming, with a loafer. Bugs, bugs everywhere. Your week in paradise will turn into a mild meth withdrawal.

But it is another level. The lifelessness of a Wisconsin winter is to the insect-world of a Wisconsin summer as the insect-world of a Wisconsin summer is to the bugpocalypse of the tropics.

Godzilla vs. Mothra

These thoughts on the big-bugged tropics were spurred by the annual Bloomin’ Butterflies exhibit at Madison’s Olbrich Botanical Gardens, where I volunteered this weekend. Dan Capps, a Madison-based amateur entomologist who’s been collecting bugs since 1958, had brought some of his mounted butterflies (and moths). One moth must have had a 6-inch wingspan.

This Mothra came from Mexico. I asked Jeff Capps (the collector’s son, who had dropped by to talk butterfly with visitors) if the big ones usually come from warmer, tropical regions. Because, I mean, that seems right. I know the insects in India were, aside from being ever-present, often big. Huge beetles that just might show up in your rice (OK, it happened once, but it was scarring). I’ve heard tales of tropical rainforest spiders, etc.

Some light Internet research confirms that it’s generally the case, for at least a couple of reasons. Interestingly, mammals tend to grow bigger in colder climates, because the extra body volume helps them retain heat (and, vice versa, a higher surface-area-to-volume ratio helps smaller mammals dissipate heat in surfing country). This is called Bergmann’s rule, after the German biologist who thought it up.
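If the intuition helps, here’s the spherical-cow version of that surface-area-to-volume argument: treat the animal as a sphere and watch the ratio fall as the body gets bigger.

```python
import math

# Bergmann's-rule intuition: for a sphere, surface area / volume = 3 / r,
# so bigger bodies have proportionally less surface to leak heat through.
def surface_to_volume(radius: float) -> float:
    surface = 4 * math.pi * radius ** 2
    volume = (4 / 3) * math.pi * radius ** 3
    return surface / volume

print(surface_to_volume(1.0))   # 3.0 -- small critter, lots of surface per unit of volume
print(surface_to_volume(2.0))   # 1.5 -- double the radius, half the relative surface
```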

Insects, broadly speaking, violate this rule, however. Since the bugs don’t rely on internal heat, the other advantages of warm-weather climates can help them get big. Cold snaps freeze or kill off the bugs, so insects that celebrate a white Christmas have limited growing periods. Warm-weather bugs can keep pushing right on through the holidays, and they have more abundant food, to keep pumping up those thoraxes.

Another “biological rule” also plays a role: the “island rule,” named after Benjamin Island, founder of Island Records. Just kidding, it’s named for the actual masses of land. The isolated island gives small animals more food, with less competition and fewer predators. Since tropical islands tend to, you know, be islands, this rule can give insects a boost. As this post notes, New Zealand, for example, boasts huge-ass beetles, the weta and other big, gross things.

via pinterest.com

Of course, nature rarely makes itself amenable to simplifying blog posts, and some insect species grow bigger up north, while microclimate can matter more than latitude.

Still, this guy lives in Thailand. You can get some decent Thai food in Madison, but you’ll never find that thing crawling in it. Thank you, winter.