The Meming of Life: on secular parenting and other natural wonders

My fundamentalism

I am a fundamentalist.

No, this isn’t one of those glib non-confessional confessions (“If loving my country too much is a crime, then I’m guilty as sin!”). I think fundamentalism, even in the name of something good, is a bad thing.

Fundamentalism is best described as the uncompromising adherence to a set of basic principles. Adhering to principles isn’t the sticking point. It’s the uncompromising part that presents the problem — the unwillingness to allow any other concerns into the discussion lest they distract from a laserlike focus on your guiding light.

My particular fundamentalism is free expression. I’ve become convinced that it is an essential good to be protected at all costs. Some people take this as a license to act badly, and I wish they wouldn’t, but their lack of judgment shouldn’t trammel this good and glorious thing, which in the end, torpedoes be damned, leads to a better future for everyone.

If you doubt that this is a kind of fundamentalism, read that paragraph again, changing “free expression” to “Christianity” or “Islam” or “the love of my country.” First principles are fine, but nothing should ever get exclusive control over our decision-making.

Free expression has defined a large portion of my adult life. My college teaching career was ended as a direct result of a free expression issue. As a result of this and other experiences, I tend to see free speech issues in fairly black-and-white terms.

It was my free-expression fundamentalism that led me last week to support Everybody Draw Muhammad Day. I looked at the issue, checked my free-speech compass, and BOOM, knew what was right.

But critics of EDMD have cited several concerns that they say should have shared the stage with free speech issues in this case, among them:

– That the existing atmosphere of general hostility toward Muslims is only exacerbated by the event;
– That the event represents a powerful majority attacking a less powerful minority;
– That moderate Muslims are unfairly attacked along with the extremists, increasing distance at precisely the time we need to be decreasing it;
– That “diluting the fatwa” is meaningful only in the abstract, and actually increases the chance of harm coming to those who most prominently depicted Muhammad;
– That many who participated took the opportunity to create intentionally obscene or demeaning images of Muhammad, and that this was inevitable;
– and more.

Not all of the arguments are equally good, and some are irrelevant (including at least one of those above, in my humble). The canard that the event represented “offense for the sake of offense” is the weakest of all, an assertion that really means, “I haven’t taken the time to figure out your point, so I’ll declare it nonexistent.” I think just about any argument that includes the avoidance of “offense” as its driving principle is hollow and misguided. Finally, I am still troubled by the assertion that those who participated out of ignorant or hateful motives irreparably taint those who did not.

But I’m also becoming more pragmatic in my dotage, and outcomes matter as much to me as abstract principles. (Those of you who have never thrown your entire family under the wheels of your principles may not know where I’m coming from, and that’s okay. My 30-year-old self agrees with you.)

In addition to some thoughtful opinion pieces, several people have offered convincing analogies. “There are campaigns to remove Mark Twain’s books from school libraries [because of the use of the word ‘nigger’],” said FB friend Bruce Ayati. “Would a campaign to use that racial slur, only to prove you can, be the right thing to do?” Not bad.

“Given the position of atheists in this country,” he continued, “it’s not hard to imagine something similar happening to us, where an Angry Atheist somewhere does something terrible, and we are all subjected to undeserved hostility in the name of ‘standing up’ to us and the supposed threat we pose to this country.”

Damn, that’s a good one. Damn.

The best of these and other arguments, offered by smart and articulate people, slapped me out of my hypnotic free-expression trance long enough to first confuse the issue for me, then to lead me to a change of mind. I’m now of the opinion that EDMD was not the right thing to do, and will in the end have done more harm than good. I still defend the right to do it, of course, and especially support those who are working so hard to do it right.

Considered in glorious isolation, the free-expression question was always open and shut. But nothing in human life exists in isolation, and a more thorough consideration of the context has led me to change my position. Not with 100 percent certainty. Anyone who registers complete certainty in a case like this is hereby invited to have a very nice day indeed, and my, you’re looking fit.

And as before, and as always, I may be wrong. Most important, I continue to offer my strong support to those who choose to participate. How can I not, with articulate and thoughtful supporters like this?

[Thanks as well to commenters nonplus and yinyang and my old friend Scott M. for their part in slapping me awake.]

Anyone else have a principle so beloved that it sometimes blinds you to other considerations?

Mama don’t take my heike crabs awaaaay

Ohhh, the pain. The pain. One of my cherished beliefs is under attack, and I’m doing what we monkeys do when that’s the case. Resisting. Bargaining. Denying.

There are two illustrations of selection — one natural, the other artificial — that I’ve always adored for their explanatory power and elegance.

One is the peppered moth. Peppered moths are light grey with dots of black and brown all over — perfect camouflage for the local light-colored tree bark in 18th-century England. A few were completely black, but only a few, because they were easy for birds to spot and eat.

In the 19th century, factory smoke blackened the tree bark in the moths’ range. The black moths were now perfectly camouflaged and quickly became the favored phenotype, while the light grey became visibly delicious. The proportions switched — almost all of the moths in the forest were now black and only a few light grey.
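That frequency flip lends itself to a toy simulation. The sketch below tracks the proportion of black moths under bird predation; the survival rates are invented for illustration, not measured field values:

```python
def black_frequency(p0, dark_bark, generations=20):
    """Share of black moths after some generations of bird predation.

    Survival rates are invented for illustration: well-camouflaged
    moths survive at 0.9, conspicuous ones at 0.3.
    """
    p = p0
    for _ in range(generations):
        s_black = 0.9 if dark_bark else 0.3
        s_grey = 0.3 if dark_bark else 0.9
        black, grey = p * s_black, (1 - p) * s_grey
        p = black / (black + grey)  # renormalize after differential survival
    return p

# Pre-industrial light bark: black moths stay vanishingly rare.
print(black_frequency(0.02, dark_bark=False))
# Soot-darkened bark: the same rare variant sweeps to near fixation.
print(black_frequency(0.02, dark_bark=True))
```

Nothing about the moths themselves changes between the two runs; only the environment does, and twenty generations of a threefold survival advantage are enough to swap which phenotype dominates.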

Experiments conducted in the mid-20th century confirmed the hypothesis. Errors subsequently discovered in those experiments led creationists to trumpet the supposed dethroning of the peppered moth as an illustration of natural selection. But later, better-designed experiments re-confirmed the original hypothesis to the satisfaction of the relevant experts.

In the book Moths (2002), Cambridge biologist Michael Majerus sums up the consensus in the field: “I believe that, without exception, it is our view that the case of melanism in the Peppered moth still stands as one of the best examples of evolution, by natural selection, in action.”

Sure enough, several other experts in both moths and industrial melanism have also written to reaffirm the peppered moth story as a robust exemplar of natural selection writ small.


But there’s another selection story I adore — and that turn of phrase tells you all you need to know about my vulnerability on this one. It’s the story of the heikegani, a crab found in the waters of the Inland Sea in Japan near Dan-no-ura.

The sea was the site of a major battle in 1185 between Heike and Genji warriors. The Heike were trounced, and the survivors are said to have thrown themselves into the sea in disgrace.

In telling the story of the struggle, an epic called the Heike Monogatari refers to a species of crab in the Inland Sea as reincarnations of the Heike warriors defeated at the Battle of Dan-no-ura. And no wonder — the shell of the crab includes markings that evoke a scowling samurai warrior. And I don’t mean “evoke” like Ursa Major evokes a bear (psst, it doesn’t). I mean the crab looks like a scowling samurai warrior.

In the original Cosmos series, Carl Sagan offered the heike face as an example of artificial selection.1 Fishermen in the area have known the legend for eight centuries. During that time, if the nets pulled up a crab with markings resembling a human face, even mildly so, the fishermen — understandably loath to disturb the spirit of the samurai — would throw it back. Crabs with less facelike markings would end up dipped in butter. The more facelike, the more likely it would be tossed back in with a girlish scream, free once more to fornicate with others of its uncanny ilk.

Eight hundred years of this and you’ll find yourself looking at some pretty scream-worthy samurai crabs.

What’s most awe-striking about this is the fact that unlike other examples of artificial selection — dog breeding for example — the selective pressure exerted by the fisherfolk is wholly unintentional, but still works. It combines random variation and decidedly nonrandom selection in a way that mimics natural selection incredibly well.
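That unintentional pressure is easy to caricature in code. In this sketch (every parameter is invented for illustration), each crab carries a “facelikeness” score between 0 and 1; the more facelike, the better its odds of being thrown back rather than eaten, and offspring inherit a random survivor’s score plus a little shell variation:

```python
import random

def evolve_crabs(generations=800, n=500, mutation_sd=0.03, seed=1):
    """Toy model of the heikegani story: unintentional selection for
    samurai-faced shells. All parameters are invented for illustration."""
    rng = random.Random(seed)
    # Start with barely facelike shells (scores between 0 and 0.2).
    crabs = [rng.uniform(0.0, 0.2) for _ in range(n)]
    for _ in range(generations):
        # A crab escapes the pot with odds that rise with its score:
        # the more samurai-like, the more likely it gets thrown back.
        survivors = [c for c in crabs if rng.random() < 0.2 + 0.7 * c]
        if not survivors:  # guard: toy populations can crash
            survivors = crabs
        # Next generation: each offspring copies a random survivor's
        # shell pattern plus small random variation, clamped to [0, 1].
        crabs = [min(1.0, max(0.0, rng.choice(survivors) + rng.gauss(0.0, mutation_sd)))
                 for _ in range(n)]
    return sum(crabs) / n

print(f"mean facelikeness after selection: {evolve_crabs():.2f}")
```

No fisherman in the model intends to breed face-shelled crabs; the bias in who gets eaten does all the work, which is exactly the point of the story.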

I happen at the moment to be putting the finishing touches on a new seminar (this one based on Raising Freethinkers) to be offered for the first time at UUC Atlanta on January 11. While polishing a section on helping kids understand evolution, I remembered that I didn’t just have moths to work with, I also had crabs. Ahem.

But in Googling for images, I came across the last thing I ever wanted to see: a sturdy, possibly even convincing attempt by a reputable scientist to debunk the hypothesis, claiming that the crabs are seldom kept and eaten regardless of markings, and that nearly identical markings are found on fossil crabs. And some other stuff.

Now the only worthy response to this news is Oooo, truth beckons, let’s follow this lively gent wherever and to whatever abysses he shall lead, lest we miss the chance to glimpse our precious reality more clearly!

Instead, I recoiled. Nooooooo, I thought. Bad man. Stranger danger.

I may have mentioned that I love the story, love the elegance of the hypothesis. I want it to be true. It is too beautiful to not be true.

I KNOW, I KNOW. Don’t lecture me, people. This is confessional literature here. These are the moments that make me empathize with religious folks who are disinclined to lift the veil on their own favorite bedtime stories. Once in a while, I feel their pain.
1 Though Sagan got it from a 1952 article by biologist Julian Huxley.

Postscript: When Erin asked for “something new” as a bedtime story last night, I told her the tale of the heikegani, from battle to Cosmos. But when I reached the hypothesis, I did the right thing: “Some scientists think it looks like a face because…” The caveat made it no less cool to her.

can death give birth to wonder? (revised)


[NOTE: In preparing the following blog entry, I fell prey to a classic critical thinking error that goes by several names: “selective reporting,” “confirmation bias,” and “being an idiot.” Though the first several paragraphs are impeccably sound, the section on the Woodward paper is, unfortunately, complete rubbish. I say ‘unfortunately’ because it would have been fascinating if true. Ahh, but that’s how we monkeys always step in it, isn’t it now? I’ll leave the post up as a monument to my shortcomings and prepare another post about the specific way in which I misled myself.]


You’ve probably seen the studies confirming the low frequency of religious belief among scientists, and the fact that the most eminent scientists are the least likely to believe in a personal God. Very interesting, and not surprising. Uncertainty would have been profoundly maladaptive for most of our species’ history. The religious impulse is an understandable response to the human need to know, or at least to feel that you do. Once you find a (much) better way to achieve confidence in your conclusions, one of the main incentives for religiosity loses its appeal.

Psychologist James Leuba was apparently the first to ask scientists the belief question in a controlled context. In 1914, Leuba surveyed 1,000 randomly-selected scientists and found that 58 percent expressed disbelief in the existence of God. Among the 400 “greater” scientists in his sample, the figure was around 70 percent.1 Leuba repeated his survey in 1934 and found that the percentages had increased, with 67 percent of scientists overall and 85 percent of the “eminent” group expressing religious disbelief.2

The Larson and Witham study of 1998 returned to the “eminent” group, surveying members of the National Academy of Sciences and finding religious disbelief at 93 percent.3 All sorts of interesting stats within that study: NAS mathematicians are the most likely to believe (about 15 percent), while biologists are the least likely (5.5 percent).

[Here’s where the nonsense begins. Avert your eyes.]

But I recently came across a related statistic about scientists that, given my own background, ranks as the single most thought-provoking stat I have ever seen.

As I’ve mentioned before, my dad died when I was thirteen. It was, and continues to be, the defining event in my life, the beginning of my deepest and most honest thinking about the world and my place in it. My grief was instantly matched by a profound sense of wonder and a consuming curiosity. It was the start of the intensive wondering and questioning that led me (among other things) to reject religious answers on the way to real ones.

Now I learn that the loss of a parent shows a robust correlation to an interest in science. [Not.] A study by behavioral scientist William Woodward was published in the July 1974 issue of Science Studies.4 The title, “Scientific Genius and Loss of a Parent,” hints at the statistic that caught my attention. About 5 percent of Americans lose a parent before the age of 18. Among eminent scientists, however, that number is higher. Much higher.

According to the study, 39.6 percent of top scientists experienced the death of a parent while growing up—eight times the average.

Let’s hope my kids can achieve the same thirst for knowledge some other way.

Many parents see the contemplation of death as a singular horror, something from which their children should be protected. If nothing else, this statistic suggests that an early encounter with the most profound fact of our existence can inspire a revolution in thought, a whole new orientation to the world — and perhaps a completely different path through it.

[More later.]


1 Leuba, J. H. The Belief in God and Immortality: A Psychological, Anthropological and Statistical Study (Sherman, French & Co., Boston, 1916).
2 Leuba, J. H. Harper’s Magazine 169, 291-300 (1934).
3 Larson, E. J. & Witham, L. Nature 394, 313 (1998).
4 Woodward, W. R. “Scientific Genius and Loss of a Parent,” Science Studies 4(3), 265-277 (1974).