
Santa’s magic, children’s wisdom, and inequality

Eric Kaplan, channeling Francis Pharcellus Church, writes in favor of Santa Claus in the New York Times. The Church argument, written in 1897 and barely updated here, is that (a) you can’t prove there is no Santa, so agnosticism is the strongest possible objection, and (b) Santa enriches our lives and promotes non-rationalized gift-giving, “so we might as well believe in him.” That’s the substance of it. It’s a very common argument, identical to one employed against atheists in favor of belief in God, but more charming and whimsical when directed at killjoy Santa-deniers.

All harmless fun and existential comfort-food. But we have two problems that the Santa situation may exacerbate. First is science denial. And second is inequality. So, consider this an attempted joyicide.


From Pew Research comes this Christmas news:

In total, 65% of U.S. adults believe that all of these aspects of the Christmas story – the virgin birth, the journey of the magi, the angel’s announcement to the shepherds and the manger story – reflect events that actually happened.

Here are the details:


So the Santa situation is not an isolated question. We’re talking about a population with a very strong tendency to express literal belief in fantastical accounts. This Christmas story is the soft leading edge of a more hardcore Christian fundamentalism. For the past 20 years, the General Social Survey (GSS) has found that a third of American adults agree with the statement, “The Bible is the actual word of God and is to be taken literally, word for word,” versus two other options: “The Bible is the inspired word of God but not everything in it should be taken literally, word for word”; and, “The Bible is an ancient book of fables, legends, history, and moral precepts recorded by men.” Those “actual word of God” people are less numerous than the virgin-birth believers, but they’re related.

Using the GSS I analyzed the attitudes of the “actual word of God” people (my Stata data and work files are here). Controlling for their sex, age, race, education, political ideology, and the year of the survey, they are much more likely than the rest of the population to:

  • Agree that “We trust too much in science and not enough in religious faith”
  • Oppose marriage rights for homosexuals
  • Agree that “people worry too much about human progress harming the environment”
  • Agree that “It is much better for everyone involved if the man is the achiever outside the home and the woman takes care of the home and family”
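The published analysis behind those comparisons is a regression with the controls listed above (the Stata files are linked in the text). Purely as a toy illustration of what “much more likely” can mean, here is an odds-ratio calculation from hypothetical cell counts — these numbers are made up for the example, not GSS estimates:

```python
# Odds ratio from a hypothetical 2x2 table. The real analysis is a
# multivariate regression; this just shows the raw-association idea.
def odds_ratio(a, b, c, d):
    """OR for a 2x2 table: [[group1 agrees=a, disagrees=b],
                            [group2 agrees=c, disagrees=d]]."""
    return (a / b) / (c / d)

# Suppose (hypothetically) 60 of 100 literalists but only 25 of 100
# others agree that "we trust too much in science":
print(round(odds_ratio(60, 40, 25, 75), 2))  # 4.5
```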

This isn’t the direction I’d like to push our culture. Of course, teaching children to believe in Santa doesn’t necessarily create “actual word of God” fundamentalists. But I expect it’s one risk factor.

Children’s ways of knowing

A little reading led me to this interesting review of the research on young children’s skepticism and credulity, by Woolley and Ghossainy (most of the citations below come via their review).

It goes back to Margaret Mead’s early work. In the psychological version of sociology’s reading history sideways, Mead in 1932 reported on the notion that young children not only know less, but know differently, than adults, in a way that parallels social evolution. Children were thought to be “more closely related to the thought of the savage than to the thought of the civilized man,” with animism in “primitive” societies being similar to the spontaneous thought of young children. This goes along with the idea of believing in Santa as indicative of a state of innocence.

In pursuit of empirical confirmation of the universality of childhood, Mead investigated the Manus tribe in Melanesia, who were pagans, looking for magical thinking in children: “animistic premise, anthropomorphic interpretation and faulty logic.”

Instead, she found “no evidence of spontaneous animistic thought in the uncontrolled sayings or games” over five months of continuous observation of a few dozen children. And while adults in the community attributed mysterious or random events to spirits and ghosts, children never did:

I found no instance of a child’s personalizing a dog or a fish or a bird, of his personalizing the sun, the moon, the wind or stars. I found no evidence of a child’s attributing chance events, such as the drifting away of a canoe, the loss of an object, an unexplained noise, a sudden gust of wind, a strange deep-sea turtle, a falling seed from a tree, etc., to supernaturalistic causes.

On the other hand, adults blamed spirits for hurricanes hitting the houses of people who behave badly, believed statues can talk, thought lost objects had been stolen by spirits, and said people who are insane are possessed by spirits. The grown men all thought they had personal ghosts looking out for them – with whom they communicated – but the children dismissed the reality of the ghosts that were assigned to them. They didn’t play ghost games.

Does this mean magical thinking is not inherent to childhood? Mead wrote:

The Manus child is less spontaneously animistic and less traditionally animistic than is the Manus adult [“traditionally” here referring to the adoption of ritual superstitious behavior]. This result is a direct contradiction of findings in our own society, in which the child has been found to be more animistic, in both traditional and spontaneous fashions, than are his elders. When such a reversal is found in two contrasting societies, the explanation must be sought in terms of the culture; a purely psychological explanation is inadequate.

Maybe people have the natural capacity for both animistic and realistic thinking, and societies differ in which trait they nurture and develop through children’s education and socialization. Mead speculated that the pattern she found had to do with the self-sufficiency required of Manus children. A Manus child must…

…make correct physical adjustments to his environment, so that his entire attention is focused upon cause and effect relationships, the neglect of which would result in immediate disaster. … Manus children are taught the properties of fire and water, taught to estimate distance, to allow for illusion when objects are seen under water, to allow for obstacles and judge possible clearage for canoes, etc., at the age of two or three.

Plus, perhaps unlike in industrialized society, their simple technology is understandable to children without the invocation of magic. And she observed that parents didn’t tell the children imaginary stories, myths, and legends.

I should note here that I’m not saying we have to choose between religious fundamentalism and a society without art and literature. The question is about believing things that aren’t true, and can’t be true. I’d like to think we can cultivate imagination without launching people down the path of blind credulity.

Modern credulity

For evidence that culture produces credulity, consider the results of a study that showed most four-year-old children understood that Old Testament stories are not factual. Six-year-olds, however, tended to believe the stories were factual, if their impossible events were attributed to God rather than rewritten in secular terms (e.g., “Matthew and the Green Sea” instead of “Moses and the Red Sea”). Why? Belief in supernatural or superstitious things, contrary to what you might assume, requires a higher level of cognitive sophistication than does disbelief, which is why five-year-olds are more likely to believe in fairies than three-year-olds. These studies suggest children have to be taught to believe in magic. (Adults use persuasion to do that, but teaching with rewards – like presents under a tree or money under a pillow – is of course more effective.)

Richard Dawkins has speculated that religion spreads so easily because humans have an adaptive tendency from childhood to believe adults rather than wait for direct evidence of dangers to accumulate (e.g., “snakes are dangerous”). That is, credulity is adaptive for humans. But Woolley and Ghossainy review mounting evidence for young children’s skepticism as well as credulity. That, along with the obvious survival disadvantages associated with believing everything you’re told, doesn’t support Dawkins’ story.

Children can know things either from direct observation or experience, or from being taught. So they can know dinosaurs are real if they believe books and teachers and museums, even if they can’t observe them living (true reality detection). And they can know that Santa Claus and imaginary friends are not real if they believe either authorities or their own senses (true baloney detection). Similarly, children also have two kinds of reality-assessment errors: false positive and false negative. Believing in Santa Claus is false positive. Refusing to believe in dinosaurs is false negative. In this figure, adapted from Woolley and Ghossainy, true judgment is in green, errors are in red.


We know a lot about kids’ credulity (Santa Claus, tooth fairy, etc.). But, Woolley and Ghossainy write, their skepticism has been neglected:

It is perplexing that a young child could believe that his or her knowledge of the world is complete enough to deny the existence of anything new. It would seem that young children would understand that there are many things that exist in the real world that they have yet to experience. As intuitive as this seems, it appears not to be the case. From this perspective, development regarding beliefs about reality involves, in addition to decreased reliance on knowledge and experience, increased awareness of one’s own knowledge and its limitations for assessing reality status. This realization that one’s own knowledge is limited gradually inspires a waning reliance on it alone for making reality status decisions and a concomitant increase in the use of a wider range of strategies for assessing reality status, including, for example, seeking more information, assessing contextual cues, and evaluating the quality of the new information.

The “realization that one’s own knowledge is limited” is a vital development, ultimately necessary for being able to tell fact from fiction. But, sadly, it need not lead to real understanding – under some conditions, such as, apparently, the USA today, it often leads instead to reliance on misguided or dishonest authorities who compete with science to fill the void beyond what we can directly observe or deduce. Believing in Santa because we can’t disprove his existence is a developmental dead end, a backward-looking reliance on authority for determining truth. But so is failure to believe in germs or vaccines or evolution just because we can’t see them working.

We have to learn how to inhabit the green boxes without giving up our love for things imaginary, and that seems impossible without education in both science and art.

Rationalizing gifts

What is the essence of Santa, anyway? In Kaplan’s NYT essay it’s all about non-rationalized giving — for the sake of giving. The latest craze in Santa culture, however, says otherwise: Elf on the Shelf. According to Google Trends, interest in this concept has increased 100-fold since 2008. In case you’ve missed it, the idea is to put a cute little elf somewhere on a shelf in the house. You tell your kids it’s watching them, and that every night it goes back to the North Pole to report to Santa on their nice/naughty ratio. While the kids are sleeping, you move it to another shelf in the house, and the kids delight in finding it again each morning.

Foucault is not amused. Consider the Elf on the Shelf aftermarket accessories, like these handy warning labels, which threaten children with “no toys” if they aren’t on their “best behavior” from now on:


So is this non-rationalized gift-giving? Quite the opposite. In fact, rather than cultivating a whimsical love of magic, this is closer to a dystopian fantasy in which the conjured enforcers of arbitrary moral codes leap out of their fictional realm to impose harsh consequences in the real life of innocent children.


What does all this mean for inequality? My developmental question is, what is the relationship between belief in Santa and social class awareness over the early life course? In other words, how long after kids realize there is class inequality do they go on believing in Santa? Where do these curves cross?


Beyond worrying about how Santa rewards or punishes them individually, if children are to believe that Christmas gifts are doled out according to moral merit, then what are they to make of the obvious fact that rich kids get more than poor kids? Rich or poor, the message seems the same: children deserve what they get. Of course, I’m not the first to think of this:



I can’t demonstrate that believing in Santa causes children to believe that economic inequality is justified by character differences between social classes. Or that Santa belief undermines future openness to science and logic. But those are hypotheses.

Between the anti-science epidemic and the pervasive assumption that poor people deserve what they get, this whole Santa enterprise seems risky. Would it be so bad, so destructive to the wonder that is childhood, if instead of attributing gifts to supernatural beings we instead told children that we just buy them gifts because we love them unconditionally and want them — and all other children — to be happy?


Filed under Uncategorized

Mildly altruistic blog post rooted in the brain

Brain science is super interesting and important, of course. In fact, “the brain” is gaining on “the mind” as a topic of our brain-mind’s fixation (Google ngrams):


I take a tiny share of responsibility for this trend, as during one of my journalism careers I wrote a 1995 news article about “brain-based learning” for a newsletter sent to more than 100,000 K-12 educators.

On the plus side, in my old article I devoted considerable attention to the issue of brain plasticity, or how brains change in response to time and experience. That plasticity perspective was conspicuously absent from Michelle Trudeau’s NPR story this morning about the brains of extreme altruists. The story was based on a paywalled PNAS article which reported that a nonrandom group of 19 anonymous kidney donors had bigger right amygdalas, and heightened emotional response to pictures of faces, than a nonrandom group of 20 controls. The authors conclude that “these findings suggest extraordinary altruism [is] supported by neural mechanisms that underlie social and emotional responsiveness.”

Or, maybe the cumulative experiences of adults who turn out to be extraordinary altruists change their brains. (Or even, maybe the experience of giving a kidney itself affects people’s brains.) It appears that amygdala size changes within people over time, and that it is correlated with the size of people’s social networks. So, the causal sequencing here is something to consider.

What if, as they imply, something about the way people are born makes them more or less likely to be an extraordinary altruist versus a psychopath (a group this researcher previously studied)? How much of the real-life variation in altruism might such a genetic or anatomical influence account for? If that proportion is low, then this is a fascinating evolutionary question with little social implication — worth studying, but not worth writing about with headlines like, “Good Deeds May Be Rooted In The Brain.”

The PNAS authors conclude:

It should be emphasized, however, that the mechanisms we have identified are unlikely to represent a complete explanation for altruistic kidney donation, given the extreme rarity of this phenomenon, and given the overlapping distributions we observed for the variables we measured. Acts of extraordinary altruism are likely to reflect a combination of the neurocognitive characteristics identified here, along with other individual- or community-level variables.

That seems like a safe bet, given this distribution of amygdala size across the two groups:


In short, we should consider the possibility, however slight, that altruism also has social causes. Disciplinary culture, I suppose, but I’ve never finished an article with a caution to readers that I may not have completely explained the phenomenon under study.


Filed under Research reports

Nicholas Wade followup, deeper dive edition

I’m very happy with the editing and fact-checking they did at Boston Review for my review of Nicholas Wade’s book, A Troublesome Inheritance: Genes, Race and Human History, and I don’t want to undermine their work (thanks to managing editor Simon Waxman and associate web editor Nausicaa Renner). If you only have time to read 4,000 words on it, their version is what you should read. It’s up here for free.

But in the thousands of words that ended up on the cutting room floor, there were a few ideas I’d like to post here, for the very interested reader.

Photo from Flickr Creative Commons by epSos.de


Human bones

A number of critics have said that Wade’s early chapters are good, and the book only gets crazy-racist in the second half when he starts attributing social behavior to races and tracing global economic disparities to evolution by natural selection. But I did want to stress that he’s got plenty wrong in the early part of the book as well. In particular, I highlighted the question, why did human bones get thinner in the millennia before humans settled down? This isn’t something we worry over much, but I think it’s an important clue to his biases and assumptions. From the published review:

To establish that genes determine social behavior, Wade looks to ancient history, when humans first settled in agricultural communities. “Most likely a shift in social behavior was required,” he writes, “a genetic change that reduced the level of aggressivity common in hunter-gatherer groups.” Of course, many elements were involved—climate change and geography, population pressure, the presence of various plants and animals, advances in tools and weapons, and human biological evolution—but there is no evidence that a behavioral genetic change was required.

I actually spent a fascinating few hours reading the scientific literature on evolution and bone structure, and saw no mention of the reduction in human aggressive behavior as a cause of human bones becoming weaker. To elaborate, Wade thinks natural selection gave people genes for thinner bones because strong bones became less necessary for survival as people fought each other less. He thinks genetic change in behavior led to genetic change in bones. Please correct me if I’m wrong, but I don’t see any literature at all to back this up (Wade doesn’t cite any).

In fact, if I read it right, we might have thinner bones today than people did 50,000 years ago even though our bone genetics haven’t changed much, as a result of diet and lifestyle changes alone. How is that possible? When the bones of young people bear less weight they don’t grow as thick when they’re adults. This is the issue of tool use and the declining “habitual loads” on human limbs. It might also extend to our skulls because we’re not grinding pre-agricultural superfoods with our teeth all day long. Biological anthropologist Christopher Ruff writes: “In a few years, the strength of a person’s bone structure can change as much as the total average change over the past 2 million years of human evolution.” He cites classic research showing the bones of tennis players’ arms are thicker on the side they hold the racket. There is an alternative view that genetic adaptation did drive changes in bone size, having to do with climate change (here is some of that debate). But nothing about aggression I could find.

This point about the bones not-so-subtly underlies his later argument about Africa’s poverty, which he attributes in part to the genetic propensity toward violence among its people. Rather than aggression being an asset as society evolved, Wade speculates that, in the centuries leading up to the first settlements, “the most bellicose members of the society were perhaps killed or ostracized” (again, no evidence). Cue footage of UN peacekeepers landing in Africa.

Anyway, it’s potentially an important lesson in the malleability of human bodies through life experience rather than (only) through genetic change. The implication is that each generation may still be genetically ready to have thick bones again, but we just keep lucking out and being born into societies with tools and soft foods, so we don’t need to grow them. I find that amazing. I don’t want to push it too far, but I imagine that a lot of behavioral things are like that, too. Evolution has brought us to the point where we have vast potential to grow in different ways, and huge differences between people can emerge as a result of our life experiences.

More on the “warrior gene”

In the review I included some discussion of the MAO-A studies:

Wade devotes considerable attention to MAO-A, the gene that encodes the enzyme monoamine oxidase A, which is related to aggression. He singles out studies showing that a rare version of the gene is associated with violence in U.S. male adolescents. Out of 1,200 young men surveyed in the National Longitudinal Study of Adolescent Health, eleven particularly violent young men carried the 2R version of MAO-A, subsequently known as the “warrior gene.” Nine of those eleven were African American, comprising 5 percent of the black male adolescents in the study.

Sometimes in genetics there is some gene or coding that produces some measurable effect, and that’s how most people seem to think about genetics most of the time – there is “a gene for” something. In the days before today’s genome-wide association (GWA) studies, before scientists had the means to investigate hundreds of thousands of genetic markers at a time, they often looked for effects of such “candidate” genes. This approach was valuable, especially when the role of specific genes was known (as in the case of the BRCA1 gene, associated with higher risk of breast cancer). However, with most diseases, and even more so with behavior, which is presumed to be more complicated than single-gene mechanisms, candidate gene studies were (are) often fishing expeditions, with a high risk of false-positive results, amplified by selective publication of positive findings. It is quite possible that’s at least part of what happened with MAO-A and aggression.
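To see why that kind of fishing invites false positives, here is a toy simulation: we test 100 hypothetical markers that, by construction, have no real effect on a behavioral score, and count how many come out “statistically significant” anyway. All numbers are made up for illustration, and the crude normal-approximation test is mine, not what the original studies used:

```python
# Toy simulation of the candidate-gene "fishing expedition" problem.
# Both groups are drawn from the SAME distribution, so every
# "significant" result is a false positive.
import math
import random
import statistics

random.seed(1)

def welch_z_p(x, y):
    """Two-sided p-value from a rough normal-approximation two-sample test."""
    se = math.sqrt(statistics.variance(x) / len(x) +
                   statistics.variance(y) / len(y))
    z = abs(statistics.mean(x) - statistics.mean(y)) / se
    return 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))

n_sample, n_carriers, trials = 1200, 11, 100
hits = 0
for _ in range(trials):
    carriers = [random.gauss(0, 1) for _ in range(n_carriers)]
    others = [random.gauss(0, 1) for _ in range(n_sample - n_carriers)]
    if welch_z_p(carriers, others) < 0.05:
        hits += 1
# Typically a handful of markers clear p < .05 despite no true effect,
# and selective publication would report only those.
print(hits, "of", trials, "null markers look 'significant'")
```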

Most studies of MAO-A have been gene-environment interaction studies, where some version of MAO-A has a statistical association with a behavior only in the presence of a particular social factor, such as a history of child abuse (e.g., this one). This kind of study is tricky and offers a lot of opportunity to fish around for significant effects (which I’m specifically not accusing any particular person of doing). The MAO-A 2R studies he cites weren’t interaction studies. But a couple of cautions are important. First, that 2R version of MAO-A is very rare, and the two studies Wade cites about it (here and here) both used the same sample from Add Health – 11 boys with the variant. Two studies don’t mean two independent results. You could never get a drug approved based on that (I hope). Second, as far as I can tell there was no strong reason a priori to suspect that this 2R variant would be especially associated with violence. So that’s a caution. I have to say, as I did in the review, that it may be correct. But the evidence is not there (and you shouldn’t say “not there yet,” either). Those two studies are the entire evidentiary basis for Wade saying that genes that shape social behavior vary by race (“one behavioral gene … known to vary between races”). I didn’t find any other studies showing that MAO-A 2R varies by race (though maybe there are some).


Yao Ming and Ye Li


Modern evolution

Does natural selection still apply to humans? Of course. But I can’t see how it works very efficiently in modern societies, because our demography seems like a poor launching pad for genetic revolutions. Most threats to our survival now occur after we’ve had the opportunity to have children. And it’s getting worse (which means better). The decline in child mortality and the extension of life expectancy beyond the childbearing years means that relatively few people are left out of the breeding community. That’s how I was raised to understand natural selection: individuals with stronger, better traits breed more than those with weaker, worse traits. In the U.S. today, 97.8% of females born live to age 40, and 85% of those have a birth, so 83% of females born become biological mothers. And a good part of modern childlessness is voluntary, rather than the consequence of a genetic weakness. Even as recently as 1900, in contrast, Census data and mortality statistics show that only 53% of females born lived to be age 40 and had a surviving child. So I don’t know how evolution is working today, but except for really bad health conditions I’m skeptical.
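The arithmetic in that paragraph can be checked directly (the shares are the ones quoted in the text):

```python
# Multiplying through the demographic shares quoted above.
survive_to_40 = 0.978          # U.S. females born who live to age 40
births_among_survivors = 0.85  # share of those who have a birth
print(round(survive_to_40 * births_among_survivors, 2))  # 0.83

# Compare 1900: only ~53% of females born lived to 40 AND had a surviving
# child, so the share "selected out" fell from ~47% to ~17%.
print(round(1 - 0.53, 2), round(1 - 0.83, 2))  # 0.47 0.17
```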

Of course, we have selective breeding producing subpopulations that have concentrations of genetic traits. Yao Ming’s parents were both basketball players, and his wife is 6′ 3″. So they’re on their way to producing a subpopulation of really tall Chinese people. But most social divides we have are not like that — they aren’t based on genetic traits. So I don’t see that being very effective either. To take Wade’s example of Jews and math ability (a chapter I didn’t write about because I was already 3,000 words long), you would need to have Jews not only have good math genes, and only reproduce with each other, but they’d also have to cast out those kids who were relatively bad at math and put the boys and girls who were relatively good at math together. That could happen, but it would be inefficient and very slow, and next thing you know some historical event or trend would come along and mess it all up.

Even the much-discussed increasing tendency of college graduates to marry each other — which gives us about three-quarters of couples today being on the same side of the college/non-college divide — is just sloppy and slow by selective-breeding standards. Maybe it could produce a race of people who like baby joggers and The Economist, but given the low levels of isolation between groups and the length of human generations I just think any progress in that direction would be so slow as to be swamped by other processes pushing in all different directions.


Wade used Australia to argue against Jared Diamond, whose account of world history, Guns, Germs and Steel, dismisses genetic evolution as an explanation, making him the villain in Wade’s story. How is it, Wade wonders, that Paleolithic Age native Australians were unable to build a modern economy, but Europeans could waltz onto the continent and be successful so easily? He writes:

If in the same environment … one population can operate a highly productive economy and another cannot, surely it cannot be the environment that is decisive … but rather some critical difference in the nature of the two people and their societies.

That’s one of the worst head-scratchers in the book. Does Wade really think that Europeans just dropped in to Australia on an equal footing with the local population, and had to figure out how to thrive there on their raw genetic merits, proving their superiority by their relative success? It can’t be that “the nature of the two people and their societies” means the boats, weapons, technology and modern state social organization the Europeans possessed, because then he has made Diamond’s point. So the “nature” he’s referring to must be genetics. To the reader who has a passing familiarity with modern social science, this is just jarring.

Does cancer genetics help?

To help show the dead-end of Wade’s very mechanical view of genetic influence, I drew out an example from cancer genetics (with a little help from my brother-in-law, Peter Kraft, who is not responsible for this interpretation).

What if we found that genetic factors contributed to social behavior in any of the ways Wade imagines? Speculative as that is at present, it is of course a possibility. Most people are concerned about the implications for genocide and eugenics, for good reason. But even if our scientific motives were pure, the functional utility of such information would be questionable.

Consider a comparison to the much better understood genetics of disease. Take prostate cancer, which is known to have a family history component. Genome-wide association studies have identified some genetic markers that are significantly associated with the risk of developing prostate cancer, such that a genetic test can identify which men are at highest risk. However, a review of the statistical evidence in the journal Nature Reviews Genetics pointed out that, even among the high-risk group, only about 1.1% of men would come down with prostate cancer in a five-year period. That’s much higher than the 0.7% expected in the general population, but what do you do with that information? Invasive procedures, medications, or preventative surgery on millions of men would not be worth it in order to prevent a small number of cases of prostate cancer – the side effects alone would swamp the benefits. On the other hand, we don’t need any genetic tests to tell smokers to quit, or urge people to eat better and exercise.

This is just one example. Risk factors for this and other diseases are the subject of intense research, and there are actionable results out there, too. But I suspect that genetic influences on social behavior, if discovered, would present an extreme version of this problem: slight genetic tendencies implying tiny increases in absolute risks – and interventions with huge costs and side effects – all while more effective solutions stare us in the collective face.

To complete the analogy: In other words, if – big if – we could identify them, should we incarcerate, surveil, or segregate a subpopulation with a small increased odds of committing crime – thereby preventing a tiny number of crimes while harming a large group of innocent people? And should we isolate and elevate the children of some other subpopulation because of their slightly higher odds of success in some endeavor? Or should we instead devote our resources to improving education, nutrition, employment and health care for the much larger population, based on the well-established benefits of those interventions? We know lots of effective ways to affect social behavior, including against “natural” inclinations.

I’m really not against scientific exploration of behavioral genetics. But the risk of exaggerated results and inflated importance seems so high that I doubt the research will be useful any time soon.


Filed under Me @ work

Reviewing Nicholas Wade’s troublesome book


I have written a review of Nicholas Wade’s book, A Troublesome Inheritance: Genes, Race and Human History, for Boston Review. Because there already are a lot of reviews published, I also included discussion of the response to the book. And because I’m not expert in genetics and evolution, I got to do a pile of reading on those subjects as well. I hope you’ll have a look: http://www.bostonreview.net/books-ideas/philip-cohen-nicholas-wade-troublesome-inheritance


Filed under Research reports

Take it from the Pope


For the “World Day of Peace,” which is today, instead of congratulating the newlyweds — who are upholding the transformed but still living (for better or worse, in sickness and in health) institution of marriage — Pope Benedict (Ratzinger) issued a statement that included this about homogamous marriage:

There is also a need to acknowledge and promote the natural structure of marriage as the union of a man and a woman in the face of attempts to make it juridically equivalent to radically different types of union; such attempts actually harm and help to destabilize marriage, obscuring its specific nature and its indispensable role in society.

These principles are not truths of faith, nor are they simply a corollary of the right to religious freedom. They are inscribed in human nature itself, accessible to reason and thus common to all humanity. The Church’s efforts to promote them are not therefore confessional in character, but addressed to all people, whatever their religious affiliation. Efforts of this kind are all the more necessary the more these principles are denied or misunderstood, since this constitutes an offence against the truth of the human person, with serious harm to justice and peace.

I’m not enough of a Pope-ologist to know how rare this is, but what struck me was his claim that his opinion is “accessible to reason and thus common to all humanity.”

There is a convention in the U.S. that we can criticize each other’s opinions, but it’s impolite to criticize each other’s beliefs (as long as those beliefs are religious, meaning not too recent in origin). So it’s fine for me to say that you are wrong about secular subjects, like physics and sports, but it’s impolite to say you are wrong if you believe that God speaks directly to you or that cavemen played with dinosaurs. Or, more directly relevant to the Pope, scientists can say that virgin conception is generally unlikely, but it would be impolite to say it never ever happened, not even once.

Anyway, that’s a long way of getting around to the point that I find the Pope’s statement galling. If he wants to express political opinions, fine. I have no objection to that as long as the giant, multibillion-dollar real estate and educational empire he runs isn’t tax exempt.

But if he’s going to make statements with that hat on — that is, subject to a declaration of infallibility* — he should lay off the social-science proclamations. If he wants to argue in the realm of reason, rather than faith, then we may weigh his record of expressed belief in fairy tales against his scientific credibility.

Believe it or not

Learning as I go here: turns out the Pope has a whole scientific academy called the Pontifical Academy of Sciences (where the “peer review” is not done by your peers, if you know what I mean). And naturally they’ve been all over this subject of reason and faith. I read a 2006 talk titled “Secularism, Faith and Freedom,” which was apparently presented to this audience:


And I thought the American Sociological Association conference was a dynamic scene!

The paper says it’s necessary for religious people to argue their positions freely in a secular state’s public square. These positions include, “Faith is the root of freedom,” and “a proper secularism requires faith.” That is because liberal democracy otherwise is a moral vacuum of pragmatic consumerism with no higher purpose. So I gather that, just as any “gaps” in the fossil record summon Creation as an explanation, so does any lack of morality in the public sphere demand to be filled by faith — specifically, a “Creator who addresses us and engages us before ever we embark on social negotiation.” Absent that presence, “the liberal ideal becomes deeply anti-humanist.”

Although, after reading this whole paper and the Pope’s statement, I confess (my word choice) that I’m not sure “humanist” is really what they’re going for.

* Can. 749 §1. By virtue of his office, the Supreme Pontiff possesses infallibility in teaching when as the supreme pastor and teacher of all the Christian faithful, who strengthens his brothers and sisters in the faith, he proclaims by definitive act that a doctrine of faith or morals is to be held.


Filed under In the news, Politics

Who’s teaching creationism to kids?

…and when did I get so touchy about it?

When someone gave us this chunky dinosaur puzzle, I did a double-take. Yes, that’s a caveman there with the dinosaurs:

The blurb on the company’s website says that, along with the puzzle, “The accompanying board book teaches young learners about dinosaurs.” Teaches, that is, with lessons like this:

A little harmless fun, or a little creationist indoctrination? (Do sociologists even believe in “harmless fun”?)

According to the Shure company, they deliver these “common threads” in all their products: “Originality and inventiveness; Excellence in design; Attention to detail; Exceptional quality; Educational merit.” So, not just entertainment.

A quick perusal suggests the rest of their products are not creationist — just the usual toy-gendering. They do have a Noah’s Ark puzzle, but it doesn’t claim to be educational. In that, Shure is just keeping up with Melissa & Doug (whose puzzle is at least Genesis-correct in not naming Noah’s wife):

And anyway, the story of Noah’s Ark is actually not a bad way to talk about reproduction.

But back to dinosaurs and people. Dinosaurs are not really more problematic for creationism than any other creatures that pre-date humans. But maybe because kids love dinosaurs so much, creationists spend inordinate energy trying to place them chronologically with people. Writes one such site:

The idea of millions of years of evolution is just the evolutionists’ story about the past. No scientist was there to see the dinosaurs live through this supposed dinosaur age. In fact, there is no proof whatsoever that the world and its fossil layers are millions of years old. No scientist observed dinosaurs die. Scientists only find the bones in the here and now, and because many of them are evolutionists, they try to fit the story of the dinosaurs into their view.

Up against this kind of propaganda, it is tempting to bring the hammer down on “harmless fun” featuring humans and dinosaurs playing together. That would mean none of these, either:

That is basically the argument of James Wilson, a University of Sussex lecturer, who has a talk on the subject here on YouTube.

For non-biologists, like me, who like evolution and want some ammunition to defend it, I recommend Richard Dawkins’ recent book The Greatest Show on Earth. Some do find it a little dogmatic, and in the grand scheme I prefer Stephen Jay Gould, but it’s good for this purpose. Because rather than block access to dinosaur cartoons, I would rather arm myself – and the surrounding children – with the tools they need to handle them with confidence.


Filed under In the news, Me @ work

Color gender by the numbers

Men and women weigh in on their favorite colors.

Update: I’m curious. Will you take a color preference survey here?

More on the many mysteries of pink and blue, this time from college students expressing their own preferences, rather than adults’ choices for children.

This research is from 2001, but I just stumbled on it. In a survey of 5,000 college students from several dozen universities, men and women were asked to express, on an open-ended form, their favorite color.

[Chart: Favorite colors for college students, 1990s; men and women shown separately. Other responses not shown (4% of men, 5% of women). My chart from data in the article.]

These men have a strong blue preference; the women are more diverse in their choices. Proportionally, the biggest differences are on pink (women 10.6-times more likely to choose) and blue (men 1.8-times more likely).
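To be clear about what those ratios mean: they are just one group’s share choosing a color divided by the other group’s share. A minimal sketch, using made-up shares (the article reports the ratios, not these exact percentages, which I have chosen only so the arithmetic reproduces the 10.6 and 1.8 figures):

```python
def preference_ratio(share_a, share_b):
    """How many times more likely group A is than group B to choose a color,
    given the share of each group naming it as their favorite."""
    return share_a / share_b

# Hypothetical shares, picked to match the reported ratios:
pink = preference_ratio(share_a=0.106, share_b=0.01)   # women vs. men
blue = preference_ratio(share_a=0.45, share_b=0.25)    # men vs. women

print(round(pink, 1))  # 10.6
print(round(blue, 1))  # 1.8
```

Note this is a ratio of proportions (relative risk), not an odds ratio; with small shares like pink the two are close, but they are not the same statistic.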

Here is the interpretation of the authors:

Without ruling out any possibility at this point, we are inclined to suspect the involvement of neurohormonal factors. Studies of rats have found average sex differences in the number of neurons comprising various parts of the visual cortex. Also, gender differences have been found in rat preferences for the amount of sweetness in drinking water. One experiment demonstrated that the sex differences in rat preferences for sweetness was eliminated by depriving males of male-typical testosterone levels in utero. Perhaps, prenatal exposure to testosterone and other sex hormones operates in a similar way to “bias” preferences for certain colors in humans.

You really have to love it. Although it’s not as far gone as the speculation that color preferences evolved from the gendered division of labor in hunter-gatherer prehistory, it’s not a theory well suited to the rapid historical change we’ve seen in the case of dressing children, at least.

If I were making up an explanation, I’d say maybe these college students were generally pushed toward girl-pink/boy-blue from infancy, and then the girls more actively incorporated color choice into their identities (the idea of having a “favorite color”) — resulting in greater diversity of choices. On the other hand, maybe boys were more likely not to have a color affinity in their identity toolbox and thus are more likely to have a stuck-in-childhood response that matches the preference their parents had for them, or one they consider socially desirable. How’s that?


Filed under Me @ work, Research reports