Bigger Than Phil: When did faith start to fade?

In Tom Stoppard’s 1972 play “Jumpers,” the philosopher hero broods unhappily on the inexorable rise of the atheist: “The tide is running his way, and it is a tide which has turned only once in human history. . . . There is presumably a calendar date—a moment—when the onus of proof passed from the atheist to the believer, when, quite suddenly, the noes had it.” Well, when was that date—when did the noes have it? In 1890? In 1918, after the Great War? In 1966, when Time shocked its readers with a cover that asked whether God was dead? For that matter, do the noes have it? In most of the world, the ayes seem to be doing just fine. Even in secularized Manhattan, the Christmas Eve midnight Mass is packed tight with parishioners, and the few who came for the music are given dirty looks as they sheepishly back out after the Vivaldi.

The most generous poll never seems to find more than thirty per cent of Americans saying they are “not religious or not very religious,” though the numbers get up to around fifty per cent in Europe. But something has altered in the course of a century or so. John Stuart Mill said in the early nineteenth century that he was the only youth he knew who was raised as a skeptic; by the end of his life, skeptics were all around him. Yet, though the nineteenth-century novel is roiled by doubt, there isn’t one in which the doubters quite dominate. Whatever change has occurred isn’t always well captured by counting hands. At a minimum, more people can say they don’t think there is a God, and suffer less for saying so, than has been the case since the fall of Rome. The noes have certainly captured some constituency, obtained some place. What, exactly, do they have?

There’s a case to be made that the change is more like pulses than like tides. If the nineteenth century ended with freethinkers in every front parlor, for most of the twentieth century the sound of atheism became more agonized and muted. Madalyn Murray O’Hair, the firebrand head of American Atheists, had an occasional spot on Johnny Carson, but it was always in the last ten minutes of the show, the same spot that, ahem, Johnny gave to authors. (Billy Graham got on right after the monologue.) The glamour lay in faith. Nearly all the great modernist poets were believers: Auden and Eliot in Anglo-Christianity, Yeats in some self-crafted Hibernian voodoo. Wallace Stevens, whose great poem “Sunday Morning” is all about what to do when you don’t go to church, saw his atheism treated very discreetly, like Hart Crane’s homosexuality.

Only in the past twenty or so years did a tone frankly contemptuous of faith emerge. Centered on the evolutionary biologist Richard Dawkins, the New Atheists were polemicists, and, like all polemics, theirs were designed not to persuade but to stiffen the spines of their supporters and irritate the stomach linings of their enemies. Instead of being mushy and marginalized, atheism could proclaim its creed. But why did the nonbelievers suddenly want stiffer spines and clearer signals? Why, if the noes indeed had it, did they suddenly have to be so loud?

A history of modern atheism—what did Voltaire say to Diderot? what did Comte mean to Mill? who was Madalyn Murray O’Hair, anyway?—would be nice to have. The British popular historian Peter Watson’s “The Age of Atheists: How We Have Sought to Live Since the Death of God” (Simon & Schuster) could have been that book, but it isn’t. Beginning with Nietzsche’s 1882 pronouncement that the big guy had passed and man was now out on the “open sea” of uncertainty, the book is instead an omnium-gatherum of the life and work of every modern artist or philosopher who was unsettled or provoked by the possible nonexistence of God. Watson leads us on a breakneck trip through it all—Bloomsbury and Bernard Shaw, Dostoyevsky and German Expressionism, Sigmund Freud and Pablo Picasso. If it’s Chapter 3, this must be Vienna.

This makes sense of a kind, the nonexistence of God being an issue for modern people, and rising up everywhere. But reporting on every place you see it doesn’t help you see it more clearly. (On one page, we hear about Anna Clark, Tennessee Williams, Stefan George, James Joyce, Philip Roth, Henry James, Wilhelm Reich, Valentine de Saint-Point, Léger, Milan Kundera, Michel Foucault, Jacques Lacan, Jean-François Lyotard, H. G. Wells, Gerhart Hauptmann, Aldous Huxley, John Gray, Eugene Goodheart, Jonathan Lear, and, of course, Nietzsche.) Argus, the hundred-eyed watchman, might have had more sight than other giants, but he didn’t have sharper sight. Would Matisse really never have painted “The Red Studio”—which Watson takes as a paradigm of post-religious art, with the artist’s self-made space replacing divine nature—if Nietzsche hadn’t made that memorable P.R. statement about the Deity’s demise?

The problem is that godlessness as a felt condition is very different from atheism as an articulate movement. Watson doesn’t distinguish clearly, or at all, between the two, and so his book manages to feel at once breathless and long-winded—much too rushed in its parts and too diffuse as a whole. Even his chronology of ever-growing disbelief seems off. “Modern art is a celebration of the secular,” he states confidently, meaning Picasso and his like, and although he backtracks quickly, he can’t backtrack far enough, since so much of modern art—Kandinsky, Mondrian, Rothko—has been religious or mystical in nature.

Only in the last hundred or so pages does the real contention of the book appear. For Watson, we are divided not so much between believers and non- as between what might be called Super-Naturalists, who believe that a material account of existence is inadequate to our numinous-seeming experience, and Self-Makers, who are prepared to let the human mind take credit even for the most shimmering bits of life. His enduring sympathies lie with the unduly forgotten historian and novelist Theodore Roszak and with the philosopher Richard Rorty. Both were conciliatory Self-Makers, who sought to elevate experience over arguments and, dissatisfied with science, made of religious feeling its own religion. Watson regards phenomenology as “the most underrated movement of the twentieth century,” and finds in its emphasis on happy sensations, on the thisness of life, the happiest alternative to old-time religion. Atheism sanctifies less of the world but names more of it, he seems to say, and this is in itself enough. This seems to leave the door open for believers to engage in expanded “naming” of their own, which would turn mighty Jehovah into little Tinker Bell—if you say his name enough, he lives. Still, for Watson this is the right, positive, mystery-affirming, life-enhancing, and pragmatic-minded faith to end up with.

That really useful history of atheism would, presumably, try to distinguish between Watson’s subject, the late-arriving romantic agony of Nietzsche and his disciples—which responds to God’s absence the way fifth graders respond to the absence of the teacher: you mean now we can do anything?—and the older tradition of Enlightenment rationalism: the tradition that gave God a gold watch and told him the office was now so well ordered he wouldn’t be needed any longer. This more polite but, finally, more potent form of non-faith has played a larger role in politics than the no-more-teacher kind, if a lesser role in the arts. It’s the subject of “Imagine There’s No Heaven: How Atheism Helped Create the Modern World” (Palgrave), by the N.Y.U. journalism professor Mitchell Stephens.

Stephens’s book is entirely a story of atheism as an articulate movement—one that he is rooting for, a little too hard at times. We learn an enormous amount about figures censored out of history, and about the persecution that freethinkers suffered until shockingly recently. His martyrs fill our hearts; his heroes inspire. He prefers Denis Diderot, the French encyclopedist, to Voltaire, for, where Voltaire merely mocked at a safe distance, Diderot was a real moralist, who changed Parisian minds. And he devotes many moving pages to the underappreciated life of Charles Bradlaugh, the defiant Victorian atheist—he allegedly called Christianity “a cursed, inhuman religion”—who nonetheless got himself elected to Parliament many times over, and was finally allowed to take his seat there, against the Queen’s wishes. (He was celebrated after his death with a seven-foot statue in his constituency.) Stephens does remind the reader at moments of the mother in Philip Roth who, reading about a plane crash, always counts the Jewish names first. Stephens counts the atheists first, emphasizing their role in the anti-slavery movement, even though, as he knows, the Christian churches played a far larger one.

In Stephens’s telling, a new human epoch of feeling is kicked off by each big name. So with godlessness: Diderot wrote this, Nietzsche said this, Darwin saw this, Bradlaugh stood his ground, and now the liquor stores are open all day Sunday. The difficulty, as always with popular chronicles of ideas, is not that ideas don’t matter; it’s that we too readily skip over the question of how they come to matter. Who seeded the ground is the historian’s easy question; what made the ground receive the seed is the hard one.

Indeed, much of the argument against God works less well as argument and thesis than as atmosphere and tone. The sappers who silently undermined the foundations of the Church did more damage than the soldiers who stormed the walls. Two such sappers, the English historian Edward Gibbon and the French positivist Auguste Comte, entirely elude Stephens’s story. Neither was a nonbeliever by argument or by avowal, yet both helped kill God by implication and by insinuation. The most effective and far-reaching case against Christianity in eighteenth-century England is Chapter 15 of Gibbon’s “Decline and Fall of the Roman Empire.” Gibbon concedes—that is, “concedes”—the obvious truth of the Christian religion, and then asks, deadpan, what worldly mechanism would nonetheless have been necessary for its triumph. In a manner still not improved upon for concise plausibility, he enumerates the real-world minority politics that made it happen. The Christians had the advantage of cohesion and inner discipline that the dissipated majority, pagans and Epicureans alike, did not. Religious history becomes a question of human causes and events. Divinity is diminished without ever being officially doubted.

Comte, in his way, did more damage to organized religion than Diderot, not by quarrelling with it but simply by imitating it. He brought an aggressive form of “humanism” to nineteenth-century France, one that inclined toward a worship replacing the God above with Good Men below. His kind of humanism created chapels (one still exists in Paris) filled with icons of the admirable: Héloïse, Abélard, Galileo. It’s still a cozy space. Instead of making us God-size, he made faith us-size. Just as religious tolerance was established less by argument than by exhaustion, infidelity was made appealing by atmosphere. Argument mattered chiefly through the moods it made.

There do seem to be three distinct peaks of modern disbelief, moments when, however hard it is to count precise numbers, we can sense that it was cool to be a scoffer, trendy to vote “No!” One is in the late eighteenth century, before the French Revolution, another in the late nineteenth century, just before the Russian Revolution, and now there’s our own. A reactionary would point out, with justice, that each high point preceded a revolution that turned ugly enough to make nonbelief look bad. Very much like the Christians in the Roman Empire, the noes have had it most often less through numbers than through discipline and self-confidence and an ambition for power, even claiming, like Christians, the assent of a state: first Republican France, then the Soviet Union.

Yet the need for God never vanishes. Mel Brooks’s 2000 Year Old Man, asked to explain the origin of God, admits that early humans first adored “a guy in our village named Phil, and for a time we worshipped him.” Phil “was big, and mean, and he could break you in two with his bare hands!” One day, a thunderstorm came up, and a lightning bolt hit Phil. “We gathered around and saw that he was dead. Then we said to one another, ‘There’s something bigger than Phil!’ ” The basic urge to recognize something bigger than Phil still gives theistic theories an audience, even as their explanations of the lightning-maker turn ever gappier and gassier. Expert defenders are more and more inclined to seize on the tiniest of scientific gaps or to move ever upward to ideas of God so remote from existence as to become pure hot air. Stephen C. Meyer’s best-selling “Darwin’s Doubt” (HarperOne) reinvents the God of the Gaps—a God whose province is whatever science can’t yet explain—with a special focus on the Unsolved Mysteries of the Cambrian explosion. Experience shows that those who adopt this strategy end up defending a smaller and smaller piece of ground. They used to find God’s hand in man’s very existence, then in the design of his eyes, then, after the emergence of the eye was fully explained, they were down to the bird’s wing, then they tried the bacterial flagellum, and now, like Meyer, they’re down to pointing to the cilia in the gut of worms and the emergence of a few kinds of multi-cellular organisms in the Cambrian as things beyond all rational explanation. Retreat always turns to rout in these matters.

As the explanations get more desperately minute, the apologies get ever vaster. David Bentley Hart’s recent “The Experience of God: Being, Consciousness, Bliss” (Yale) doesn’t even attempt to make God the unmoved mover, the Big Banger who got the party started; instead, it roots the proof of his existence in the existence of the universe itself. Since you can explain any bit of the universe only by means of some other bit, nothing within the universe can explain why there is a universe (or many of them) at all. The answer to this unanswerable question is God. He stands outside everything, “the infinite to which nothing can add and from which nothing can subtract,” the ultimate ground of being. This notion, maximalist in conception, is minimalist in effect. Something that much bigger than Phil is so remote from Phil’s problems that he might as well not be there for Phil at all. This God is obviously not the God who makes rules about frying bacon or puts harps in the hands of angels. A God who communicates with no one and causes nothing seems a surprisingly trivial acquisition for cosmology—the dinner guest legendary for his wit who spends the meal mumbling with his mouth full.

What’s easily missed in all this is something more important: the clandestine convergence between Super-Naturalists and Self-Makers. Surprisingly few people who have considered the alternatives—few among the caucus who consciously stand up, voting aye or nay—believe any longer in God. Believe, that is, in an omnipotent man in the sky making moral rules and watching human actions with paranoiac intensity. The ayes do believe in someone—a principle of creation, a “higher entity,” that “ground of being,” an “idea of order,” an actor beyond easy or instant comprehension, something more than matter and bigger than Phil. And they certainly believe in some thing—a church, a set of rituals, a historical scheme, and an anti-rational tradition. But the keynote of their self-description typically involves a celebration of mystery and complexity, too refined for the materialist mind to accept. Self-Makers often do an injustice to the uncertainty of Super-Naturalists, who, if anything, tend to fetishize the mystery of faith as a special spiritual province that nonbelievers are too fatuous to grasp, and advertise their doubt and their need for faith quite as much as their dogma. “Can’t Help Lovin’ Dat Man,” not “Onward, Christian Soldiers,” is the Super-Naturalists’ anthem these days.

But, just as surely, most noes believe in something like what the Super-Naturalists would call faith—they search for transcendence and epiphany, practice some ritual, live some rite. True rationalists are as rare in life as actual deconstructionists are in university English departments, or true bisexuals in gay bars. In a lifetime spent in hotbeds of secularism, I have known perhaps two thoroughgoing rationalists—people who actually tried to eliminate intuition and navigate life by reasoning about it—and countless humanists, in Comte’s sense, people who don’t go in for God but are enthusiasts for transcendent meaning, for sacred pantheons and private chapels. They have some syncretic mixture of rituals: they polish menorahs or decorate Christmas trees, meditate upon the great beyond, say a silent prayer, light candles to the darkness. They talk without difficulty of souls and weapons of the spirit, and go to midnight Mass on Christmas Eve to hear the Gloria, and though they leave early, they leave fulfilled. You will know them by their faces; they are the weepy ones in the rear.

If atheists underestimate the fudginess in faith, believers underestimate the soupiness of doubt. My own favorite atheist blogger, Jerry Coyne, the University of Chicago evolutionary biologist, regularly offers unanswerable philippics against the idiocies of intelligent design. But a historian looking at his blog years from now would note that he varies the philippics with a tender stream of images of cats—into whose limited cognition, this dog-lover notes, he projects intelligence and personality quite as blithely as his enemies project design into seashells—and samples of old Motown songs. The articulation of humanism demands something humane, and its signal is disproportionate pleasure placed in some frankly irrational love. Stephens, for that matter, takes his title from the seemingly forthright John Lennon song “Imagine.” Lennon, having flirted with atheism for about nine months, from Christmas of 1970 to the fall of 1971, fell back into a supernaturalist web of syncretism of his own, flying the “wrong,” or westerly, way around the world and practicing astrology. Stephens says diplomatically that Lennon “remained intermittently susceptible to belief”—but in truth Lennon was entirely captive to whatever superstition had most recently tickled his fancy, or his wife’s. Imagine there’s no Heaven—but pay attention to the stars and throw the I Ching as necessary. The maker of the great atheist anthem was anything but an atheist.

Doctrinaire religionists and anti-religionists alike reserve special derision for such handmade syncretism, perhaps because they see it as the real threat to their authority. (Christian rites were mocked among the Romans for their vulgarity long before they were denounced for their absurdity.) “Being Jewish is incredibly important to me, but I’m not observant,” one such syncretist, a novelist, says in an interview:

At the same time, I cared deeply that my son know himself as Jewish—not just culturally, but be steeped in the traditions and rituals. His Bar Mitzvah last year—which was completely homegrown, eclectic, held in a church, led by a female Rabbi with whom we’ve become close, with readings from Coleridge and Hannah Szenes, as well as the whole congregation singing Leonard Cohen’s “Broken Hallelujah” with my son playing his ukulele and me on the piano—was one of the highlights of my life.

One is supposed to disdain such festivals as slack and self-pleasing—that double-blasphemed church! that lady rabbi! that ukulele! But in truth they are no more or less “made up” than the older religions, which were also forged from disparate parts, and looked just as ridiculous to outsiders at the time. This is not a bastardized or lesser form of faith. It is faith as it has always existed.

Good news, right? Doesn’t this mean that we are, to cite a chastened optimist, less divided than our theological politics suggest? Probably it means that we are even more divided than our theological politics suggest, because it is the point of politics to divide. It is not an accident that the crucial moment of voting in the British Parliament is called a “division.” Our politics are a mirror not of our similarities but of our differences. That’s why they’re politics. We were less divided than our politics made us seem right on the brink of the Civil War, too. We were just divided on one big point. And the big point that divides us now is that the Super-Naturalists don’t want only to be reassured that they can say their prayers as much as they like to whomever they like. They also want recognition from the people they feel control the culture that theirs is an honored path to truth—they want Super-Naturalism to be respected not just as a way of living but as a way of knowing.

And here we arrive at what the noes, whatever their numbers, really have now, and that is a monopoly on legitimate forms of knowledge about the natural world. They have this monopoly for the same reason that computer manufacturers have an edge over crystal-ball makers: the advantages of having an actual explanation of things and processes are self-evident. What works wins. We know that men were not invented but slowly evolved from smaller animals; that the earth is not the center of the universe but one among a billion planets in a distant corner; and that, in the billions of years of the universe’s existence, there is no evidence of a single miraculous intercession with the laws of nature. We need not imagine that there’s no Heaven; we know that there is none, and we will search for angels forever in vain. A God can still be made in the face of all that absence, but he will always be chairman of the board, holding an office of fine title and limited powers.

Given the diminishment in divine purview, from Galileo’s time on, the Super-Naturalists just want the language of science not to be actively insulting to them. And here we may come at last to the seedbed of the New Atheism, the thing that made the noes so loud: the broad prestige, in the past twenty years, of evolutionary biology. Since the Enlightenment, one mode of science has always been dominant, the top metaphor that educated people use to talk about experience. In most of the twentieth century, physics played the role of super-science, and physics is, by its nature, accommodating of God: the theories of physics are so cosmic that the language of physics can persist without actively insulting the language of faith. It’s all big stuff, way out there, or unbelievably tiny stuff, down here, and, either way, it’s strange and spooky. Einstein’s “God,” who does not play dice with the universe, is not really the theologian’s God, but he is close enough to be tolerated. With the great breakthroughs in understanding that followed the genomic revolution, evo-bio has become, insensibly, the model science, the one that so many of the pop books are about—and biology makes specific claims about people, and encounters much coarser religious objections. It’s significant that the New Atheism gathered around Richard Dawkins. The details of the new evolutionary theory are fairly irrelevant to the New Atheism (Lamarckian ideas of evolution could be accepted tomorrow, and not bring God back with them), but the two have become twinned in the Self-Making mind. Their perpetual invocation is a perpetual insult to Super-Naturalism, and to the right of faith to claim its truths.

“Cosmically, I seem to be of two minds,” John Updike wrote, a decade ago. “The power of materialist science to explain everything—from the behavior of the galaxies to that of molecules, atoms, and their sub-microscopic components—seems to be inarguable and the principal glory of the modern mind. On the other hand, the reality of subjective sensations, desires, and—may we even say—illusions composes the basic substance of our existence, and religion alone, in its many forms, attempts to address, organize, and placate these. I believe, then, that religious faith will continue to be an essential part of being human, as it has been for me.” Does religion alone address the reality of our subjective sensations? It’s perfectly possible to believe that there are many things that will never be subjects of science without thinking that they are therefore objects of faith. Human beings are unpredictable. We can’t know what songs they will sing, what new ideas they will come up with, how beautifully they will act or how badly. But their subjective sensations do not supply them with souls. They just make them people. Since Darwin’s starting premise is that individual variation is the rule of nature, it isn’t surprising that the living things that are able to have experiences have them in varied and individual ways. The plausible opposite of “permanent scientific explanation” is “singular poetic description,” not “miraculous magical intercession.”

In the end, these seem questions more of temperament than of argument. Mitchell Stephens speaks of the agonized struggle within the soul and the mind between belief and nonbelief. This struggle is a modern piety, but I wonder how many people actually experience it. The sight of an open sea strikes some as beautiful and others as scary, and the line between them seems no more a matter of principle than that between people who like oceans in the summer and those who prefer ponds. Some people of great sensibility and intelligence—Larkin, Auden, and Emily Dickinson, to name three—find intolerable the idea of open seas, of high windows letting in the light, and nothing beyond. If the leap to God is only a leap of the imagination, they still prefer the precarious footing. Others—Elizabeth Bishop, William Empson, and Wallace Stevens—find the scenario unthreatening, and recoil at the idea of a universe set up as a game of blood sacrifice and eternal torture, or even with the promise of eternal bliss not easily distinguishable from eternal boredom. They find a universe of matter, pleasure, and community-made morality the only kind possible, and the only kind worth living in. The differences are first temperamental, and only then theological.

What if, though, the whole battle of ayes and nays had never been subject to anything, really, except a simple rule of economic development? Perhaps the small waves of ideas and even moods are just bubbles on the one great big wave of increasing prosperity. It may be that the materialist explanation of the triumph of materialism is the one that counts. Just last year, the Princeton economist Angus Deaton, in his book “The Great Escape,” demonstrated that the enlargement of well-being in at least the northern half of the planet during the past couple of centuries is discontinuous with all previous times. The daily miseries of the Age of Faith scarcely exist in our Western Age of Fatuity. The horrors of normal life in times past, enumerated, are now almost inconceivable: women died in agony in childbirth, and their babies died, too; operations were performed without anesthesia. (The novelist Fanny Burney, recounting her surgery for a breast tumor: “I began a scream that lasted unremittingly during the whole time of the incision. . . . I felt the knife rackling against the breast bone, scraping it while I remained in torture.”) If God became the opiate of the many, it was because so many were in need of a drug.

As incomes go up, steeples come down. Matisse’s “Red Studio” may represent the room the artist retreats to after the churches close—but it is also a pleasant place to pass the time, with an Oriental carpet and central heating and space to work. Happiness arrives and God gets gone. “Happiness!” the Super-Naturalist cries. “Surely not just the animal happiness of more stuff!” But by happiness we need mean only less of pain. You don’t really have to pursue happiness; it is a subtractive quality. Anyone who has had a bad headache or a kidney stone or a toothache, and then hasn’t had it, knows what happiness is. The world had a toothache and a headache and a kidney stone for millennia. Not having them any longer is a very nice feeling. On much of the planet, we need no longer hold an invisible hand or bite an invisible bullet to get by.

Yet the wondering never quite comes to an end. Relatively peaceful and prosperous societies, we can establish, tend to have a declining belief in a deity. But did we first give up on God and so become calm and rich? Or did we become calm and rich, and so give up on God? Of such questions, such causes, no one can be certain. It would take an all-seeing eye in the sky to be sure.