For the past eight or nine years I’ve been a student of skepticism. French and English skepticism in the Renaissance, to be precise. It’s an obscure interest—dry as dust in many respects—and I usually don’t talk about it unless someone pays me to do so, which hardly ever happens. Skepticism is one of those topics that can give academe a bad name. It conjures images of unworldly philosophers who claim they’re nothing but brains in a vat—bleary-eyed men who doubt the existence of other minds, or who swat mosquitoes after arguing that physical reality is an illusion. It can make us long for the defiant common sense of Dr. Johnson, whose impatience with arcane speculation was unforgettably recorded by James Boswell: “We stood talking for some time of Bishop Berkeley’s ingenious sophistry to prove the non-existence of matter; I observed that though we are satisfied his doctrine is not true, it is impossible to refute it. I shall never forget the alacrity with which Johnson answered, striking his foot with mighty force against a large stone, ‘I refute it thus’.”
Yet Johnson himself was a skeptic in many ways, above all in his readiness to debunk what he called “cant”: trite, conventional, unexamined sentiment. And it is this sort of skepticism that particularly intrigues me. Bertrand Russell later conflated such doubt with an optimistic defense of reason, and his skeptical rationalism seemed powerfully persuasive to many intellectuals in the 20th century. But because my interests lie earlier, and because Renaissance doubt is always connected with faith, I find the meditations of Montaigne and Pascal especially useful in their strategies of questioning bedrock assumptions of European social life. Both men were believers, yet both found doubt compatible with belief, and both offered devastating critiques of pure reason. Montaigne employed skepticism in highly original ways, claiming for instance that “it is better to incline towards doubt than towards certainty…one overvalues one’s conjectures in using them to burn a man alive.” Yet elsewhere he was willing to defend the possibility of miracles, arguing that to reject anything that strikes us as improbable is to imply that we know the limits of the possible. And Pascal agreed with him: it’s a mistake for humans to suppose that reality can be entirely comprehended by the finite capacities of human understanding. What we know—or what we think we know—may well be true; the problem is that we don’t know how much we don’t know.
Shakespeare, too, was concerned with these matters. In King Lear, perhaps his greatest play, he allows one of his characters to suggest that “nothing almost sees miracles but misery.” There’s a perspectivism embedded in this claim, a sense that what passes as unremarkable for some may seem miraculous to others—especially those who suffer. And no Shakespearean play confronts suffering more directly than Lear. This may explain why my students, generally speaking, dislike it. Entering class on the first day of the semester, they often bring with them a set of deeply held assumptions about the world, some of which might be encapsulated as follows: “everything happens for a reason,” “we reap what we sow,” “each person has a soul mate,” “things work out for the best,” and “we’re free to choose our destiny.” Armed with these postulates, an assiduous sophomore can transform even a vexing and intractable play like The Merchant of Venice into an airtight crowd-pleaser, laced up and buttoned down. But King Lear resists such manipulation. It does so in many ways—above all in its presentation of the death of Lear’s loyal and loving daughter, Cordelia.
Why does Cordelia die? She’s innocent, after all. She’s courageous. She’s forgiving. And she lives happily ever after in each of the sources Shakespeare consulted as he wrote his play; only in Lear does she perish. What was the Bard thinking? Even Johnson, one of Shakespeare’s most brilliant editors, was baffled. “A play,” he wrote, “in which the wicked prosper and the virtuous miscarry may doubtless be good, because it is a just representation of the common events of human life. But since all reasonable beings naturally love justice, I cannot easily be persuaded that the observation of justice makes a play worse, or that if other excellencies are equal, the audience will not always rise better pleased from the final triumph of persecuted virtue.” Well, yes. Except that pleasing an audience isn’t always the foremost consideration. Sometimes there are more crucial tasks.
Cordelia dies because Shakespeare, at his best, abandons poetic justice and confronts the chaotic flux of life. Every death that makes us weep—from Jesus Christ to Anne Frank, from Joan of Arc to Emmett Till—lies behind the death of Cordelia. That the innocent and the virtuous miscarry is not only true but commonplace; yet Lear still troubles its readers, tending as it does to unsettle comforting suppositions about human existence. It doesn’t negate them absolutely, but it casts them into doubt. And one of the duties of a teacher is to give such doubt a hearing.
I wonder about what happens to people when they believe too strongly, or accept too readily, or fail to look beyond, or behind, or beneath. I wonder what they see when they view a phenomenon solely through the prism of some favored theory. The extreme examples are ludicrous: Holocaust denial, the Flat Earth Society, or Erich von Däniken’s Chariots of the Gods, which credits extraterrestrials for the architectural wonders of the ancient world. Remember the Nazca lines, those enormous drawings of lizards and hummingbirds in the Peruvian desert? According to the aesthetically challenged von Däniken, they’re really just runways in an intergalactic spaceport established long ago by alien travelers to Earth.
But these are limit cases. Far more common are conflicts in which religious belief collides with empirical evidence. Creationism, as we all know, tends to be confuted by geology, archaeology, evolutionary biology, astronomy, comparative theology, and linguistics, not to mention logic. Still, it’s remotely possible that the famous 17th-century scholar James Ussher was correct when he ascertained the precise date of Creation: October 23rd, 4004 BCE. Johannes Kepler, after all, had placed it 12 years later, and Sir Isaac Newton seemed quite happy to split the difference, settling for a round figure of 4000 BCE. I’m pretty sure that Newton and Kepler were a lot smarter than I am—or anyone I know, for that matter—so I can only assume that the best minds in those days found it virtually impossible to draw the kinds of conclusions drawn by ordinary minds today. It’s a good lesson in humility, if nothing else: for broadly speaking, people think within the intellectual framework of their own historical moment. Who knows how our opinions will be received half a millennium from now?
So while I have no immediate plans to join the Flat Earth Society, I’m not unhappy that it exists. As John Stuart Mill argued in Victorian England, a free country will make sure that challenges to received opinion are heard. Those challenges may be irrational, reactionary, or offensive—or they may be right—but whatever their truth-status, their very presence allows widely accepted views to be contested, and this in turn helps to prevent such views from degenerating into unexamined assertions. Skepticism functions in much the same way. It can forestall a too-willing acquiescence to the-way-things-are; it can distance us from dogmatism and warn us away from zealotry; it can expose our mistakes. Of course we can’t entirely escape from biases and presuppositions, and to imagine that doubt can free us from ideology is to reimpose the most basic positivist assumptions that have been overturned during the past century. But doubt can make us more self-aware; it can keep us vigilant; it can render belief stronger, certainty more meaningful. Faith in doubt can give us the backbone to change our minds.
Will Hamlin is an English professor at WSU. The author of two books and many essays, he enjoys watching soccer, listening to Bach and Mozart, and observing academic