On my way home from Hardee’s this morning, it occurred to me that the Islamic account of the revelation of the Koran was a prime example of what Jacques Derrida called “the simulacrum.” “But it is a difference without reference, or rather a reference without a referent, without any first or last unit, a ghost that is the phantom of no flesh, wandering about without a past, without any death, birth, or presence. Mallarmé thus preserves the differential structure of mimicry or mimesis, but without its Platonic or metaphysical interpretation, which implies that somewhere the being of something that is is being imitated.” (Jacques Derrida, Dissemination, trans. Barbara Johnson (Chicago: University of Chicago Press, 1981), p. 206.) Here’s what I have in mind.
Islam believes that, when the Prophet Muhammad uttered the oracles subsequently written down as the Surahs of the Koran, he was repeating what the angel Gabriel was reading to him from the Mother of the Book, an eternal volume resident in heaven. But of course there is and was no such book resting on a pulpit in the sky. Not only did cosmonaut Yuri Gagarin not see it, but the whole notion is laughable if only because the Surahs, many of them, address specific historical circumstances, not eternal verities. It presupposes an ironclad historical determinism, predestinarianism, a script written in advance. This reduces even the supposed history of Islam to a charade: were the “crimes” of unbelief against which Allah blusters and fulminates in the Koran all scheduled by him in advance? Actually, the Koran says that they were. “Oh yeah? You refuse to knuckle under to Allah? Well, my friend, the joke’s on you, ‘cause Allah made you not believe in him so he could send your ass to Gehenna! So there!” Can such childishness really be a revelation from God? Well, I guess it starts sounding pretty plausible if the alternative is to get your head chopped off.
Also, the Koran speaks again and again of supposed events long past by Muhammad’s day: how things happened long ago to Abraham, Noah, Moses, and Jesus. Obviously, these things are narrated in the past tense, but if the Koran was eternal in the heavens, it has the tenses all wrong! It must have originated in a historical period when these (supposed) events were already long past.
But, à la (not Allah) Derrida and Mallarmé, the infinite regress continues apace. The idea that the Surahs of the Koran are transcriptions (from memory!) of things witnesses heard a man named Muhammad say is equally fictitious. They are “copies” of something that had hitherto failed to exist in the first place! It has even become questionable whether there was ever a historical Muhammad upon whom these “revelations” might be fictively fathered.
All this falderal is perhaps meant to conceal a genuine original from which the Surahs were derived, namely Arabic Christian hymns. Günter Lüling showed how much sense the Koranic text makes once one strips away an obscuring layer of redaction and rewriting by Muslim scribes and theologians. Even rules of grammar had been reverse-engineered to make the original Christian texts look Islamic. Lüling estimated that as much as a third of the extant Koran stems from Christian hymnody. Ibn Warraq, a scholarly historian of Islamic origins and Koranic criticism, notes that, as it stands, much of the text is simply incoherent. English translations of the Koran differ so widely because the translations are perforce largely conjectural.
The title of the present essay is of course based on the opening salvo of Jesus’ preaching as summarized by Mark the evangelist: “The time is fulfilled; the kingdom of God is at hand. Repent and believe the good news” (Mark 1:15). This is not actually presented as the quoted words of Jesus, despite the usual punctuation in our Bible translations. Instead, it is indirect discourse: Mark’s words, not those of Jesus. Did Jesus say this? Did he even say the sayings the gospels do offer us in the form of direct quotations? I doubt it very much. In fact, the case is much like that of the Surahs of the Koran. It is just plain fanciful to imagine that people immediately took pains to memorize Jesus’ brief aphorisms, much less his longer, more detailed parables, what characters in them said back and forth, etc. Get real! Isn’t it obvious that the whole “oral tradition” approach is a far-fetched attempt to build a bridge to nowhere? To retroject later literary creations back onto a possibly fictional figure of the past? It’s a mirror image of the Star Trek writers cooking up the Ferengi Rules of Acquisition and ascribing them to an alien race in the distant future. Of course, they are presupposing the viewer’s “temporary, willing suspension of disbelief.” So are the gospel writers, minus the “temporary” qualifier.
Look at the Markan summation of Jesus’ apocalyptic message. The urgency of repentance is predicated upon the prospect of nearly immediate Armageddon. And, like the whole series of doom-saying predictions in the subsequent centuries, this prediction failed. There was no prior truth to which it corresponded, no heavenly deposit of information that Jesus’ preaching reflected.
Of “what” is one to repent in order to believe in the good (fake) news? I suggest it implies that one must repent of sober, rational thinking. Mustn’t you do that if you are going to decide to believe in something for which there is and can be no evidence? Repent of reason? That’s faith, what faith requires, because otherwise you will never arrive at the desired conclusion.
Who could want you to do that? It must be someone who desperately wants you to hop aboard their bandwagon, to accept their prescribed dogma, and who knows that no careful weighing of available, verifiable evidence would get you there. You cannot buy the product with the crummy pocket change of rational thought and data, so you’d have to beg, borrow, or steal the needed funds. It’s the unintended admission of the parable of the pearl merchant (Matthew 13:45-46) who sells his entire inventory in order to purchase one giant, shining bauble. Not a good idea. Caveat emptor!
I guess it all boils down to what my old pal Tony Glidden, fellow Gordon-Conwell alumnus, once said: “When people say, ‘God said it! I believe it! That settles it!’ what they really mean is, ‘I said it! God believes it! That settles it!’” Tony is a smart guy.
Imagine for a moment how vastly different theological (and political) discussions would be if we resolved never to appeal to unverifiable “authorities” like Jesus, Muhammad, the Bible, or the Koran. Suppose we were to eschew anything but reason and evidence, and our own evaluation of them, not just what our favorite “experts” say. We should very quickly come to realize just how little basis there is for any conviction on many questions, how little justification for dogmatism, and how surprisingly adequate it is to deal with life without settled beliefs on unknowable questions. It might work.
Carol and I were watching The Rifleman this morning. It’s still on as I write. I watch an hour of it every day. Ditto The Andy Griffith Show. There are significant parallels between the two shows’ premises. Both center on the adventures of a widowed father who is a lawman (The Rifleman’s Lucas McCain is often deputized or serves as fill-in marshal). Each runs through a series of girlfriends but never remarries. Each is a great dad, raising his young son without a mother. There is a surrogate mother character in each show: shopkeeper Hattie in The Rifleman, Aunt Bee in The Andy Griffith Show. (The same actress who portrayed Hattie went on to play Clara Edwards, Aunt Bee’s best friend!) The Rifleman premiered in 1958, The Andy Griffith Show in 1960. I guess it’s possible for the one series to have borrowed from the other, but I doubt it.
Lucas McCain is much like Star Trek Captains Kirk and Picard. The Rifleman exemplifies unimpeachable character and integrity, together with a sense of duty to his own code of honor as well as to his community. He has courage and can back it up with power. He is a forgiving Christian, eager to believe the best about people and to give them a second chance. Each episode is a morality play with a lesson to teach, yet without sanctimonious priggishness. The story is always told for its own sake, but given the kind of person Lucas is, any stories told of him must contain moral lessons. I wish parents and schools would have kids watch The Rifleman, both as a set of history lessons and as character education.
But that’ll never happen.
Why? Because most episodes end with Lucas shooting down the bad guys, albeit in self-defense. Even in its initial run, some condemned the show as the most violent on TV. How much more objectionable must the gun violence seem to Politically Correct pantywaists today, when little kids get suspended from school for pointing their fingers like imaginary guns! And that’s too bad, since such prissiness fosters the nonsensically naïve notion that guns are not necessary to ensure a safe society. But they still are, just as in the 1880s. There are two episodes that face this problem directly, showing how attempts to control crime by confiscating firearms wind up inviting the very chaos they aimed to avoid. The show makes the point loud and clear: the righteous as righteous must be prepared to kill the wicked, lest they become the wicked by neglecting to prevent their wicked deeds. It takes courage. And courage is a virtue.
Sheriff Andy Taylor is quite different on this point. Like an English bobby, he does not usually carry a gun. Like that of his transatlantic counterparts, Andy’s policy presupposes a civilized society whose crimes are not usually violent and thus do not require a violent response. Andy does take a rifle off the rack on those rare occasions when an escaped criminal from the outside world infiltrates the mini-Shangri-La of Mayberry. But, as I remember, he has never had to fire it.
Andy takes turns teaching character lessons, usually Socratically, to his son Opie and learning from the boy, who doesn’t hesitate to point out when the Emperor of Mayberry has no clothes. As the years passed, the character of Andy became even more morally upright, no longer the truth-stretching rascal he was in earlier seasons. Unfortunately, he also became a bit of a bore. When Andy Griffith finally left the show for a new series, Headmaster, in which he was in charge of a private school, the transition seemed almost too smooth.
Some Iron in Your Diet
But it is not just character lessons in general that such TV shows teach, as important as those are. Without trying to, these shows illustrate what we increasingly lack in early twenty-first-century America: normative nuclear family structures and the dynamic relations between relatives that we evolved to depend upon. A terrific book setting forth these relations is poet Robert Bly’s Iron John, very popular in the 1990s. The title refers to one of Grimm’s Fairy Tales, in which the King drains a swamp where a furry anthropoid (think Bigfoot) lurks. The locals fear this creature, called Iron John. The soldiers manage to take him captive and place him in a cage, where he poses no danger and serves as a spectacle for the curious. One day the child Prince is playing with a ball, which rolls into Iron John’s cell. The giant offers to return it if the Prince will fetch the key to the cage, which is kept under the Queen’s pillow. This is a dicey proposition, because the King has decreed that anyone who releases the Wild Man will be executed. The boy goes ahead with the scheme and manages to filch the key. He frees Iron John and, afraid his crime will be discovered, leaves with him. They live in the forest, where it turns out Iron John guards a vast hoard of riches and relics stored down a well. The man-monster must go away on various errands, leaving the prodigal Prince minding the store. He charges him never to touch the well, but eventually the boy slips partway over the edge and his hair dips in. This turns it to gold.
Iron John sees what has happened in his absence and sends the lad away, deciding it is time for him to leave Toyland and enroll in the School of Hard Knocks. So the Prince hits the road. Eventually, he enters the service of a foreign king. He proves his mettle in battle, then wins the hand of the king’s daughter. Finally, he is reunited with his parents at the royal wedding. Iron John shows up, too, but he no longer looks like a Yeti. He is a glorious, golden king, freed from a curse which had transformed him into his apish form until he should find someone worthy of freeing him.
Bly saw in this tale, quite plausibly, a lesson of male maturation. The central character is not really the boy who matures but rather his initiator, without whose guidance and challenges the youth cannot mature. A male mentor (father, uncle, coach, whatever) separates a lad from his mother’s doting care. He helps the boy to discover his male energy, what Bly calls his “Zeus energy.” It is power, ferocity, animal instinct. The young man must by no means suppress his urges but instead learn to find their proper place, the right times and occasions in which to manifest them. The ideal would be a man who is quick to show compassion, but from a position of superiority: one for whom a compromise is condescension rather than surrender, who is “big enough” to forgive; i.e., he forgives freely, not because he is obliged to do so.
In our bizarre era, male mentoring has become more and more rare, and this is the result of the breakdown of the family unit. On the one hand, “Great Society” programs have gutted ghetto families by invalidating the husband-father’s labor: government assistance became easier and more lucrative than working at a job. No longer needed and feeling useless, these men are robbed of their natural dignity and abandon their families. Their male children have no adult male to assist their transition to maturity. Their Zeus energy has to go somewhere, so these kids join violent street gangs, seeking legitimation and affirmation, i.e., respect, through crime, proving their valor by daring but destructive exploits.
On the other hand, in the hands of feminist ideologues, young men are urged toward “androgyny,” i.e., emasculation, feminization, sissification. Their Zeus energy is extinguished along with any idea of heroic achievement, of individual greatness that earns its stature above the faceless collectivity.
Woodman, Spare That Tree!
Neo-Freudian Jacques Lacan speaks of socialization as the imposition by society of “the law of the father.” Individual identity depends, ironically, upon definition in terms of the culture one is born into. And Robert Bly is, I think, affirming the importance of that patterning. But I believe his approach is predicated on the rejection of one of today’s regnant dogmas, that gender identity is a purely arbitrary social construction with no necessary relation to biology. By contrast, Bly presupposes the existence and importance of biologically assigned and essential maleness and femaleness. If these are neglected or allowed to go out of control or are denied by fiat, there will be destructive consequences to the individual and to society. Biology is destiny, as Freud said, at least insofar as it provides what Aristotle called an entelechy, an inborn tendency which may or may not be fulfilled, depending on external circumstances. The entelechy of an acorn is to become an oak, which it will actually do if not thrown off track by some tree disease or by a guy with an ax. Likewise, males are born with maleness, females with femaleness, gays with gayness, and to reroute that train somewhere along the line will most likely cause a trainwreck.
A rival vision is that of Gilles Deleuze and Félix Guattari (in their Anti-Oedipus: Capitalism and Schizophrenia), who issue a call to freedom, a deconstruction of the indoctrinated law of the father, giving birth to a kind of Dionysiac schizophrenia. This is what the Politically Correct Radical Left is promoting when they seek to dissolve traditional family structure, marriage, and gender identity. You can, they say, decide what gender you are, and some offer over fifty boxes to check. One may even, they say, choose which race to belong to, and this from the same ideologues who complain about “cultural appropriation”! On college campuses they force the adoption of artificial gender-neutral pronouns.
Ultimately these trends must create a condition of anomie, normlessness, moral and political anarchy in both individuals and society. I contend that such freedom, which is but a euphemism for “nothing left to lose,” is Nihilism. It creates a mind that is so open that it needs to be closed for repairs. Integrity means you’ve “got it all together,” that your thoughts and actions are in harmony and that your values are consistent with one another, and with certain underlying principles. I submit that it is not possible to have moral character, hence personal maturity, if one lacks such a foundation. Thus I think it should be no surprise that today’s Leftists embrace the utterly cynical tactics of the devilish Saul Alinsky. “By any means necessary.” Uh, necessary for what? For the “freedom” that is really just an epileptic spasm? We are already feeling the sea-sickness from the ship pitching on the agitated ocean.
One of my all-time favorite films is Ingmar Bergman’s Fanny and Alexander (1982). The Ekdahl clan operates a local theatre and drama troupe in the early days of the twentieth century, so it is natural for them to see things from a theatrical perspective. Once the curtain falls on the annual Christmas play, theatre owner Oscar Ekdahl offers some year-end reflections, and he speaks of the function of their theatre and of all theatres. “I love the little world inside the thick walls of this playhouse… Outside is the big world, and sometimes the little world succeeds for a moment in reflecting the big world, so that we understand it better.” Oscar might have called the theatre a mirror of the big world. He pretty much did. How does the theatre mirror the world in such a way as to make it understandable?
The assumption is that our ordinary perception of the big world is (to borrow Kant’s term) an “undifferentiated manifold of perception.” Oscar describes his thick-walled playhouse as “a small room of orderliness, routine, conscientiousness, and love.” That is not like the big, blooming, buzzing world outside, an onrushing ocean of random experience and nigh-unintelligible signals. But once that big world is represented inside the little world, it will begin to take on the orderliness of the little world. It must or it cannot be squeezed inside. This means that the plays performed inside have selected certain elements of the larger, “real” world and imparted structure that the outer world lacks. In thus choosing key features of that big world to represent it, we make that world understandable, isolating the clues to what the playwright thinks is “really” going on. The outer world is a dumb block of marble from which the playwright extracts an intelligible object. He might have chosen a different resultant shape, and of course other playwrights (interpreters) do all the time.
Fun House Mirror
Plays, movies, television shows, and novels can all function as mirrors in which we see ourselves. We may identify with the characters portrayed there, but it can be just as revealing when we find ourselves despising characters. Is it possible we react so strongly to characters whose negative traits mirror our own? It is a way of “projecting” à la Jung (or engaging in “reaction formation,” as Freud called it). We despise these depicted traits, scapegoating them onto fictional characters (and onto other real-life people).
If your reflection in the mirror of media seems to you an offensive distortion, you would be well advised to take a closer look at that image. Is it possible that the disconnect is between the truth about you and your carefully crafted image of yourself? It would be like most people’s reaction to first hearing their recorded voice: “Uh, I sound like that? Ugh…” And it may not be only your self-image that is reflected stripped of illusion; maybe your accustomed worldview is endangered. I think of H.P. Lovecraft’s story “The Outsider.” The title character is a mysterious amnesiac who has just emerged from a buried castle and blunders into a party full of genteel gaiety. Upon his entry, the partiers scatter in panic. The Outsider then beholds the revolting, horrific figure that so frightened everyone else. Initially paralyzed by the same fear, he cannot flee. But the horror does not move either. As he reaches out toward the intruder, he sees the monster, a hideously decayed figure, reaching out toward him. His questing finger meets that of the other, but in that moment it touches an unyielding glass surface, not of a window, but of a mirror.
Through the Looking Glass
There are a couple of Star Trek episodes that require discussion here. The first is “Mirror, Mirror” in which Kirk and some of his crew get switched with their counterparts from an alternative universe where things are, to put it mildly, topsy-turvy. Instead of the benevolent Federation, we find the malevolent Empire. The whole crew has turned into their Hyde versions. Kirk has to play a creative, desperate game to maintain the required façade while finding ways to avoid the brutal atrocities expected of him. Back in the familiar Star Trek universe, the alien Kirk quickly reveals himself as a raging Caligula. Interestingly, Spock is a constant in both worlds! One might have expected he would be letting his human half rip with unbridled emotions! And this is a clue.
Why would the Mirror world versions of the characters be negatives? The answer is to be found in the other episode, “The Enemy Within,” in which a transporter beam malfunction splits Kirk into two, one possessing the original’s intelligence and conscience, the other possessing Kirk’s courage, decisiveness, and basest instincts. So it turns out that in the Mirror universe the characters’ baser nature is dominant, while in “our” world their noble character is manifest.
Parenthetically, “The Enemy Within” is a great parable of the Jewish doctrine of the Good vs. Evil Yetzers, or instincts. Every person has both, like the angel and the devil perched on either shoulder, each trying to convince a morally conflicted mortal to follow his advice. When we sin, we are yielding to the Evil Yetzer, but the pious person mustn’t seek to eradicate the evil tendency as religious Perfectionists (like Christian Holiness Sects) do, because you need that evil instinct! Why? Without cruelty we should be too soft on criminals. We would not fight Nazis and terrorists. We would be suicidal bleeding hearts, and the world would descend into chaos. Perfect righteousness in a wicked world aids and abets evil. So you need both.
It can be quite the rude shock, one we would rather deny and discard. James 1:23-24 describes us well, alluding ironically to “a man who looks in the mirror at the face he was born with, then goes away and at once forgets what he looks like.” James’s metaphorical mirror is scripture, but it can be any kind of feedback. If we don’t like what we see, it is up to us either to accept the mirror’s rebuke and make adjustments or to continue repressing the awareness of our spiritual or ethical ugliness. We would rather keep seeing the world through rose-colored glasses, or rather through mirrored lenses facing inward.
Back to Fanny and Alexander. After Oscar succumbs to a stroke, his widow marries Bishop Edvard Vergerus, a universally respected moral leader of the community. It does not go well. The challenges of his wife, a talented actress and strong-willed individual, and of his devilishly smart new stepson Alexander, who refuses to take any crap from him, rapidly force Edvard’s cruelty and sanctimonious hypocrisy into the stark light of day. He is horrified by what is revealed: “I always thought people liked me. I saw myself as wise, broad-minded, and fair. I had no idea that anyone was capable of hating me.” But does he change? Hell, no. He still sees himself as wise, broad-minded, and fair, despite all the evidence!
The Mirror Master
Remember the Twilight Zone episode in which a nervous, sweating little man named Jackie, a small-time crook, awaits his boss’s orders for the night’s job? He catches his image in the mirror of his dingy fleabag hotel room. Of course he has done this, and seen this, thousands of times over his disappointing life, but never like this, because his scowling reflection begins talking back to him, upbraiding him for his life-long cowardice and begging him to let his long-suppressed better side out of the mirror so he can take the reins before it is too late. At first terrified at the prospect of change, Jackie finally realizes the contemplated switcheroo is the only thing that can save him from his soon-to-crash life of crime and cowardice. Jackie becomes John and strides out to make a new life. It’s possible.
The image of (and in) the mirror reminds us of the motto of Apollo’s Oracle at Delphi: “Know thyself.” That is what we are talking about here. And there is more in that mirror than you may think. I go back to Fanny and Alexander one more time. My absolute favorite character in the film is Uncle Isak, Isak Jacobi, a family friend who pretends to be an antique dealer but is really a dweller in his own private museum. He lives with his nephews Aron and Ishmael. Aron creates fantastically elaborate marionettes and puppets, and these are the real source of their income. We are back to the “little world” in which we gain a glimpse of the “big world.” Isak’s dwelling, crammed with antiquities and curiosities and mystic relics (like a still-breathing mummy), is a microcosm of the divine macrocosm of the Kabbalah, of which Uncle Isak is a student and an adept, a magician in fact. At the center of his labyrinth, locked in his private cell, dwells Ishmael. Appearing to be an androgynous angel, the Angel of the Lord, and in essence God himself, mild and gentle, he is nonetheless dangerous because of the terrible knowledge he possesses and power he wields.
In 1 Corinthians 13:12 the poet writes, “Now we see in a mirror dimly, but then face to face.” Ever ask yourself whose face is seen in a mirror? Your own, right? But the image here refers to knowledge of God. I think Uncle Isak would know what this means.
Seven Years Bad Luck
The mirror doesn’t necessarily reveal the truth, since what we see in it may vary with our perceptions, our perspective. Yet this is nonetheless an apt reflection, because things are not just one certain way. It is a pluriverse, not a universe. It is up to us to observe and select phenomena and to act on what we see. It’s up to us to make the image into the reality, to make the little world into the big world. An example, though perhaps a strange-sounding one, might be theologian Gordon Kaufman’s argument that, since God must be incomprehensible, God cannot be simply described, and thus humans will necessarily create models, images, to stand for God. The danger, as Paul Tillich warned, is to confuse our God-concepts with the God-reality and so commit idolatry. But there is no other option, and we just have to be mindful of what we’re doing. Religious leaders and teachers, Kaufman says, must take responsibility for fashioning a God-model that can serve as a moral paradigm for religious people, one incompatible with, say, burning witches, suppressing science, or beheading infidels.
But this is not just an example. Rather, it is a key to what is going on every time we construe what we see going on in the world and make an assessment of it. We are responsible for selecting what seem to us the relevant and important facts and factors we will use as the touchstone for interpreting and evaluating everything else. This process is inevitable and proper, but, of course, different individuals and different identity-groups see (and mirror) things very, very differently. This makes for a smashed mirror, a pile of sharp-edged shards, and that’s where we’re at as a culture. If you step back and try to make sense of the whole thing, you can only compare it to the Tower of Babel, the Land of Confusion.
It’s not the sheer diversity of beliefs and opinions that shatters the mirror, but the increasing intolerance of one faction toward another, the refusal to engage one another in dialogue, the attempts to shout one another down, even to riot and destroy. A shattered mirror and a patchwork quilt are very different things. Our society is fast becoming a David Cronenberg movie in which the parts of the body are at war with each other and the organism rips itself apart. “The eye cannot say to the hand, ‘I have no need of you,’ nor again the head to the feet, ‘I have no need of you’” (1 Corinthians 12:21). But that’s what we’re doing in America today. Maybe if we look seriously into the national mirror and realize what’s going on, we can snap out of it and start gluing Humpty back together again.
I get a huge kick out of Donald Trump, and this is not just because I like what he’s doing. I’m talking about his unfiltered remarks like asking the clergy at the National Prayer Breakfast to pray for Arnold Schwarzenegger’s ratings on Celebrity Apprentice. I loved every crack, every dig, every barb he let fly during the debates. I just may be in the minority here, I realize. I even like the way he brags on himself and his achievements. But Trump is not my point here. He’s simply a prominent example of what I want to discuss this time. Is it bad to brag? Not necessarily.
Many of the President’s critics seem to pounce on anything he says, just because he is the one saying it. They like to suggest that Trump’s boasts are dead giveaways that he is deranged and should be removed from office. Others, with less of an axe to grind, are content to accuse Trump of overcompensating for an inferiority complex. That may be, but it is a little disturbing to me when bold self-confidence is taken as a symptom of insanity.
It is of course pathetic when people boast who have nothing to boast about. But I think the contempt we automatically feel should morph into compassion, even if it’s only to think, “Poor schmuck!” But what if the boasting is justified? Sometimes it is. Some people are great and have done great things. When they brag, they are simply being realistic. I won’t say that, in that case, it is humility or modesty, but it almost is.
I find it more offensive when people clothe themselves in false humility, either because they are fishing for compliments or because they superstitiously fear being struck down by Karma if they do openly rejoice over their talents or their accomplishments. Such humility may mean that a gifted individual is shirking his destiny. I just watched the old movie Abe Lincoln in Illinois, which depicted unassuming Abe as just wanting to be left alone. His shrewish wife, Mary Todd, made life miserable for him, but she was right to recognize his destiny and to nag him into fulfilling it. Was Abe’s humility an admirable trait? On one level, yes. But on another, this seeming virtue was a stumbling block. Conversely, the braggadocio of someone like Trump may just mean he is full, not so much of himself, but of his destiny. I’m no mind reader, so I don’t pretend, as others rush to do, to psychoanalyze a man at long distance, through the TV. I’m exploring something in the abstract.
Proverbs says, “Don’t praise yourself; let others do it” (Proverbs 27:2, paraphrased). I like that. But I feel that proper humility is, again, essentially just realistic self-assessment. If you’re talented, admit it and show you are thankful or lucky. There is no pretension there, and it’s pretension that I hate. I guess the dividing line is what you are trying to achieve by acknowledging your greatness. Are you fishing for compliments? That is pathetic. But when the great remark on their greatness as a simple observation, what’s so bad about that?
Many years ago, the pastor of my church plainly viewed himself as a new Kierkegaard. But that was okay with me, since he was a new Kierkegaard. He wasn’t trying to get you to think he was the new Dour Dane. You could just pick up on his self-understanding. If he had been self-deluded, it would have been ludicrous. Quite a difference.
Sometimes great people chafe those around them with their self-absorption and their tendency to take their followers’ efforts for granted. You may not like their “dictatorial” style, their headlong pace, their insistence on their own way. But don’t you see? That’s part of the package! It’s not necessarily narcissism; it’s momentum. The difference is that they’re not in love with themselves. Instead, they’re in love with their mission. I have been lucky to know and to work with such individuals.
Paul Kurtz was one such. It was sometimes hard to work with him. He tended to alienate donors who, e.g., wanted their contributions directed to some particular project, often of their own invention. Paul always responded, “Thanks, but why don’t you just donate the money and let us decide what to do with it?” He’d hire people to do some job but then wouldn’t let them do it. These were flaws, but they were side-effects of his strengths, without which there would have been no Council for Secular Humanism, no Center for Inquiry, no Prometheus Books, no Free Inquiry.
Bob Funk, co-founder of the Jesus Seminar, was another powerhouse, another human locomotive. You could climb aboard, but he was the conductor, and people occasionally got run over. But no Funk, no Jesus Seminar. No Westar Institute. No Fourth R. No Five Gospels, and so on and so on. You just had to get used to it. Was his endeavor something you wanted to lend your energies to? If so, you had to get used to not being pampered. Maybe not even appreciated. Ask any soldier.
I never met him, but Arkham House publisher, naturalist, poet, and author August Derleth was yet another one. He was a dynamo. He made plenty of omelets and did not hesitate to break some eggs. He cut corners. He lived a sloppy life in some respects and invited numerous criticisms, but these volcanic doers and achievers are hurtling onward and cannot always contain themselves or pause to ponder every decision. Such is the price they, and we, pay for the creativity and dynamism that create the things we remember them for.
Should they pause and submit their ideas to an oversight committee? There might be less controversy, but less would be achieved. Here I believe Ayn Rand was right: the collectivity dilutes and smothers the unique greatness of the gifted individual. Nietzsche was right, too: the Superman must not allow himself to be bound and gagged by the Lilliputians who hate him for being a living rebuke to their cowardice and mediocrity.
The danger, of course, is that an impatient achiever may exceed the prescribed limits of his power. Such an executive may become a dictator. You remember how the Roman Republic made provision for a legally empowered dictator in times of emergency, when there was no time to weigh all opinions. (Here, for some reason, I can’t help thinking of the time I was invited to address an Ethical Culture Society meeting about the historical Jesus question. They went around the circle having everyone present pose their questions for me to address all at once. But when they were finished, the meeting was over! No time left! They felt, as they admitted, that it was more important to hear all the members out than to hear the invited speaker! Ayn, are you listening?) I suspect what the Romans did would be impossible today, to our peril. Even when decisiveness is needful, I fear we will be paralyzed with dithering and debating.
I do have one suggestion for curbing megalomania, for of course even the genuinely great (Greek megalé) can suffer from megalomania. I think the key to perspective on oneself is having a sense of humor about oneself. Be able to see the comical dimension of yourself. Don’t take yourself too seriously. Catch yourself being pompous and think how silly others look doing the same thing. Go ahead and brag if you’ve got something to brag about, but be just as quick to laugh at yourself. That ought to do it. That’ll puncture your hot air balloon. As a windbag, I know whereof I speak.
I have many times pointed out that, despite my polemics against Christian apologetics and my critiques of Christian theology at vital points, I have no interest in persuading individuals to discard their allegiance to the Christian religion. None of my damn business, I always say! Some readers must think I am being disingenuous, intentionally misleading people, or sincerely deluding myself. Some fellow atheists no doubt wish I would just come right out and declare opposition, enmity, to Christianity and join them as comrades-in-arms. But I don’t want to, and it’s not from any desire to lull Christian readers/listeners into a false sense of security before I lower the boom. So I thought it might be worth trying to explain my approach in positive terms: what I am trying to do, not what I deny doing. And of course the natural way to begin is to compare myself with Satan. Here goes.
No, I’m not a Satanist, though I have occasionally thought how fun it would be to declare myself one. Just imagine the publicity: the first Satanist New Testament scholar! What a riot! It might even get me a spot on TV, you know, one of those quick “freak show” features. But no.
If you are a Bible Geek listener, you have probably heard me explain how the Satan character began as a special son of God or angel in charge of divine sting operations. The Satan (at first it wasn’t even a proper name, but a noun meaning “the Adversary” in the sense of a prosecuting attorney) was so zealous for the honor of his Lord that he kept close tabs on mortals and was quick to sniff out any whiff of pretense on the part of God’s ostensible servants. What was their real motivation? Let’s find out! So the Satan would mount a scenario to “try men’s souls.” Was Job really as pious as his reputation would suggest? Try taking away his material wealth, his family, then his bodily health and see if he’s still such a big fan of the Almighty. Is King David’s reliance on God’s mighty arm absolute, or does he have back-up plans in case God disappoints him? Maybe whisper in his ear that it might be prudent to take a census of available soldiers?
Obviously, all this is completely mythological. And that’s not just compared with modern, materialistic science. No, there is a far more serious clash with theology: the scenario of a deity sitting on his throne up in the sky, surrounded by a flock of godlings including a special agent in charge of security—it is sheer, polytheistic myth. Is that the god you believe in? No, of course your God-concept is defined (or redefined) by the philosophical reasonings of Saints Augustine, Anselm, and Aquinas, even if you’ve never heard of them. You’re kidding yourself if you think you’re getting your God straight out of the Bible. It’s far from being that simple. Do you believe “God” needs an intelligence agency? Isn’t he supposed to be omniscient? You take that for granted as “baked into” the very definition of God, but the Bible writers sure didn’t. But back to Satan.
Satan became a villain subsequent to the massive Persian (Zoroastrian) influence on post-Exilic Judaism. Jewish thinkers saw the utility of the Zoroastrian concept of Ahriman, the nefarious anti-god responsible for all the evil in the world. That would seem to get God off the hook! So the once-innocent Satan was retconned as a Jewish Ahriman. But not consistently. If you wanted to harmonize the conflicting concepts of Satan, I guess you’d say he went from seeing if you’d get caught in the act of sinning to trying to get you to sin. From testing to tempting.
But even in the New Testament, he’s most often depicted doing his old job, testing the servants of God, to see what they’re made of, as when Satan meets Jesus in the desert and asks if he’d want to change rocks into rock candy, leap tall buildings in a single bound, and swear fealty to him instead of God. He seems to be putting Jesus, newly crowned Son of God, through his paces. Like Yoda.
My diabolical task, as I see it, is that of the loyal opposition. I have too much experience, much of it quite positive, with religion in general and Christianity in particular, simply to fight against it tooth and nail. It would be pathetic and quixotic. It would say more about me than about Christianity. I would have turned into a crazy, bitter ex-boyfriend. No thanks.
I have seen so much of Christians of all stripes and of Christianity in its many variations that I cannot pretend there is no good side to it. There is much to be loved, and I still love it. And this sentiment seems to me basic to any study of religion, period. You have to try to understand Islam, Buddhism, etc., from all sides including the inside. Unless you see what is loveable about it, you will never see why its adherents love it.
I disagree with Christians in their beliefs (even while agreeing on many moral questions), but that doesn’t make me their enemy. Remember the saying, “Love the sinner and hate the sin”? My motto is “Love the believer but reject the belief.” Thus when I refute Christian apologetics, I am trying to defend the Bible, which they and I both love, from an opportunistic abuse of it. When I critique points of Christian doctrine, I am weighing it in the balance, challenging Christians to purify their creed, to get their act together. I am demanding they stop selling an inferior product. I may even suggest how they might go about it. For instance, getting rid of the sadistic superstition of hell which corrupts moral motivation and demeans a supposedly righteous and loving God. (I was already doing this a few paragraphs up when I pointed out the difference between biblical and philosophically chastened God-concepts.)
Personally, I have come to the opinion that the patient is too far gone, but that hardly settles the question. And no matter who is ultimately correct, it has to be in everyone’s best interest for Christianity to purify itself, to reduce the amount of mind-twisting nonsense it requires its adherents to sign on to.
So you see, I am playing the role of Satan, trying to keep Christians and Christianity honest, to make them honest, exposing their intellectual dishonesties, their suppression of their better judgment, out of loyalty to a party line. Such faith-based fudging corrupts and vitiates the faith in whose name it is deployed.
I once scoffed at the claim made by fundamentalists, advocates for “Scientific Creationism,” that Secular Humanism is in fact a religion, a variety of paganism. Their aim was to dismiss Evolutionary Biology as the “creation myth” of this rival faith. And if it is okay to teach the Secular Humanist creation myth, why can’t you teach Creationism in public schools? Of course, fundamentalists did not mean to say (to admit) that their account of divine creation in six days is itself a myth. They were just saying that Evolution and Creation were alternative accounts of origins and of the diversity of life-forms, each account with its own reading of the data of biology, geology, paleontology, etc. They were willing to admit that their own version was dictated by religious presuppositions, and they wanted evolutionists to admit that their version was equally the result of “religious” presuppositions.
As I say, I have always rejected such claims, and I still do. As I read the works of Creationists and Evolutionists, it seems to me that Creationism’s approach is completely deductive. They feed the scientific data through a theological meat grinder, like old Procrustes, the predatory innkeeper who directed his taller guests to place their legs through holes in the footboard of the short bed, after which he would employ his trusty hacksaw. If the guest was shorter, Procrustes would handcuff him to the headboard and place his feet in stirrups, then start cranking the rack! The guest would fit the Procrustean bed come hell or high water! The “scientific theories” offered by “Scientific Creationists” are farcical pseudoscience.
I agree with the late, great philosopher of science, Paul Feyerabend (Against Method), that it doesn’t matter where one derives one’s heuristic paradigm. “The only axiom that does not inhibit research is ‘anything goes!’” Velikovsky got his astronomical model from the Bible and other ancient sources (though not in the same manner as Scientific Creationism). But so what? Might as well test it out! On one or two of his guesses he lucked out. Harry Rimmer and Henry M. Morris got their “paradigm” from Genesis chapters 1 and 6-9. You can’t laugh it off a priori. If you did, that would be the genetic fallacy. You have to check it out: how naturally, how economically, does it seem to fit the data? Does it depend on a lattice of ad hoc hypotheses? If so, well, next contestant, please! This is where and how Scientific Creationism strikes out.
Evolutionism, by contrast, proceeds inductively, forming a tentative hypothesis, trying to connect the dots provided by the evidence. Scientists are not trying to accommodate the data to an alien framework, whittling square pegs so they can be jammed into round holes. And, as Ed Suominen and I have argued in Evolving out of Eden: Christian Responses to Evolution, it turns out that there is no place left for a Creator as a needful causative factor. Not only do the data not require an Intelligent Designer; the data are incompatible with such a Being. He or She or It dies the death of a thousand cuts by Occam’s Razor. One does not start out with a commitment to Atheism or philosophical Naturalism; such a belief is instead the result of the inquiry.
So Creationism and Evolutionism are not on a par. Evolutionism is not religious in the sense Creationists claim, as if scientists were having to manipulate the evidence into conformity to prior (atheistic) beliefs. But in recent years I have begun to think that in another, equally important, sense, Secular Humanism is after all a religion, a set of incorrigible dogmas held, essentially, by faith. I think playwright David Mamet sums it up pretty effectively.
The new religion will not be identified as such. It will be called Multiculturalism, Diversity, Social Justice, Environmentalism, Humanitarianism, and so on. These, individually and conjoined, assert their imperviousness to reason, and present themselves as the greatest good. (Mamet, The Secret Knowledge: On the Dismantling of American Culture, p. 39)
But hold on! Mamet is talking about political Liberalism, not Secular Humanism, right? True, but my point is (and this will not surprise you) that Secular Humanism is almost completely committed to Leftism. There is no place in it for those who, though Atheists and Naturalists, are of a politically conservative bent. I served as the Director of the New York Metro Center for Inquiry until the higher-ups fired me because I voted for George W. Bush. Don’t get me wrong; I’m not bitter about it.
Nothing could change my fond affection for the late Paul Kurtz and my friends Tom Flynn, Eddie Tabash, Joe Nickell, Jan Eisler and others still affiliated with the Council for Secular Humanism. And in retrospect, my departure was inevitable precisely because of our political differences. I didn’t expect it, but clarification is nothing to regret, much less to resent.
And what became clear to me was the indistinguishability of Secular Humanism and Political Leftism. It has become even clearer very recently when Director Tabash sent out an announcement that the primary mission of CSH/CFI going forward would be to battle the policies of the incoming administration of Donald Trump, for whom I happily voted.
I am a skeptic vis-à-vis the Paranormal, the Supernatural, Superstition, and Metaphysical Idealism (especially including Theism). It seems quite natural to me to be equally skeptical toward political ideologies, especially those which refuse to take seriously hard facts of human nature, both individually and collectively. I am skeptical of Utopianism, Globalism, and Socialism. Their most vocal adherents are dogmatic, intolerant, and naively optimistic. They are, as Freud said of conventional religion, projecting a wish world onto the real world. They shame, shun, and despise all who do not share their faith. The complacent arrogance of these believers who simply disdain conservatives as fools and villains (I am frequently their target, not that I’m complaining) is, as I see it, exactly like that of cock-sure evangelical apologists who write off religious skepticism as a smokescreen for moral turpitude.
Secular Humanism bears another prominent mark of religious faith: the Chicken Little apocalypticism of Climate Change. Global Warming believers repeatedly set deadlines for melting ice caps, species extinctions, rising sea levels, etc., etc. In short, the Great Tribulation. And like the doomsday deadlines of Jehovah’s Witnesses and Hal Lindsey (The Late Great Planet Earth), these predictions are embarrassed again and again. Judge Rutherford, Harold Camping, Al Gore—what’s the difference?
By contrast, political conservatism is pessimistic, skeptical, chastened and sober. True, I am no climatologist, but “you don’t have to be a weatherman to know which way the wind blows” (John 3:8). I practice the hermeneutic of suspicion. I ask “Who benefits?” and it seems to me that the whole business is intended to promote two things: first, the gradual transfer of power from elected officials into the hands of unelected technocrats, producing “a state whose parameters, definitions, and prescriptions are controlled by a self-selecting group of ‘experts’ who can never be proved wrong” (Mamet, p. 36).
And second, Globalist redistribution of Western wealth (instead of encouraging capitalistic development, teaching the poor nations to fish instead of just giving them a fish). It even morphs into pious asceticism, the back-patting self-righteousness of needless self-denial. “So the new [religious] group, which is the Left, is prepared and is in the process of sacrificing production, exploration, exploitation of natural resources, and an increasing standard of living upon the altar of ‘global warming.’” (Mamet, p. 40).
One thing that makes me suspicious about climate apocalypticists is that they fudge data. They intimidate the rest of us, telling skeptics to just shut up because “the science is settled.” But it ain’t. They boast that 99% of scientists believe in it, so that, if you don’t, you’re a Flat Earther, as dumb as a Creationist. But I can’t help suspecting that the Birkenstock is on the other foot. I gather the number is more like 52%. And there are reports, including leaked e-mails, that reveal the fudging of data and the enforcement of orthodoxy by barring dissenting views from publication. Sounds like cynical apologetics to me. Oh, I admit Global Warming might be real anyway, but I’m not joining the parade. I am firmly convinced of the Secular “creation myth” of evolution, but I remain wary of its apocalyptic myth. (You may be rolling your eyes at that, but doesn’t that very reaction denote your orthodox intolerance for heresy?).
Mamet is correct:
This new group will, of course, like any group in history, create taboos and ceremonies of its own. But to ensure solidarity these new observances must absolutely repudiate the old; and the cult will indict these previous observances as, for example, paternalism, patriotism, racism, colonialism, xenophobia, and greed. (p. 39).
What else are Politically Correct speech codes and denunciations of “cultural appropriation”? The quota systems and ethnic self-segregations, the trigger warnings and micro-aggressions, the equation of patriotism with jingoism and racism? The Cromwellian-Orwellian crusade to eradicate expressions of traditional faiths in public life? It all amounts to the Shariah of the Left.
I used to say that Secular Humanism was not an alternative kind of religion but rather an alternative to religion. I was wrong.
So says Zarathustra.
The survey of AMS members found that while 52 percent of American Meteorological Society members believe climate change is occurring and mostly human-induced, 48 percent of members do not believe in man-made global warming.
Furthermore, the survey found that scientists who professed “liberal political views” were much more likely to believe in the theory of man-made global warming than those without liberal views.
“Political ideology was the factor next most strongly associated with meteorologists’ views about global warming. This also goes against the idea of scientists’ opinions being entirely based on objective analysis of the evidence, and concurs with previous studies that have shown scientists’ opinions on topics to vary along with their political orientation,” writes survey author Neil Stenhouse of George Mason University.
If you’re a Roman Catholic, November 1 is All Saints Day. If you’re a Protestant, November 1 is Reformation Day. So what did I do back in the early nineties when I served as pastor of First Baptist Church in Montclair, New Jersey? Well, naturally, I invited a Catholic monk to preach to us on Reformation Day. (I also invited a Sunni Imam to speak to us on Trinity Sunday, but that’s another story.) Here I would like to draw attention to one particular aspect of Catholic-Protestant differences summed up in the Reformation: the democratization of biblical learning. History is now repeating itself.
In the beginning (don’t you like the sound of that phrase?) the question of lay versus professional access to scripture was entirely moot since most folks were illiterate and the labor involved in hand-copying the Bible made it simply impossible for private individuals to have their own copies anyway. But the Reformation spearheaded by Luther and his pals and the invention of Gutenberg’s printing press coincided and changed everything. It soon developed that anybody (Protestants at least) could read the Bible and actually had the opportunity to do so. But both events are now so far behind us that it feels as if the circumstances they introduced have been with us forever. They haven’t.
We have forgotten (because we never knew) that the Bible was not written for the common reader in the first place. The sacred documents were produced by scribes and priests for the use of their elite colleagues. For one thing, that sure explains the Book of Leviticus! Can you stand the tedium? Have you ever tried to do some casual reading in the Talmud? I wouldn’t recommend it unless you happen to be a Rabbinical student. But if you are, then Bon Voyage! In the same way, Leviticus is a handbook written for the use of Levitical priests, hence the title. It’s just like the Sama Veda in Hinduism.
Today the Bible is widely read, even memorized, but (I dare say) almost just as widely misunderstood for the same reason. It takes some hard-earned expertise to know what to make of it. But wait a minute—am I saying that Psalm 23, the Sermon on the Mount, and 1 Corinthians 13 are beyond the grasp of the average reader? Of course not. Or maybe I am. To put it more charitably, or to be fair about it, we might recall what Augustine said about scripture: it is a river in which a mouse can wade and an elephant can swim. Millions of faithful Bible readers go through the Book of Psalms again and again, but they frequently find themselves startled awake from their pious reverie and stymied at strange-sounding references to the king, his battles, and God killing dragons. What on earth? Similarly, people delight to read the gospels for their edifying content, failing to notice (as I did for so long) the contradictions in reports of the same events or sayings. Or if they do notice them, they treat them as momentarily annoying gnats to be waved away. But my axiom is this: there is nothing more pious than understanding the text. For its own sake, but also because otherwise you are likely to make the Bible into either a ventriloquist dummy or a Rorschach blot.
Once the sacred text became available to public scrutiny, the Catholic Church discouraged, even forbade, its flock to read it. Why? Because they didn’t want to put a machine gun into the hands of a Chimpanzee. (Reminds me of the time S.T. Joshi quipped, “Lovecraft is being read by the wrong people.”) Better leave the Bible to the pros. Common laity would just not be able to make sense of what they’d read, and they’d launch off in all sorts of heretical directions. And they did, though they wouldn’t have described it that way. But this wasn’t exactly the problem, was it? Martin Luther, the arch-heretic himself, was no uneducated layman. The guy taught scripture at a university! He was a Catholic monk, for Pete’s sake! He was skilled in the use of what scholarly tools were then available. His “sin” was to disagree, to reject the Church’s party line. The Catholic hierarchy said that if everyone were to read scripture for themselves, every man would become his own Pope. But what they were really afraid of was that everyone would become his own Luther! And everyone did.
Well, maybe not everyone. Most Protestants continued to understand scripture, even when they did read it for themselves, within the specific boundaries laid down by the hierarchy; they were just changing the channel, now reading the text the way their new church authorities (Luther, Calvin, Zwingli, etc.) said to. Eventually the Catholic Church seems to have realized that the issue was not so much the pew potatoes reading the Bible instead of following orders. Rather, it was a question of which/whose tradition they would follow in interpreting it. Once they grasped that, in the early 60s the Catholic leaders changed their tune: “Go ahead and read it! We want you to! Just remember who tells you what it means.”
Something quite similar is happening today. Bible scholarship from “unauthorized” perspectives has spread like never before via the Internet. Fundamentalist apologist Josh McDowell has warned his partisans that the Internet has become the greatest threat to the Evangelical movement in that it makes atheist thought and arguments readily available to curious Christian youth who would never have encountered them otherwise.
There are also various websites which have sparked a great revival of interest in the theory that Jesus did not exist as a historical figure. As you would expect, conventional biblical scholars (whether Christian believers or not) are disdainful of such sources, regarding them as founts of religious misinformation. The late, great New Testament scholar Maurice Casey deigned to address such heretics in a couple of books published just shortly before his death. In them he scorned these non-professional scholars for lacking proper credentials, referring to them as “Blogger So-and-So.”
This condescension is not exactly snobbery, as is sometimes suggested. What is implicit (explicit?) in such language and such polemics is an agenda of safeguarding the reputation and influence of a professional guild. Peter L. Berger and Thomas Luckmann put it well in their great book The Social Construction of Reality: “The outsiders have to be kept out… If… the subuniverse [of meaning] requires various special privileges and recognitions from the larger society, there is the problem of keeping out the outsiders and at the same time having them acknowledge the legitimacy of this procedure. This is done through various techniques of intimidation, rational and irrational propaganda…, mystification and, generally, the manipulation of prestige symbols.”
If this consideration were not operative, we should never be hearing the carping about whether “Blogger So-and-So” has the right degrees or even any degrees at all. All we should be hearing is whether Blogger So-and-So’s arguments make sense. And not simply that they don’t but why they don’t. Personally, I do have the relevant degrees and, though I am a Jesus Mythicist myself, I have not hesitated to critique Mythicist arguments from several authors whose books I find unconvincing. And I think my criticisms are severe enough to risk personal alienation from their authors.
Some scholars who reject Christ Mythicism do not mince words and freely admit that they would automatically reject any candidate for a university teaching post in New Testament if the candidate, though appropriately credentialed, espoused the Christ Myth theory. Until recent years, the same barrier faced scholars like the great Thomas L. Thompson who advocated what is now called “Old Testament Minimalism.” No more. Why not?
As Michel Foucault said, every age, every generation, upholds (whether they know it or not) an “archive” of “accepted” knowledge, information, and assumptions, the sharing of which makes it possible for scholars to come together for discussion. It is not bigotry when someone like me finds himself the odd man out and not invited to the party. It’s just that you have no game if the players cannot agree to a common set of rules. But what do you do if you are one of those wallflowers at the dance? Well, of course, you start your own party.
I believe I witnessed such a sorting out process in the evolution of the Jesus Seminar. When John Dominic Crossan and Robert W. Funk started the group, it seemed like just about every prominent New Testament scholar was on board. But the diversity of perspectives soon revealed itself as an obstacle to the Seminar’s mission rather than a formula for its success. One by one, various factions dropped out. Those who approached the gospels from the standpoint of Social Science were frustrated that their approach was not in the forefront. The Narratology people thought they were getting short shrift. The “apocalyptic Jesus” scholars had to suffer in silence on the sidelines. And the very first meeting I attended was sufficient to show me I would sooner or later have to go my own way. Why not? Let a hundred flowers bloom!
I find myself belonging to a specific “community of interpreters” (as Stanley Fish calls them) that is definitely persona non grata in academia. I call it “ultra-radical criticism” or “New Testament Minimalism.” In most ways it is a revival of the Dutch Radical Criticism of Bruno Bauer and W.C. van Manen. For years I edited The Journal of Higher Criticism to provide a forum for such scholarly work. I was involved in the sadly abortive Jesus Project sponsored by the Center for Inquiry. But these days my efforts are pretty much restricted to offering my Bible Geek podcasts and my books and conducting the occasional independent study course with interested independent scholars.
Nor is it only me. That’s what I meant about the Internet and the Blogosphere. These are intangible refuges for freethinkers who do not buy into the dominant paradigms of professional biblical scholarship even while attempting to uphold the same standards of quality. Often they achieve that; other times they don’t. These refugees are rather like the monks (and the Muslim savants) who kept a tradition of learning alive during the Middle Ages.
There’s even a parallel to a still earlier phenomenon, the catacombs of Rome where various outlaw religious communities met in a time of official persecution. But just as the faithful gathered in those dank and torch-lit chambers to venerate the saints and martyrs of their communities, so do we ultra-radicals huddle together to rediscover and learn from the great critical scholars of the past: F.C. Baur, D.F. Strauss, Wellhausen, Kuenen, Van Manen, Bultmann, Schmithals, and the rest. Their prophetic voices are seldom heard in the ivory towers of academia these days when Green politics and Political Correctness have replaced theology even in seminaries, when Post-Colonialism and Queer Studies and Eco-Feminism (not to mention Evangelical apologetics) have usurped the title of “Biblical Scholarship.” It has, I believe, become a kind of Dark Age for those who once attempted to study the Bible for its own sake, not for the sake of political manipulation and evangelistic propaganda in the hands of powerful interests on the Left and on the Right. Meanwhile, you’ll find me among an unpopular Underground of kindred minds down here in the intellectual catacombs. There’s nowhere I’d rather be.
Once Tony Soprano shared an insight with his therapist. He had “mother issues,” you see. Pretty serious ones: she wanted to have him rubbed out—and tried! Anyway, Tony said our mothers are like buses that drop us off and drive away. We ought to be grateful for the ride. Instead, we chase after the bus, desperate to get back on. Naturally, the wisdom of Tony is applicable to much of life, but here (all too predictably) I want to apply it to our interest in the Bible. What’a ya gonna do?
I am right now working on a book called Bart Ehrman Interpreted. Like me, he was once a fundamentalist, then an evangelical, then an agnostic, and finally an atheist. It happens to a lot of us. In this evolutionary path many factors are at work, but one of the biggies, one might even say the fulcrum, is the Bible and our devotion to it. We get on the Bible Bus in our early years, eager to ride it to heaven! We have been told it will get us there as we take an exciting ride through the exotic neighborhood of the Old Testament, the shining landscape of the New Testament, the farther lands of Church History and Theology. So in order to appreciate these sights all the more, we study up on the guidebook, the Bible and commentaries on it. But every once in a while we are surprised to hear the sputtering ring of the stop signal, and the Bible Bus pulls over to let someone off. This surprises and concerns us! Why would anyone want to disembark from the Bible Bus? It’s so comfortable! The company is so pleasant! The group singing is such fun! Why? We are mystified.
That is, until we find ourselves reaching for the stop cord and getting up from our seat. Now we know why the others did the same. The closer we studied that guidebook, the more troubled we became. There were certain… discrepancies. Hell, was the bus actually going anywhere? We could swear we saw some of the same sights more than once! Were we just going in circles? Were we even moving?
Occasionally we look at the neighborhood we got dropped off in, and it looks pretty drab compared to the storybook paradise we used to think we were headed for. And in that moment, we think, as Tony Soprano said, of chasing that bus, or maybe waiting for another one to come along. But when one does, we realize we don’t have the required fare to get on. We’re plumb out of credulity! We’ve lost the overriding will to believe, otherwise known as the will to make-believe. So we start exploring the new neighborhood. Most of us sooner or later find new accommodations, discovering that living in the real world is not so bad, that it’s pretty good in fact, better than the fantasy dreamscape that once beguiled us.
But there are some of us, like Bart Ehrman, me, and a huge number of others, who manage to sneak back onto the bus, but we’re riding on the bumper! We feel good being back on the Bible Bus—but without getting in! We got hooked on the Bible, and the hook went deep. It might be pretty hard to get it out without disemboweling ourselves! The Bible has become part of us, like it or not. But then what’s not to like?
We hopped off the bus in the first place because the bus was arriving at a crossroads. We wanted to understand the biblical text better and better, as all Bible-believers are supposed to want to do. But even to have gotten to this point we had already worked our way half-way off the bus without realizing it, because nobody in the seats around us was really that curious about the Bible. They just wanted to cling to the proof texts that promised them admission to Heaven. In fact, they may have been reluctant to explore the rest of the Bible precisely because it might complicate the picture, might loosen their tight grip on that handful of cherished, out-of-context verses.
And that’s what had happened to us. We saw the crossroads approaching: we would have to decide whether we were more interested in Christian faith or in the Bible, because we saw we couldn’t have both. We had found that the religious beliefs about the Bible were a hindrance to understanding the Bible. “Hmmm… here’s a contradiction! But, oh yeah, there can’t be any contradictions in the Bible! So I guess they’ll have a clever solution to the riddle when I get to heaven!” But we got tired of that after a while. We began to wonder. “Suppose it is a contradiction? How would that have come about? Two different sources editorially combined? Two different writers’ opinions? But if they’re only opinions, that means they’re not ‘revelations’… But it makes more sense that way! Do I want to find out the truth? Or do I want to keep chanting those proof-texts?” That’s where we got off.
And back on, on the bumper—and under the bus, to see exactly how it worked.
Hegel and Kant couldn’t and probably wouldn’t have used a stupid analogy like the one I’ve beaten to death here, but they were making a related point. For them, the Bible and supernaturalist religion were storybook versions of moral instruction for the childish of mind. Kant said that the miracles were like parables, simply a means to an end; if they got their point, their lesson, across, fine. But if not, who needs ‘em? And if they do their job, who needs ‘em either?
It’s much like Zen. The monks found the old doctrines and techniques ineffective in producing Satori, Enlightenment, so they invented new gimmicks like the answerless riddle of the koan (“What is the sound of one hand clapping?”) designed to launch you off the track of mundane reasoning into a higher stratosphere. The main insight of Zen was that, a la Kant, if the old trappings of religion did not do the trick, to hell with ‘em!
But, as I said, even if they did, who needs ‘em? If they served their purpose, if they got you there, isn’t it kind of like keeping the Coke bottle after you’ve finished the soda? I love the Buddhist parable of the Raft. Imagine a man fleeing an angry mob through the forest. Luckily, he has a good head start. He stops short at a riverbank. He’s got to get across! But there’s no bridge, no stepping stones, and those crocodiles aren’t helping either! What are his options? After a moment’s thought, he begins gathering bamboo stalks and binding them together with tough vines. Soon he’s got a makeshift raft and begins poling it across the river, just as his pursuers arrive at the river. They shout curses and shake their fists, but there’s nothing more they can do. (I’m tempted to say their quarry gives them the finger, but I guess that’s inappropriate for a Buddhist parable.) Well, he makes it across. He’s safe! Now what do you think he ought to do with that raft? Out of gratitude to the raft, should he strap it onto his back and carry it around like a tortoise shell? Of course not! What a useless encumbrance it would be! Better just to dump it and get moving!
In the same way, the various features of religion are purely instrumental in nature. I just ate a couple of yummy fiber brownies. Is there any reason to keep the wrappers? No, they served their modest purpose, and now it’s time for them to exit Stage Garbage Can. You discard them precisely because they did what they were designed to do! One might object that this analogy is apt only for Eastern religions, where the goal is one’s own Enlightenment. In a paradoxical sense, it is “self”-centered. But in Western religion it is (supposedly) quite different: religion is centered on God. Worship benefits the worshipper, sure, but in the same way turning toward the sun benefits green plants, because it nourishes them. We are made to worship our Creator, so things are out of whack if we don’t. But God deserves the acclaim.
And yet I think the difference is finally overcome, since in Vedanta Hinduism and Mahayana Buddhism one seeks to break down the (illusory) wall between Self and Other, to realize the unity, actually the identity, of the individual atman “within” and the universal Brahman or Sunyata “without.” This amounts to the humble self-abnegation of the individual in favor of the all-inclusive One. This seems to me quite parallel to the worshipper losing himself in the Worshipped.
Anyway, I think the Bible contradictions that shook us loose from fundamentalism are in effect biblical koans. They bring us to the end of faith and bid us jump off the track of conventional religious belief. They catapult us into Enlightenment (think equally of the European Enlightenment or of Eastern mystical Satori), and once they do, we can discard them—because they’ve served their purpose! The Bible nudged (or slapped) us awake so we wouldn’t miss our stop.
Political opinions are like scientific theories: they are cognitive frameworks through which we seek to make sense of the flood of disparate information. Otherwise we are left standing amid a roaring storm of raw data. We realize we must “come in from the cold” of confusion, and we have to formulate a strategy to go further. We need to draw a map if we are to get anywhere, especially where we want to go, whether it is scientific research one is pursuing or a decision on how to vote.
As Thomas Kuhn explains in his classic study The Structure of Scientific Revolutions, a workable paradigm is one that will enable predictability. “If we’ve got it right so far, this or that ought to happen next.” “If we are understanding how these factors work together, we ought to get these results; so-and-so should turn up in the next experiment.” If you don’t get the results predicted by your theory, your theory is apparently wrong. Back to the drawing board! No shame in that. You learn from your mistakes. You reduce the possibilities. It’s the process of elimination.
Suppose the experts have long rallied around a consensus paradigm that has worked pretty well, but there remain stubborn anomalies: things, phenomena, that resist incorporation into the paradigm. Clashing data that we wouldn’t have expected. What to do? There are two possible courses. One might propose ad hoc hypotheses for this and that bit of troublesome data, contrived though they may sound even to those who propose them. Are you going to give up the theory that deals successfully with 90% of the data because of the square pegs constituting the remaining 10%? I’ll get to the second option presently. But how about a couple of examples to flesh out the abstraction?
I guess the classic instance would be the astronomical paradigm shift Kuhn discusses so well. The trouble centered on the problematic “retrograde motion” of the planets. Classical Ptolemaic astronomy posited geocentrism: all the heavenly bodies revolved around the earth in regular circular orbits. Ptolemaic astronomers knew something was amiss because the planets didn’t quite follow their expected courses. Every once in a while they appeared to double back, bob around, then continue on their way. Aristotle thought the planets were sentient beings and just felt like doing a little Two-Step now and then. Later astronomers took a different approach, positing a super-complex system of wheels within wheels, like gears in a watch, atop which the planets rested, borne along on this elaborate mechanical lattice. And it worked! Reverse-engineered from the observed motions of the planets, the system of “epicycles” did fit the phenomena (and sailors still use sextants to navigate on the basis of the Ptolemaic system). But of course it couldn’t predict any new results.
But Nicolaus Copernicus thought he could do better. Suppose the earth were merely one of the planets revolving around a central sun? In this case, the retrograde motion of the planets was not real motion on the part of the heavenly spheres. Instead it was all the result of shifting perspectives. The orbits were regular, but our platform of observation was moving, too! This allowed a much simplified method of calculation. Eventually Copernican, heliocentric astronomy won the day. Ptolemaic astronomy was consigned to the Museum of Obsolete Theories.
From this case we can derive the major criterion for preferring one paradigm over another: the paradigm that makes the most economical sense of the data, and without having to posit far-fetched ad hoc hypotheses, is the better model. It’s an application of Occam’s Razor: the simpler explanation is the best. Is the truth always simple? Maybe not, but until we find out differently, we have to go by probability. And almost by definition the simpler explanation is more probable. Why bother multiplying redundant explanations when a simpler one cuts to the chase? There’s just no reason to add in needless complexities.
The second option for dealing with anomalous data is to start with it, theorize a new paradigm based on it, then see if the result can be reconciled with the old paradigm. Newton and Einstein found they were able to make new sense of what had remained baffling for Copernican astronomy, but without having to overturn the whole system. Their adjustments didn’t amount to a new set of contrived epicycles posited to rescue the old paradigm. It all proceeded inductively, and the data now made mutual sense in the same framework, all given equal weight.
It may take a long time for a new paradigm to prevail among the experts, because it must prove itself the superior option, and that quite properly requires a lot of detailed scrutiny. It is possible that some experts who have a big investment in the old paradigm (because most of their professional work was based on it) may have selfish reasons for opposing the new model, but usually that will only prolong a needful process, and most experts will welcome the new approach once they see its advantages. After all, they’re in the game because they want to get closer to the truth, not to defend some hobby horse or party line. At least one hopes so.
The same procedure obtains in biblical studies. For instance, scholars had long pondered the relation between the Synoptic gospels: Matthew, Mark, and Luke. They have much in common, almost verbatim. Why? It seems the three texts are interdependent in some way. But which way? Eventually, most scholars came round to accepting the Two Document Hypothesis, also called Markan Priority. In short, the paradigm runs like this: Matthew and Luke each independently copied material from Mark, making various changes in detail. This would explain the material shared by all three gospels. But there is also a large amount of sayings and stories shared by Matthew and Luke but with no parallel in Mark. Where did this stuff come from? There must have been a second prior source that Matthew and Luke used just as they did Mark. We call that one “Q” from the German word for “source,” Quelle. (Still awake?)
Scholars who accepted this way of making sense of the Synoptic material soon spotted anomalous data, sand in the gears of the paradigm: what to make of several places where Matthew and Luke do not quite agree with Mark but do match each other’s wording? Doesn’t that suggest that Matthew was using Luke or vice versa? It might! Starting with these data that did not fit the Two Document Hypothesis, some scholars have proposed various alternative models. Some say Mark combined material from Matthew and Luke, fusing (harmonizing) them as best he could and leaving the rest on the cutting room floor. Others suggest that Matthew used Mark, and then Luke used both Mark and Matthew. Still others think Luke used Mark, and then Matthew used both Mark and Luke. Thus there needn’t have been a Q document. When, e.g., Luke sometimes matches Mark but other times matches Matthew, this would be because he sometimes preferred Mark’s original, sometimes Matthew’s revised version.
I am a partisan of the Two Document (i.e., Mark and Q) Hypothesis. The scholars who advocate the alternative solutions I just mentioned obviously feel it necessary to junk the formerly regnant model, just as Copernicus overthrew the Ptolemaic model. I do not. I think the Two Document Hypothesis requires only minor adjustments. It seems to me that the Matthew-Luke agreements against Mark simply imply that Matthew and Luke were using an earlier edition of Mark which contained the reading Matthew and Luke have in common, and they preserved the original Markan wording. But the specifics don’t matter for our purposes. I’m just trying to give a general impression of how the contest between theoretical paradigms works.
But things get more complicated still! There are also so-called “incommensurable paradigms,” which are immune to challenge from outside. You see, the closer you look, the more it begins to look as if the criteria for the plausibility of a paradigm (and for whether a paradigm’s adjustments look like special-pleading “epicycles” or rather extensions of the paradigm to incorporate hitherto-anomalous data) are functions of the paradigm itself, i.e., contained within the paradigm. For example, what are readers to do with the fact that, though all four gospels show Peter denying Jesus three times, each gospel has Peter talking to different bystanders? Fundamentalists, who take it on faith that there can be no contradictions in the Bible, find it perfectly natural to posit that Peter denied Jesus six or even eight times, in order to fit in all the denials with their varying details. If the explanation comports with the doctrine of biblical inerrancy, it automatically looks good! That’s their criterion for plausibility.
But critical scholars do not hesitate to say that the gospels contradict one another and that the various gospel writers simply changed the details for whatever reason. To them it seems patently ludicrous to suggest that Peter denied Jesus six times. Why? Because these scholars have rejected inerrantism as a workable paradigm. And why? On account of a different, equally important Protestant axiom: scripture must be interpreted according to the “plain sense,” what the words would seem to mean prima facie. Otherwise you can treat the Bible like a ventriloquist dummy, making it mean whatever you want it to. And no one would read the gospel texts as recording six denials unless they were desperate to get out of a tight spot.
There can be no real communication, not even any debate, between these factions. There is no common ground. As Stanley Fish (Is There a Text in this Class?) says, we are dealing here with two insulated “communities of interpreters.” Within each herme(neu)tically sealed community of interpreters there can be much debate and dispute, as when fundamentalists argue over whether the inerrant Bible teaches free will or predestination. Or as when critical scholars discuss what might have motivated the various gospel writers to recast the walk-on roles of the various bystanders in whose ears Peter denied Jesus. But debates between the two communities are useless. They cannot help talking past one another. Everybody ends up where they started.
This is where politics comes in. Having political discussions with your friends (who are not likely to remain your friends for long!), you quickly notice you are getting nowhere, and so are they. Each of you is starting from within a self-contained paradigm from which your opponent’s perspective seems baffling. Recently I was a guest (or was that “sideshow freak”?) on a podcast where I was called to account for my support for Donald Trump. My stunned hosts could not conceive of an atheist skeptic supporting Trump, opposing abortion, doubting Global Warming, etc. Likewise, I could not believe my ears at their arguments for “gun-free zones,” etc. Each side has different criteria for plausibility, deductively derived from their paradigm itself. These criteria will determine how one views data that seems to challenge one’s position. It was clear to me that neither side can make any headway until they dare to question their presuppositions, to ask themselves the question emblazoned on a bumper sticker: “What if you’re wrong?”
Each side of the political divide lives in what Peter Berger and Thomas Luckmann (The Social Construction of Reality) call a “symbolic universe,” an internalized paradigm for construing data. Each side engages in “cognitive world-maintenance.” Each individual is reinforced (like a religious believer) in his convictions by surrounding himself with those who share his viewpoint. For instance, if you watch FOX News, as I do, and you hear people mock “Faux News,” it is at once obvious they have never watched it. They are parroting the biases of the ideological “in-crowd” whose mockery is what Berger and Luckmann call a “nihilation strategy” aimed at discouraging one’s fellows from ever taking seriously any arguments from the other side. I am familiar with nihilation tactics from religious apologists who reassure their minions that biblical critics hold “skeptical” views only because they begin by arbitrarily rejecting the supernatural. That is nonsense, but they need to believe it in order to pre-empt any serious consideration of critical views. Political conservatives and liberals explain how the opposing faction suffers from psychological handicaps which incline them to their ill-founded opinions. Have you ever seen a more blatant example of the genetic fallacy?
Paul Watzlawick (How Real Is Real?) discusses the “self-sealing premise,” a belief or opinion or party-line that is invulnerable to disconfirmation by any objection, refutation, or contrary data. There is always a quiver full of explanations, excuses, rebuttals, whether consistent with each other or not. Remember, any argument supporting the presupposed position automatically sounds good, persuasive, and more than plausible to the one who offers it. And the defender is not pretending to believe these rebuttals: he really finds the excuses and alternative readings of the evidence to be convincing no matter how lame they sound to outsiders. (This is the larger reality of which the phenomenon of “confirmation bias” is the iceberg tip.)
This all raises the spectre of falsifiability. Karl Popper pointed out that some assertions are revealed, not as false, but as meaningless when the one making the assertion cannot think of any state of affairs that would falsify his assertion. You cannot really even define the state of affairs that is being asserted if you cannot specify what conditions would be inconsistent with it. My favorite example here is, not surprisingly, a theological one. If a religious believer asserts that God is in loving, providential control of the world, we would sort of expect this “hypothesis” to have predictive value. Wouldn’t it seem to imply that God would protect us from tragedy and atrocity? But the facts do not seem to bear this out. Does the believer admit he was wrong? Not at all! He retreats to the position that God is in control, but that “he moves in mysterious ways.” But then we have to ask: if God’s being in providential control winds up looking just like God not being in control, what’s the difference? What, if anything, is even being asserted? If nothing counts against the claim, then there really is no claim.
And it is the same with political stances. You advocate policies that will, you claim, create more jobs and revitalize the economy, but there are no discernible results. Rather than going back to the drawing board or admitting that your opponents were right, you either “reinterpret” the statistics or stonewall with excuses, perhaps blaming the disastrous results of your policies on the previous administration’s policies which, you say, screwed up the economy worse than you first thought, so you double down on your failed policies. And when they fail again, you will sink still deeper into denial and excuse-making. The policies themselves have become the most important thing, not the goals to which they were originally dedicated.
Next step? Whatever destructive results the policies bring, even once they become undeniably obvious, will be considered noble simply because they are the results of the policy and the ideology underlying it. This is what happens when gun control advocates insist on reducing gun ownership by non-criminals. You would have assumed that safety and crime-reduction were the desired goals. But if, as in Chicago, New York City, and Brussels, the reverse happens, well, that’s still good. These murders were “collateral damage,” an unfortunate by-product of the inherently noble crusade to eliminate those nasty guns. Of course, criminals won’t cooperate, but let’s get rid of as many of those unholy and unclean guns as we can, and you can take them away from law-abiding citizens. “Oh,” you say, “but once we tighten gun laws, gun-owners ipso facto become criminals!”
Fossil fuels are inherently evil and unclean, so we must try to get rid of them. If this will destroy whole industries and many people’s livelihoods and make energy too expensive for the shivering poor, well, that’s just collateral damage! It’s all like pacifism: self-imposed martyrdom for the sake of ideals derived from abstract systems of political theory.
Theory is paramount in politics. Government ideologues attempt to reshape the world to make it conform to their theory’s picture of the world, as when George W. Bush sought to impose Western-style democracy on alien cultures, or when Obama thought all he needed to do when Russia invaded Ukraine was to pontificate that Russia was “on the wrong side of history.”
Worse yet, their policies assume the world already does correspond to the picture their ideology paints. Muslims can’t be terrorists, so they’re not! If you think otherwise, my friend, you suffer from Islamophobia! Crimes must be equally distributed among all population groups, so to claim one group commits a disproportionate amount of crimes can only be racist slander.
Committed to an ideology, the ideologue is living in an impenetrable bubble. Freud’s characterization of religion fits equally well here: “the projection of a wish-world onto the real world.”
Peter Berger (in his The Heretical Imperative) speaks of “relativizing the relativizers.” Once a sociologist of knowledge (like him) succeeds in showing the largely psycho-social origins of any individual’s beliefs, the ground is cleared. We are left with no escape, no option but to try to bracket what we have been taught to think, what we would like to believe, and to try our best to look at the facts inductively. And to ask ourselves why we are inclined to one or another interpretation of the facts. Look, I know that voting is a forced choice. You’ll never get to the polling place if you think you have to master all the facts on every question. But you owe it to yourself (and everyone else) to take a cold, hard look at the facts, and at yourself as the evaluator of facts. Take your best shot. Take what Don Cupitt calls “the Leap of Reason,” launching yourself out of your confining paradigm like baby Kal-El rocketing out of exploding Krypton!
Richard Dawkins has ventured the opinion that theologians are experts in a subject without any subject matter. I take him to mean they are engaged in nothing but “mental masturbation.” They are, he thinks, like a group of Star Wars fans I once knew who believed the events of the movie were real, but in a different universe (in which they would have much preferred to dwell). Or imagine a self-proclaimed zoologist who specialized in werewolves, unicorns, and dragons. There is a large element of truth in such comparisons and thus also in his disdain. And yet I can’t help thinking he is engaging in overkill.
I am sure Professor Dawkins does not discount the importance of numerous fields like Anthropology, Psychology, and Sociology of Religion, much less the History of Religion, because these fields of study serve purposes valuable to secular intellectuals and to society at large. To put it bluntly: you need to know your enemy. The world is aflame with murderous religious fanatics, and we need to understand what motivates those who kill “infidels” with a clean conscience and abandon. We need to know the harm such jihads of madness have done in the past if we want to gauge dangers in the present.
Biblical Studies are, obviously, relevant if only to demonstrate, as Robert Ingersoll did, that the threatening gun is loaded with blanks, to render the Bible useless as a tool of oppression. It is imperative to debunk the notion that the Bible is the infallible Word of God, and this is true even if you think what the Bible says is not so bad, because it is still used as a ventriloquist dummy to impart divine authority to those demagogues who claim its inerrancy for their own opinions. I can think of people on the Left as well as the Right who make Scripture into their own private Charlie McCarthy.
My friend Hector Avalos has called for “the end of Biblical Studies” (in his great book of that title). What, is he trying to put himself out of a job? No, he means to call the bluff of those who merely use scholarship to defend the Bible in the manner of apologetics—spin-doctoring. A good example would be the study of “Biblical Archaeology,” which was essentially founded as an elaborate campaign to defend the accuracy of the Bible against the skepticism of the Higher Critics of the Nineteenth Century. Now this axe-grinding pseudo-archaeology has been revealed as cut from the same cheap cloth as “Scientific Creationism.” The new Ark replica unveiled by Ken Ham is a prime example of both.
What is less obvious as a piece of biblical PR, and from an unexpected quarter, is the sophisticated (but sophistical) production of scholarly studies that attempt to show that the Bible really supports equal rights for women and homosexuals, opposes colonial imperialism, fosters ecumenism, etc. The goal here is twofold. First, there is the cynical, Grand Inquisitor-like attempt to use the voice of unearned authority to trick the pew-potatoes into voting for so-called “Progressives” because the Bible is “really” on the Left, and so you should be, too. It is propagandistic manipulation, what some Liberation and Feminist Theologians call “a useable past.” You know, like the Ministry of Truth in Orwell’s 1984.
But the flip side of that coin is that such scholars also seek to rehabilitate the Bible’s reputation by arguing that it has good “Progressive” things to say (and that the bad stuff is either irony or interpolation). Why do they want to do that? Possibly to justify clinging to a self-identification as a Christian of some type.
I understand Hector to be calling the bluff of both Conservative and Liberal scholars who are grinding “the axe of the apostles.” I don’t read him as demanding that the study of the Bible be banned or boycotted. Rather, I would describe the situation as analogous to a bunch of neo-polytheists urging everyone to take Hesiod’s Theogony and Homer’s Iliad and Odyssey as infallible and historically inerrant. Classicists would enter the fray to demonstrate the mythic-fictional character of those works, but they would not discourage the study of these great old texts.
I think another friend of mine, John Loftus, means the same thing when he urges universities to drop Philosophy of Religion courses. So much of it has been (again) apologetics, propping up the faith of those would-be believers whose intellectual consciences forbid them to exclaim, “Oh, to heaven with it!” and just “believe.” (Still, you can teach Philosophy of Religion with a critical approach.)
I’m not sure Professor Dawkins stops there, however. I remember hearing him deride the Catholic doctrine of Transubstantiation as “a rabbi turning himself into a cracker.” The line got lots of laughs, but I didn’t join in the general hilarity. I knew from this joke that the noted atheist was not bothering to try to look at the question from the inside, the only place it makes any sense, and where it does make sense. It is its own language game, though most of those who play it are not aware that’s “all” they’re doing.
If you’ve been on the “other” side, whether religiously, politically, whatever, then switched sides, you know you can’t simply dismiss the old frame of reference as nothing but gibberish. You know there is a method in the madness, that it’s not just madness, even if you can no longer affirm it. I think of how, when I first read about Deconstruction, I thought it was crazy. But, as Mary Midgley said, when we call something “crazy” it just means we don’t understand it. And then I realized there had to be something to it that was not then apparent to me. So I read more, trying to see it. And I finally did see it.
In fact, I now see what first seem like absurdities as opportunities to expand my mind to see (and possibly embrace) new perspectives. I tried to convey this to a student in an Adult School course I later taught on Deconstruction. Though a very learned man, he went in with an attitude of unremitting determination to see and expose Deconstruction as nonsense. What a shame, because, if it wasn’t nonsense, he would never be able to see that it wasn’t. You have to have a teachable attitude, and this man, for all his intelligence, did not. His loss.
Well, I have been on the side of theologians and believers. Though I have long since switched sides, I remember what it was like. And I know that theology is not a subject without subject matter, like Unicornology. Whether you are a believer or an atheist, you may approach theology as a subdivision of the history of thought that deserves at least as much respect as, say, ancient or medieval philosophy. You most likely don’t find the views of Parmenides and Zeno persuasive. You don’t buy the idea that what the senses report to us bears no relation to external reality, an infinite expanse undivided by distinctions, and so on. But were the Eleatics just gushing verbal salad? You’re the fool if you think so.
In the same way, I just cannot look at the philosophy of Thomas Aquinas or the theological systems of Karl Barth and Paul Tillich and X it all out as a bunch of glossolalic gobbledygook. The interplay of competing concepts is exhilarating to explore! All the more if it is initially strange to you!
Theology thinks it is telling us about God. I don’t think there is a God. But theology has much the same attraction as mythology. Both are the cartography of dream worlds. Doesn’t that mean Richard Dawkins is right after all: that it’s just a stupid waste of time? No. Hans Jonas and Rudolf Bultmann explained how “every statement of theology is at the same time a statement of anthropology,” and that is the basis of demythologizing. To demythologize is to decode stories of, and beliefs in, God(s), to reveal the self-understandings of those who produce and who live by the myths. People who inherit or embrace (factually erroneous) myth systems assimilate their dictates and definitions. The systems of belief are projections of the “symbolic universes” (Peter Berger and Thomas Luckmann) inside the heads of believers. This is the truth of the formula “As above, so below.” Theology does have subject matter: human beings.
I sent my Soul through the Invisible,
Some letter of that After-life to spell:
And by and by my Soul return’d to me,
And answer’d “I Myself am Heav’n and Hell:”