Interestingly, one person disagreed with the last two. Granted, on the claim that it is important for students taking their classes to be familiar with the Bible, I suppose a pre-Christian classicist could reasonably say "no." But disagreeing that "Western literature is steeped in references to the Bible"? Any English professor who fails to agree with that either didn't understand the question or is incompetent. (The article lists the professor's name in the text; I'll not mention it here, on the assumption that she did not really understand the question and does not need the embarrassment.) Everyone else seems to roll their eyes at the question, with such responses as:
- Some scholars? Any scholar...
- A truism.
- Incontestably true.
- It's everywhere.
- Absolutely. Who could deny it? [....] I cannot imagine such a position.
- It's not an opinion. It's just a fact.
- I don't know of any field of English literature you can teach -- or American literature-- that it's not key.
I found the listing of particular books useful. About a third of them listed Genesis by name; 10 listed Matthew; 9 listed Exodus, Luke, and Mark; but only 6 listed John, Revelation, and Psalms. If I were ranking the Gospels according to their importance in studying literature, I'd have ranked them:
- Mark (distant fourth)
In any case, it is a far more engaging read than you might think. You should read it.
The real question to me is this: Why do we even have to discuss this issue? I know that secular American culture tends to be hostile to faith (especially Jewish and Christian faith), but has our culture gone so mad that we debate the question, "Should we teach the Bible as literature?" Shouldn't we instead be asking, "HOW should we teach the Bible as literature?"
Hat tip: Chronicle of Higher Ed