Enough of academics and literature -- here are two things that tick me off about Halloween:
1. "Kids" who are clearly in high school (or older!) treat-or-treating. If you are old enough that you drove yourself to my house, GET A JOB! If you are old enough that you shaved before donning your costume, BUY YOUR OWN CANDY! In my neighborhood, we always run out of candy. They are literally taking candy from babies when you come to the neighborhood and exhaust the candy supply. I always shame the kids and refuse to give to them. Even though they didn't get candy at my house, they must be getting it from somewhere or else they would give up. Next year, please, everyone join me in refusing and shaming them, so maybe eventually they'll quit.
2. Kids who show up without a costume, or even an attempt at a costume. No matter how little money we had when I was young (remember that I was trick-or-treating during the Carter years, so it was sometimes lean), I always had a costume. One year I took a milk jug, cut it and colored it, added a bent wire coat hanger, and made myself an alien mask. At least make some effort to have a costume ... use markers to draw cat whiskers on your face or something. Dress in Mommy and Daddy's oversized clothes. Put a bag over your head and go as "Scary Bag-Over-the-Head Man!" All I'm asking is that you make some gesture toward dressing up. If you aren't in costume, you aren't trick-or-treating; you're just begging.
Now, if you'll excuse me, I have to raid the children's candy supply now that they're asleep.
Monday, October 31, 2005
Halloween and the Decameron
I did not plan to be teaching the Decameron over Halloween. It just worked out that way. Nonetheless, I'm struck by the similarities between the two.
Children's Halloween and adults' Halloween are two different things. Adult Halloween is about trying on identities. When we are teens, we are able to cycle through identities pretty quickly; freshmen in my classes often end the semester with a different identity than the one with which they began. Once we move into our mid-20s, though, we are expected to pick an identity and stick with it (even, I'm sorry to say, if it isn't a particularly good identity). Halloween gives us a time to play dress-up again, which probably explains why so many adult women's costumes are of the vixen variety -- it gives women the chance to dress dangerously without any real threat of social stigma. I've only seen our Chinese students dance at one ISCO dance ever: the Halloween party last week. In costume and in the dark, they lost their inhibitions. They could momentarily pretend to be someone or something else.
Children's Halloween is more like the Decameron; it is about managing our fears. When you are seven years old, you might intellectually understand that monsters aren't real, but deep down in your heart you know that they are not only real, but lurk in the shadowy corners of your house. Children can take control of these fears through Halloween. They can transform themselves into scary creatures, and be surrounded by scary creatures all bound to the same quest for candy. Cookie Monster's desire to gobble cookies and the neighbor kid/ghost's desire to gobble candy turn children's fears into something familiar, comfortable, and even a little fun.
The Decameron is like that. In the introduction, Boccaccio sets the frame story during an outbreak of the plague. He describes a city in which all social order has died: The dead are buried outside of their faith; servants rule over masters; family members are abandoned; the economy is in ruins. Surrounded by horror, the ten young people flee to the country and set up a new social order -- determining who will rule, having men rule over the women, but fewer men than women to prevent overbearing patriarchy, etc. How do they cope with the death and disorder in the city? By telling stories, of course. Through telling stories about faith, love, and kinship, they can reclaim control of those areas that have been disordered by the plague. Their fears become manageable.
Poe's "Masque of the Red Death" is what Halloween pretends to be about. Boccaccio's Decameron is what it is really about.
Saturday, October 29, 2005
Mourier Hoax II
Because of its popularity, I'm continuing the saga of Pierre Mourier, literary theorist extraordinaire. For those who missed where I introduced the Mourier hoax, it can be found here. The e-mail to which I was directly responding can be found here.
Mike,
While it is about time someone finally applied Pierre Mourier to this issue, instead of simply trotting out his worn and tired acolytes Derrida and Foucault, I am surprised at your facile application of *The Rational Irrational*. As I'm sure everyone on this list knows, Mourier wrote the text at the zenith of his opium addiction, and he later recanted much of the work, returning to his earlier (and more important) ideas. For example, in his article "Language and Determined History: Some Thoughts on the Book of Job," Mourier writes:
"God's response to Job satisfies the soul but maddens the intellect. 'Can you pull in the leviation with a fishhook or tie down his tongue with a rope?' expresses the soul's yearning for confirmation that God is in control of thekosmos. Our minds rebel against this self-evident truth, that God lay the foundation of the earth in OBJECTIVE FACT [emphasis mine], leaving our fallen minds with nought but irrational interpretation of the rational world and, by extention, language." (James Dalma's translation)
You can't ignore his groundbreaking interpretation of the Tower of Babel (which a later, lesser acolyte ripped off and called "the panopticon"), nor of the creation accounts of Genesis and the Gospel of John, and act like you are actually representing Mourier! Sure, the article was controversial at the time (whenever an atheist speaks reverently of the Bible, it is bound to ruffle a few feathers), but no serious scholar puts the drug-addled Mourier of *The Rational Irrational* above the prescient Mourier of "Language and Determined History!" Rationality is ALWAYS preferable to irrationality in Mourier's legitimate works, and if you think otherwise, you might want to consider re-reading for your Qualifying Exam.
Yours,
Scott Nokes
In case anyone is wondering, James Dalma and his translation of the fictional Mourier are also inventions of my own.
Friday, October 28, 2005
Correction: Buckyballs and Smart Paper
An anonymous poster offered a correction to my comments about buckypaper. He/She/It suggested that I had conflated the idea of buckypaper with smart paper.
Very likely the poster is right, since as I mentioned in the post I was recalling that application of buckyballs from years ago. I wanted to offer the correction up here, rather than letting it stay buried in the comments section.
So, if anyone else knows more about this topic, is the Gyricon Smart Paper what the poster is writing about? Was there speculation years ago about using buckyballs to make smart paper? And how long will it be until I can get my first smart book? My current books are all rather dumb.
By the way, in the comment, the poster called me a "humanitarian," presumably referring to someone working in the humanities. As my Civ IV post indicates, it is entirely inaccurate to refer to me as a humanitarian.
Thursday, October 27, 2005
Civilization and my Savage Self
The release of the Civilization IV computer game snuck up on me. I received a breathless e-mail (er, figuratively breathless -- by their nature, all e-mails are without literal breath) from a friend reporting that its release was nigh.
Now I read from Timothy Burke that it is "great." Soon, I'll have another outlet for my megalomaniacal tendencies. Maybe I'll ask Santa to bring it to me.
Someone once wrote to me that I seemed quite sophisticated from my blog; ah, how deceiving appearances can be! Civilization brings out my most savage impulses. I have been known to nuke a single warrior out in the wilderness because his presence offended me. I will declare war at the slightest pretext. On my orders, stealth bombers have destroyed libraries, cathedrals, universities, and hospitals. I have been known to poison the water supplies of towns. On occasion, I have razed a city to the ground amid the wailing of its women. I have kept slaves, possessed huge pirate fleets, and encouraged enemies to destroy my "friends." Oh, yes, I have done all these things, and more.
Lord of the Flies suggests that the removal of boys from society sparks their most savage instincts. In my case, a keyboard and mouse are enough to enable atrocities dwarfing the maddest ambitions of Hitler, Mao, or Stalin.
Speaking of panning stuff...
Acephalous is panning Don DeLillo. Normally, I wouldn't have any strong feelings about DeLillo -- just kinda "eh" -- but after years of hearing the hype about him I finally read White Noise and found it mediocre and disappointing. Perhaps if it hadn't been hyped, I would have found it merely mediocre, but not so disappointing.
But, alas, it was so hyped, and I was so disappointed. I'm glad to see I'm not alone in thinking DeLillo overrated.
The Paradise of Paradise Lost
The other day, a colleague asked me about my blog, "Do you like anything, or do you just pan s**t?"
I like lots of things, but too much praise can make one sound James Lipton-esque. And no one needs more James Lipton in his life.
Paradise Lost, for example. I love PL. I love just about everything in it except for the astronomy lessons. I love the fact that Satan tempts Eve with the same sorts of weakness that caused his own Fall. I love that after eating of the Fruit, the first official post-Lapsarian sin is lust-inspired (rather than love-inspired) sex ... a sin that they have to commit together. I love that Adam and Eve try to blame one another for the Fall. I love the way that both Satan and Adam try to blame God for their troubles. I love that the thing that finally gets Adam and Eve to confess their sins to God at the end of Book X is their near-breakup fight.
Perhaps most of all, I love the last two lines: "They, hand in hand, with wandering steps and slow,/ Through Eden took their solitary way." Solitary, but hand in hand -- the human condition.
Wednesday, October 26, 2005
Remediation
An interesting piece on college remediation at Cold Spring Shops. Strangely, I had a dream last night about remediation; there must be something in the air.
Mourier Introduced
For those who don't understand the context of this post, you can read more about the Mourier hoax here.
Below is the first e-mail in which a colleague referenced the fictional Mourier. He was responding to a post by "Joe," who was in on the joke. The "JD" here is Jacques Derrida. The "GW" stands for "George W. Bush," since the whole exchange started as a reading of a picture of Bush with a target on him in the English office. As I continue to post these in the coming days (or until I get sick of them), see if you can guess who is in on the joke, and who is simply faking knowledge they don't have.
Sorry to say, Joe has lost his way. His post quite clearly shows this with his last line: "perhaps it would be fruitful to read it relationally." Earlier he cites a more rational age, the pre-J.D. era, so that he can lead us to the promised land of clear thinking. As most know, however, those earlier eras were not as critically simple as we latter-day romantics tend to think. In fact, the great J.D.'s own teacher, Pierre Mourier, became famous for his early rejection of this wishful thinking in his book *The Rational Irrational* (1913). "Rationality is a stray path, a forgotten toy, a stale loaf. While once it may have held glory, it now only leads to dissolution" (243). We can go back further to the very romantics who, on the surface, would seem to lead us to the text as such (Keats' urn) and see that true reading is found only in spiritual epiphany (Coleridge's Khan; Wordsworth's Tintern). Irrationality is the only avenue in which to encounter the text. It's apparent then that Joe has lost his way in regards to the whole GW issue (and, perhaps, his own dissertation): reading only elicits meaning if we can follow the spirals of the text to their own fruitless and irrational ends.
mike
Tuesday, October 25, 2005
Pierre Mourier: (S)call(ion)s of (D)is(se)nt
Long ago I promised a forthcoming post on theorist Pierre Mourier. Here it (finally) is. Don't worry, this is fun stuff, not boring theory talk.
When I was in graduate school, we had a listserv for the graduate students. The intention of the list was, so far as I could tell, to make general announcements for all grad students -- such as "I need someone to share a hotel room at the MLA" or "the committee meeting regarding the freshman comp classes has decided such-and-such." Pretty pedestrian stuff.
About once per semester, however, the list would turn into a theory shouting-match. The participants were mostly working on their MAs, and I think they were under the misapprehension that they could get attention from the professors if they showed their theoretical acumen on the list (a false assumption, since the only professor on the list was the graduate director, who may not have actually read the comments). This happened fairly regularly, and was annoying in that it clogged up our e-mail boxes with posturing crap.
One day I was sitting around the office with my friend (and fellow PhD candidate) Mike, and we were grousing about the theory flamewar currently going on. What we found especially annoying was that it was clear to anyone who had actually studied the theorists that the participants did not know what they were talking about, and were simply throwing around the names of the Theorists of the Week Club as buzzwords. That's when we came up with the idea for a prank.
We invented a fake theorist by the name of Pierre Mourier, and started a flamewar between the two of us regarding his work. We would make up fake texts and long quotes that he supposedly said, and argue aggressively about them. We thought that after a week or so, some enterprising MA student would try to look up Mourier and discover the joke.
But, no ... instead, they started arguing with us! People who were not in on the joke would write things like: "Well, my reading of Mourier is..." implying that they had read these non-existent texts. When the other PhD students saw what we were up to, they began to create (and quote extensively from) their own fake theorists, such as Gillaume de Slopard, Simone Mourier (his wife), and Jorge Jesus Castillo. The works by and about these theorists included "The Rational Irrational," "Language and Determined History: Some Thoughts on the Book of Job," "Baudelaire's Blue Lobster: The Genesis of the Psychocultural Image and the Return to Hermeticism," "The Syphlitic Eye," "Murmurs in the Cabaret: Finding Language through Noise," and "The Suffering of Memory" (which, in a brilliant post, one of my colleagues argued should be better translated as "The Ache of Memory").
This went on and on. We tried to become more and more absurd to see if anyone would call us on it, but no one ever caught on. Finally, after weeks of this fake flamewar, one of the MA students wrote the first intellectually honest post of the thread: he confessed that he had never heard of Mourier and asked if we could refer him to any particular works of significance.
This had gone on for some time, so we were uncertain how to proceed. Finally, I called the fellow on the phone and let him in on the joke. He thought it was hilarious and asked to join in. So, I sent him (via the list) a very, very nasty reply about his ignorance of Mourier, and he replied (via the list) that all the books by Mourier were always checked out of the library, and that the local bookstores could not even keep his books in stock. The fun continued.
Eventually, the hoax was revealed by one of our professors who outed us in a class filled with MAs. We stopped the flamewar, and one of my colleagues put together as many of the e-mails as she could cull out of the database into a samizdat volume entitled, "(S)call(ion)s of (D)is(se)nt: (S)Talking Rational(ity)." I still have it on my shelf, and pull it off for a laugh every so often.
People will occasionally ask me why I don't talk about "High Theory" in specifics here much. The reason, I suppose, is that often when I get into discussions of particular theorists, I can tell that the other person doesn't really know what they are talking about and is instead tossing about fetishized names ("Well, Kristeva completely explodes that argument, you know..."). All I hear are people arguing that their readings of Pierre Mourier are superior to mine.
If there is enough interest, I might occasionally post some of the e-mails from the flamewar for your amusement. I'd have to re-type them from scratch, though, so I won't bother if no one asks.
Untagging Myself
You may be wondering why I haven't posted for a couple of days (or, you may not have noticed). The reason is that Scott Gosnell over at A Knight's Blog has tagged me for an ACLU meme, or more accurately, for an anti-ACLU meme. I've been trying to figure out how to respond. Perhaps I wouldn't normally have taken so long, but it's the first time I've been tagged for a meme, so I wanted to do it right.
I've decided not to play along. That decision is neither implicit praise nor denunciation of the ACLU, nor is it about Gosnell. It's about the nature of this blog.
Unlocked Wordhoard isn't a political blog; it's about the life of the nous. I don't shy away from political subjects when they are the secondary focus of a post; for example, I have spent some time debating with Gosnell about whether we wanted another lawyer on the SCotUS -- but that debate began long before even the Roberts nomination. I haven't bothered to state a position on Miers since this blog isn't about those policy decisions. If Miers does something interesting from a textual, medieval, or academic viewpoint, then I'll blog on her. Or, if she does something funny, I might also blog on her.
So.... sorry, but I must decline. The blogosphere has plenty of other political blogs for those who want cogent political debate and analysis. It doesn't, however, have as many blogs about thinking and reading. I don't want to let Unlocked Wordhoard's identity be obscured.
By the way, I reserve the right to change my mind and start blogging on political topics in the future.
Saturday, October 22, 2005
Textual Polyamory
I've been working on three book projects at the same time. One of them, Conflict in Southern Writing, is almost ready to go to the printer. My part in what remains is simply to do one last read-through.
Now that I've had a weekend of freedom from that book, I realize that trying to do three books at once on a 4/4 schedule was madness. It can't be done ... well, OK, I guess it can be done since I just did it, but it shouldn't be done.
It isn't like I was spending every waking hour working on these three books; rather, all three of them were flitting about the back of my mind in a blur. I find that in order to write (I mean write good stuff, not off-the-cuff stuff like blog entries) I need more than two hours. The first hour I need to piddle about with mindless tasks while I think about the writing, and the final hour I need to ease out of the writing to move on to the next task at hand. Thus, if I have two hours and ten minutes uninterrupted, I'll have about 10 minutes of really great writing time. When I can't get more than two hours blocked off without interruption, I can't write much of anything meaningful. On the other hand, if I can get, say, eight straight hours, that's six solid hours of good writing, even taking a break every 45 minutes.
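For the quantitatively inclined, here is a toy sketch of that arithmetic (in Python, purely illustrative -- the flat one-hour warm-up and one-hour wind-down are my own rough figures, not a law of productivity):

```python
def productive_writing_minutes(block_minutes, warmup=60, winddown=60):
    """Usable writing time in one uninterrupted block, on my model:
    the first hour goes to piddling about while I think, and the
    final hour goes to easing back out toward the next task."""
    return max(0, block_minutes - warmup - winddown)

print(productive_writing_minutes(130))          # 2h10m block -> 10 minutes
print(productive_writing_minutes(8 * 60) / 60)  # 8-hour block -> 6.0 hours
```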
To give you one real-life example of this, after doing all the research it was taking me forever to finish my dissertation. Finally, I gathered all my materials, some canned food, a sleeping bag, and a laptop, and rented a cabin out in the woods. Four days later I walked out having written the last two chapters -- the two chapters that needed (by far) the least revision. My dissertation director wrote me and said that I had finally found my voice.
Given that I need large blocks of time to concentrate, the more projects I have at once, the less I can get done on them. Now that I've finished my part on one of the books, I've moved on to editing an article for one of the other books. I had looked at the article before, of course, but I was surprised at how much more clearly I understood it this time. Having that extra book to edit didn't take up much actual writing time, but it took up a great deal of extra thinking time.
So, here's my resolution -- I resolve to never again work on more than two book-length projects at a time. [Did you like reading that split infinitive in the resolution as much as I liked writing it?] Total monogamous commitment to one book at a time might be more than I can realistically restrict myself to, but one book spouse with a book mistress on the side seems reasonable, even for a textual philanderer like me.
Friday, October 21, 2005
Buckypaper and the End of Print Literature
Scribal Terror has a link to this article in Science Daily about buckypaper. I first heard about buckypaper a few years ago, but in a much different (and more significant) context.
Though it is not mentioned in the article, one of the potential uses for buckypaper that I heard discussed in those days was to make buckybooks (sorry, I have no link -- I heard this long ago). The idea, as I remember it, was to have paper made entirely of buckyballs that were dark on one side and light on the other. The "off" setting for the balls would be for the light side to be facing out. When current was run through the buckyballs, they would flip over so that the dark side was showing. By selectively turning some on and turning some off, you could create writing on the page (much as a marquee sign does by turning lights on and off).
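If it helps to picture the mechanism, here is a toy simulation (in Python, entirely my own invention -- the real physics and addressing of buckypaper would look nothing like this) of a "page" of two-sided balls being flipped to spell out text, marquee-style:

```python
# Toy model of a buckypage: a grid of balls, each light ('.') or dark ('#').
# "Running current" through selected positions flips them dark, spelling text.

GLYPHS = {  # a tiny 3x5 bitmap font, just enough for a demo
    "H": ["#.#", "#.#", "###", "#.#", "#.#"],
    "I": ["###", ".#.", ".#.", ".#.", "###"],
}

def blank_page(rows, cols):
    """All balls light-side out: the 'off' setting."""
    return [["." for _ in range(cols)] for _ in range(rows)]

def flip(page, row, col):
    """Apply current to one ball so its dark side faces out."""
    page[row][col] = "#"

def write_word(page, word, top=0, left=0):
    """Flip the balls that make up each letter, like lights on a marquee."""
    for i, ch in enumerate(word):
        for r, line in enumerate(GLYPHS[ch]):
            for c, pixel in enumerate(line):
                if pixel == "#":
                    flip(page, top + r, left + i * 4 + c)

page = blank_page(5, 8)
write_word(page, "HI")
print("\n".join("".join(row) for row in page))
```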
The article talks about a lot of less important uses, like body armor, shielding airplanes from lightning strikes, etc. While all those things are nice, none of them will change the world. Buckypaper has the potential to change the world as significantly as Gutenberg's printing press.
Until now, electronic books have not really taken off. We are willing to access short texts on the internet, but longer texts we like in book form. The book has dominated literacy since it superseded the scroll, and it survived through both manuscript culture and print culture. The computer screen does not have the potential to supersede the book because it is too difficult to carry around and is uncomfortable on the eyes and hands.
Buckypaper could be what the e-book has been waiting for. Imagine you hold in your hands a book. It looks like a normal book, except perhaps there is a thin battery and USB port in the binding. When you open the book, you see that the pages are blank. Just page after page of nothing.
Now, imagine you go online and download a novel onto a jump drive. You insert the drive into the USB port, upload the novel -- and suddenly the words appear on the page. Now you have a novel in a comfortable, portable form.
But wait -- you are finished with that novel ... what do you do? Instead of putting it on the shelf, you would simply store the file on your personal computer and upload a different novel into your buckybook. In other words, you could store your entire personal library on your keychain, and have just one or two tabula rasa books on your shelf. The cost of printing becomes nil. Books with short runs are no longer more expensive than mass market paperbacks (indeed, pricing would probably be inverted). Bibles and other censored texts could be stored in files and wiped from the buckybook before any censoring officials saw them. It would also be possible for your book -- that sheaf of papers you have in your hand -- to become truly interactive, with links like a web page.
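None of this hardware exists, of course, but the workflow I'm imagining is simple enough to sketch. Everything below -- the Buckybook class, its methods, the stand-in "novel" -- is hypothetical, just a picture of the load-read-wipe cycle in Python:

```python
from pathlib import Path

class Buckybook:
    """A hypothetical reflowable book: blank pages until a text is loaded."""

    def __init__(self, chars_per_page=1800):
        self.chars_per_page = chars_per_page
        self.pages = []                      # blank -- page after page of nothing

    def load(self, path):
        """'Upload' a text from the jump drive and render it into pages."""
        text = Path(path).read_text(encoding="utf-8")
        self.pages = [text[i:i + self.chars_per_page]
                      for i in range(0, len(text), self.chars_per_page)]

    def wipe(self):
        """Blank the book; the file itself stays in your library."""
        self.pages = []

# Demo with a stand-in "novel" (a real buckybook would read from the USB port).
Path("sample_novel.txt").write_text("Call me a placeholder. " * 800, encoding="utf-8")
book = Buckybook()
book.load("sample_novel.txt")
print(f"{len(book.pages)} pages rendered")   # about 11 pages at 1800 characters each
book.wipe()                                  # finished? wipe it and load the next one
```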
Of course, buckybooks would be expensive at first, but I would think that a cost comparable to a laptop computer, say a few thousand dollars, would be marketable. Universities could require all students to use them and practically eliminate print books. If your local public library didn't have a book -- why not download it from the Library of Congress? Even rare print books could be scanned and turned into cheap facsimile editions, allowing for early training in manuscript paleography to take place without excessive cost or the risk of damage to manuscripts.
Sure, body armor and airplane coverings are fine, but I prefer applications that change the world.
Thursday, October 20, 2005
The Next Time I'm Arrested ...
... I hope my mugshot looks like Tom DeLay's. Say what you want about his politics -- if you're going to be arrested, this is the classiest conceivable way to do it. Heck, my driver's license photo looks more like this.
If DeLay is convicted, at least his family will have one really nice photo of him for the mantel. Do you think the Harris County Sheriff's Office does school photos, too?
A problem with online polls...
...is that they are so frequently hacked. Unlike Bainbridge, who apparently has never seen the brain-corrupting influence of Time magazine on freshman composition papers, I suspect Time readers ARE dumb ... though whether many dumb people subscribe to Time, or Time dumb-ifies smart people who read it, I cannot say.
Regardless, as of this posting, the ranking system gives The Watchmen as #1, Snow Crash as #3, and The Catcher in the Rye as #14. Um, The Watchmen isn't even a novel (no, changing the phrase "comic book" to "graphic novel" doesn't make it a sub-genre of the novel; it's a sub-genre of the comic book); Snow Crash is a book I read when I mis-remembered the title of White Noise, and found it disappointing enough that it doesn't even belong in the "Top Rated Science Fiction Stories in Paperback for under $10 that Prof. Nokes Has Read on Accident" category; and The Catcher in the Rye is high on my list of books that should be banned because they encourage self-indulgence among already-obnoxious teens ("Society just doesn't understand me, man.")
Surely not even Time readers are this dumb. People readers, maybe. It must be a couple of guys sitting in cubicles hitting the "vote" button over and over.
Coining Words: Blognoramus
The Gypsy Scholar has a post about coining a new word -- but he gets it wrong! Now, I've previously posted regarding the right of English professors to coin new words, but you can't hide your light under a bushel; after coining the word, you actually have to use it, or it doesn't count. So, Gypsy Scholar, you are halfway there ... you just have to use the word in public, and define it.
Here, let me demonstrate by coining a new word here:
blognoramus (blog.no.ra.mus) n. An ignorant blogger; someone who frequently blogs stupid things: Dr. Nokes might occasionally blog something interesting, but in general he is a blognoramus.
All done. New word neatly coined. See how easy it is?
[Editor's note: I had included a nice dictionary etymology at the end of the entry, but Blogger keeps mistaking all the odd marks for bad html coding. The limits of the medium, I guess.]
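If anyone knows a cleaner workaround, I'm all ears. My guess is that the culprit is the non-ASCII pronunciation marks, in which case converting them to numeric character references before pasting ought to keep Blogger happy. A quick Python sketch (the pronunciation respelling below is made up, purely as a sample):

```python
def to_character_references(text):
    """Turn every non-ASCII character into an HTML numeric reference
    (the schwa, for instance, becomes &#601;), so the raw post is plain
    ASCII and nothing in it can be mistaken for stray markup."""
    return text.encode("ascii", "xmlcharrefreplace").decode("ascii")

# Made-up pronunciation marks, just for the demonstration:
print(to_character_references("blognoramus (bläg-nə-ˈrā-məs)"))
# blognoramus (bl&#228;g-n&#601;-&#712;r&#257;-m&#601;s)
```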
Wednesday, October 19, 2005
It's Carnival Time!
Scribbling Woman is hosting the Teaching Carnival II, which is more about teaching in higher ed than anything else. Warning: I never read SW because her font is so tiny it gives me a headache. Use the biggest monitor you've got!
h/t Inside Higher Ed
National Review's Non-Fiction 100
Apparently, 'tis the season for book lists. National Review came up with their list of the top 100 non-fiction books of the 20th century.
Much of what is there, particularly at the top, is politically selected; for example, Hayek comes up twice in the top 10, as does Orwell (though Orwell may well deserve it). But Anne Frank as number 20? The comments attempting to justify her inclusion suggest that the compilers know full well it doesn't belong. Tom Wolfe, in my opinion, belongs on neither the fiction nor non-fiction lists.
Then there is God & Man at Yale ... it helps to have founded the magazine doing the rating. The Elements of Style would not have even come up on my radar screen, since I tend to ignore reference books, but it seems a good choice. What the heck is Silent Spring doing there ... one would think even The Nation would laugh that one off. And the Starr Report at number 100? That one just seems like an ironic joke.
Much better is the Random House list that inspired NR's, which makes NR's list even more inexplicable. After reading the Random House list, how could they have left off Henry Adams? Or T.S. Eliot? Or Lewis Thomas?
Ah, well, that is the fun of these lists -- grousing about what is missing.
Tuesday, October 18, 2005
Academic Stars vs. Superstars
"Well, I went to grad school with Robert Scholarson, and Bobby says..."
One of the on-going jokes here in my department has to do with academic superstars we have known. All of the junior faculty have met or worked with some academic superstars, and many of the senior faculty have as well, though by now that starlight has in some cases waned. A strange thing about academic superstars is that they tend to be total jackasses. Many of them are quite smart (though not all of them), but their habits of self-promotion, posturing, and cruelty earn loathing from scholars outside of the elitist clique (i.e., those who control a few organizations and publications, such as the Chronicle and the MLA).
One of my colleagues went to school with one of the more obnoxious superstars -- we'll call him Robert Scholarson (no, that isn't his name in some kind of code). One of the bonding moments I had with this colleague was when I mentioned a particularly loathsome essay by Robert Scholarson -- how I found it both stupid and offensive, a cheap exploitation of personal tragedy -- and my colleague mentioned that he had been to school with Scholarson. He then regaled me with tales of how obnoxious Scholarson is in person -- worse than I had thought.
Perhaps more galling is that, so far as we could tell, Scholarson (who has published a great deal) seems to have published very little actual scholarship. I asked my colleague how Scholarson became an academic superstar with such an anemic scholarly record. According to my colleague, Scholarson entered graduate school anointed, and from that point forward was groomed for great things. Scholarson has, reasonably, turned that grooming into a very successful career. I doubt anyone would begrudge him that if he didn't behave as if the good things in life resulted from moral superiority, rather than politicking and networking.
Daniel Boorstin once said that "The celebrity is a person who is well-known for his well-knownness." Since that quote lacks a certain poetry, we generally revise it to "famous for being famous." Robert Scholarson is an academic superstar because he is an academic superstar.
Now, this is the peculiar thing. I'm using Scholarson as an example because he is so thin in terms of scholarship -- he is not the scholar himself, but at best the scholar's son living off the father's inheritance (or in this case, dissertation committee's political pull). The run-of-the-mill academic stars, on the other hand, tend to be much nicer people than the academic superstars. Why would that be? Aren't stars superstars-in-waiting?
Probably not. When I look at the superstars, I find that they tend to have one really smart book in their career. After this first book, they are prolific, pronouncing on things about which they know little, but finding publication primarily because of that smart book. Superstars seem insecure in their position, likely fearing that they don't have another smart book in them.
The stars, on the other hand, tend not to have one really smart book. They tend to have a half dozen solidly smart books that, while they never caught fire, advanced their discipline. The stars, then, earned their positions, and maintain them by producing more work of the type that brought them stardom. As a result, stars tend to be more understanding of the shortcomings of others, because they themselves have struggled with writer's block, with ideas that stopped working in the middle of an article, with detailed and meticulous research. I've seen superstars brutally tear apart novice graduate students at conferences, but stars tend to be more supportive, offering suggestions for improvements.
I've tried to take a lesson from this. I'm neither star nor superstar, but in my own limited way I try to emulate the stars. Rushing after the latest, hippest thing might advance my career ... but is that why I got into academe in the first place? Fame doesn't interest me. Riches would be nice, but I'm less interested in them than in leaving a small-but-enduring legacy for scholars who come after me. I'd rather work slowly and produce work I can be proud of.
Whenever we want to say something pretentious, the joke around here is to say, "Well, I went to grad school with Robert Scholarson, and Bobby says..." I hope no one ever says that about me.
One of the on-going jokes here in my department has to do with academic superstars we have known. All of the junior faculty have met or worked with some academic superstars, and many of the senior faculty have as well, though by now that starlight has in some cases waned. A strange thing about academic superstars is that they tend to be total jackasses. Many of them are quite smart (though not all them), but their habits of self-promotion, posturing, and cruelty earn loathing from scholars outside of the elitist clique (i.e. those that control a few organizations and publications, such the the Chronicle and the MLA).
One of my colleagues went to school with one of the more obnoxious superstars -- we'll call him Robert Scholarson (no, that isn't his name in some kind of code). One of the bonding moments I had with this colleague was when I mentioned a particularly loathsome essay by Robert Scholarson -- how I found it both stupid and offensive, a cheap exploitation of personal tragedy -- and my colleague mentioned that he had been to school with Scholarson. He then regaled me with tales of how obnoxious Scholarson is in person -- worse than I had thought.
Perhaps more galling is that, so far as we could tell, Scholarson (who has published a great deal) seems to have published very little actual scholarship. I asked my colleague how Scholarson became an academic superstar with such an anemic scholarly record. According to my colleague, Scholarson entered graduate school anointed, and from that point forward was groomed for great things. Scholarson has, reasonably, turned that grooming into a very successful career. I doubt anyone would begrudge him it if he didn't behave as if good things in life resulted from moral superiority, rather than politicking and networking.
Daniel Boorstin once said that "The celebrity is a person who is well-known for his well-knownness." Since that quote lacks a certain poetry, we generally revise it to "famous for being famous." Robert Scholarson is an academic superstar because he is an academic superstar.
Now, this is the peculiar thing. I'm using Scholarson as an example because he is so thin in terms of scholarship -- he is not the scholar himself, but at best the scholar's son living off the father's inheritance (or, in this case, the dissertation committee's political pull). The run-of-the-mill academic stars, on the other hand, tend to be much nicer people than the academic superstars. Why would that be? Aren't stars superstars-in-waiting?
Probably not. When I look at the superstars, I find that they tend to have one really smart book in their career. After this first book, they are prolific, pronouncing on things about which they know little, but finding publication primarily because of that smart book. Superstars seem insecure in their position, likely fearing that they don't have another smart book in them.
The stars, on the other hand, tend not to have one really smart book. They tend to have a half dozen solidly smart books that, while they never caught fire, advanced their discipline. The stars, then, earned their positions, and maintain them by producing more work of the type that brought them stardom. As a result, stars tend to be more understanding of the shortcomings of others, because they themselves have struggled with writer's block, with ideas that stopped working in the middle of an article, with detailed and meticulous research. I've seen superstars brutally tear apart novice graduate students at conferences, but stars tend to be more supportive, offering suggestions for improvements.
I've tried to take a lesson from this. I'm neither star nor superstar, but in my own limited way I try to emulate the stars. Rushing after the latest, hippest thing might advance my career ... but is that why I got into academe in the first place? Fame doesn't interest me. Riches would be nice, but I'm less interested in them than in leaving a small-but-enduring legacy for scholars who come after me. I'd rather work slowly and produce work I can be proud of.
Whenever we want to say something pretentious, the joke around here is to say, "Well, I went to grad school with Robert Scholarson, and Bobby says..." I hope no one ever says that about me.
Monday, October 17, 2005
Rosa Parks & Reading
I received this e-mail from the chair of Troy University's English Department at the Montgomery Campus, Kirk Curnutt:
As you know, the fiftieth anniversary of Rosa Parks' arrest is only weeks away. To that end, we would like to make you aware of an exciting event on the Montgomery campus that is taking place on Wednesday, October 19, 2005, in the Rosa Parks Library and Museum Auditorium. Troy-Montgomery's A&S division is hosting a reading and book signing by Donnie Williams and Wayne Greenhaw, authors of "The Thunder of Angels: The Montgomery Bus Boycott and the People Who Broke the Back of Jim Crow." The event begins at 7 p.m.
I won't be able to make it, but if you are going to be in Montgomery and would like to go to a reading, this one sounds good. If you've never been to a book reading before and would like to attend this one, it is usually not necessary to buy the book in advance; they typically sell them right where the authors are signing them.
By the way, I also recommend Curnutt's book, Baby Let's Make a Baby, though it has nothing to do with Rosa Parks.
Saturday, October 15, 2005
August Wilson, RIP
Though I noted it in the news about two weeks ago, I missed mentioning here the passing of August Wilson. At a time when the political pressures on black playwrights writing about black characters were to produce very narrow art that, at best, is accessible only to a certain type of person at a certain period of time, or, at worst, uses blackness as a gimmick to pass off inferior works -- at that time, Wilson produced work that was remarkable in both its particularity and its transcendence. Wilson understood the importance of the WORD, and refused to allow either posturing or pandering to overwhelm his art.
As a result, I suspect that in 2105, somewhere in the world August Wilson will still be playing (probably Fences), and deservedly so.
Lacking Style -- or at least some of its Elements
Althouse has an interesting post (and thread of comments) about the new Elements of Style and this article regarding it.
I'm with them ... and leaving the reviser's name off the title suggests that the editors are intentionally trying to pull a fast one. Perhaps next they'll silently revise other books by dead authors, such as...
Hamlet (now with 25% more ghosts!)
Tale of Two Cities (removing all that ambiguity in the opening)
Paradise Lost (with a new ending, in which Adam and Eve go after Satan with AK-47s)
Pride and Prejudice (car chase thrown in for dramatic effect)
Moby Dick (the whale is revised from boring white to black with red racing stripes)
The Scarlet Letter (in which the "A" is quietly changed to a Harry Potter lightning scar)
The Bible (all that stuff about Moses and Jesus removed -- they didn't test well with the focus groups)
Wheelock's Latin (all gendered inflections omitted)
Robinson Crusoe (given a "Survivor" twist, in which Friday gets voted off the island)
Lord of the Rings (Hobbits replaced by Bratz dolls, with hip new illustrations)
Everyman (the sexist title changed to "Everyindividualregardlessofgenderincludingthetransgendered")
The Book of the City of Ladies (abridged to "The Pamphlet of the City of Ladies")
In Favor of Banning Books
The blogosphere has been full of book banning memes all month, designed, so far as I can tell, as opportunities for bloggers to posture, to praise their favorite books, or (if lucky), both. Much of this has been driven by the American Library Association which, truth be told, is not among my favorite organizations, but my distaste for them lacks enough intensity for me to complain overmuch.
I would like to come out in favor of banning books. Most book-banners want to censor material because it is morally or politically objectionable; I, on the other hand, wish to ban certain materials because they are aesthetically displeasing, yet have somehow become so over-rated that they cannot seem to fall out of print. Clearly, the only way to save humanity from these foul ditches of crappy prose, junior-high poetry, contrived plotlines, vacuous posturing, and ridiculous characters is to burn these books in the public square and strike any record of them from all bibliographies, card catalogues, and databases ... finally salting the earth of any location contaminated by them.
A few of these aesthetically objectionable books, in no particular order:
I Know Why the Caged Bird Sings (which, I'm delighted to note, is already in the top 100 most challenged)
The Handmaid's Tale
The Collected Works of John Grisham (not really lit, but I hate him so much I had to include him)
Walden
Stranger in a Strange Land
The Awakening
The Collected Works of Tom Wolfe
All H.G. Wells except The Time Machine, War of the Worlds and The Invisible Man
The Catcher in the Rye
On the Road
A Separate Peace (OK, I don't remember this one well, but since my memory of reading it in high school has been repressed, it couldn't have been pretty)
Jude the Obscure (not obscure enough)
Lady Chatterley's Lover
The Collected Works of Ayn Rand
The Crucible
The Pilgrim's Progress
Billy Budd (and I love Melville so much, too. Why does this piece of garbage always get anthologized? Possibly his worst work)
A Doll's House
The Collected Works of Pablo Neruda (Don't get me started...)
Things Fall Apart
The Collected Works of Toni Morrison except Song of Solomon (which is so good, it redeems all her other stuff ... Toni, come back to us!)
Germinal
The Showings of Julian of Norwich (See? Something medieval!)
The Razor's Edge
Ragged Dick (and by extension everything else by Alger)
Looking Backward
... etc. There should be something on that list to displease everyone. By the way, if you plan to comment, I'd prefer you didn't defend one of the books on the list (an impossibility, since all are indefensible) -- instead, add your own books for us to ban.
h/t Poliblog and Public Brewery
Friday, October 14, 2005
OK, one more short before grading papers...
This pair of articles on the front page of The Nation online puts the magazine in the strange position of tacitly advocating violence among left-wing college students, while explicitly denouncing right-wing college students for not being violent enough.
Me? I am against student violence, but in favor of faculty violence. In fact, I'm going to commit a metaphorically violent act against a freshman composition paper right now.
Gender Politics in Academe
For my non-academic readers who mistake faithful descriptions of academic life for hyperbolic satire: This is not a joke.
By the way, if the participants had any medieval training, they would have used the OE hit, a gender-neutral pronoun. Do you suppose Cousin It from the Addams Family got his name from one of these sessions?
Thursday, October 13, 2005
Tolkien Webliography
I just got an e-mail about Tolkien: A Webliography, a collection of online Tolkien resources in English and Italian. It appears to be in progress, but it should become more exhaustive over time.
Suspense and the English Professor
One of the professional hazards of being an English professor is the problem of suspense. It is very, very hard for me to be surprised in a story. When you deal with narrative enough, the details almost don't matter; the pattern is clear.
My wife and daughter hate to watch Monk with me, because I always know whodunit before the second commercial break, and often even before the first. They find it especially frustrating because I know even before there are any clues presented, primarily because the "clues" I notice are narrative clues. Even though I don't tell them whodunit, I will either stop paying attention or simply leave the room, which really frustrates them. It is equally frustrating to me, though, since mysteries hold little excitement for me.
When I went to see The Sixth Sense with a friend, I remember seeing Bruce Willis gutshot, then sitting on a park bench. I leaned over to my friend, already a little angry that the film was falling into place for me, and muttered, "If it turns out that he's dead through this whole thing, I'm gonna be peeved" [OK, I didn't say "peeved," but you get the point]. You can guess how I felt at the end of it.
Stargate: SG1 will always dwell in a good place in my heart because of one particular episode. A major character has apparently been killed, but this is an obvious red herring for a less drastic change to the plotline. The narrative clues all lined up to suggest that we were to think Character A had died, when in fact it was Character B who died. I almost turned it off, and probably only watched the episode through to the end because the remote control was out of reach. Then, LO! In fact, the entire Character A and Character B plotline had turned out to be a red herring atop a red herring (a whole school of herring), and Character C was dead! Hooray! Someone managed to fool me!
But this Stargate episode is a real rarity. I always know, as do most other English professors. So, when I received a reply on an old post about Battlestar Galactica, I found myself unable to respond. The commenter asked:
And replacement or homemade, why is a junior lieutenant a cylon under your argument?
The truth is that I can't explain it, except to say that it fits the narrative mold. Narratives move in predictable ways because, if they don't, they stink (archetypal critics have an explanation for this). I'm not arguing that all the clues point in that direction; I'm arguing, I suppose, that it would be a better narrative if he were. Since the writers have so far avoided bad science fiction writing, I expect them to produce good narrative before tricky or efficient narrative. D'Anna Biers, for example, did not simply have the possibility of being a cylon -- either she (or her cameraman) had to be a cylon for the narrative to work.
So here's the deal: I can't argue for it without putting you through years of study as an English major. But even if I could do so, I wouldn't ruin it for you. As a savvy consumer of culture, you already know how predictable (and dull) romantic comedies can be (Guess what? They get together at the end!) or action adventure films (Oooh, the hero survived! What a surprise!) or horror films (The monster wasn't really destroyed, and could now return for a sequel? Shocking!). Really understanding how narrative works is like already knowing all the spoilers for every story you will ever encounter. You should enjoy the experience, instead.
Wednesday, October 12, 2005
Wayne Booth, RIP
Wayne Booth died on October 10th. The only thing of interest I have to add is that when I was a freshman, I had a professor who mentioned Booth so often in class that I thought Wayne Booth was a professor at my school.
Tuesday, October 11, 2005
... and coming out smelling like roses.
Yes, falling into a big pile of poo and coming out smelling like roses appears to be what is happening to Dan Drezner. After being denied tenure, his high profile has resulted in a storm in the blogosphere (by last count 232 comments on his website alone), along with a story in the New York Sun and a link to that article in Arts & Letters Daily.
Now, in all fairness to the U of C, no one outside of their institution has any idea why he was denied tenure; I know I certainly don't. Frankly, people outside of academe might be surprised -- or shocked and appalled -- to find out some of the legitimate and trivial reasons people are denied tenure. For all we know, Drezner was sleeping with all his colleagues' wives (or husbands), or was impossible to get along with, or denigrated the work of his colleagues in the classroom, or had been discovered as a plagiarist, or was embezzling sums of money, or was failing/passing students at an unacceptable rate, or had chronic halitosis ... well, you get the idea. Or, of course, it could have been all about the blog. If U of C is like any of the other schools I've worked at, we'll never know the reason, and neither will he.
Regardless of the reason, being denied tenure, sometimes the end of an academic career, seems to have pushed him into the limelight. Drezner has managed to do the impossible in academe: he has attained nearly Ward Churchillian notoriety without being a monumental jackass.
So, a request to the University of Chicago: My CV is available here. Please hire me, then feel free to deny me tenure. If it goes anything like it has for Drezner, I won't mind even a little bit.
SCotUS and Intellectual Provincialism
A while back (before any nominations had been made), I mused that it might be better to pick a non-lawyer for the Supreme Court because the limited training of the Court assures incompetence in dealing with issues of textual critique (which is, at its heart, what Constitutional law is), philosophy, economics, etc. As the Supreme Court hasn't restricted itself to purely legal matters since Holmes (and probably before), we are ill-served by it. Scott Gosnell took exception to this, arguing that law should be left up to legal experts, and that the Court should be restricting itself to legal matters anyway. My original post was more of a random thought, but the more I thought about it, the more I began to suspect that the Court is now and has traditionally been incompetent to deal with matters beyond law. I now suspect that judicial restraint is only partially a protection against tyranny; it is more a protection against incompetence.
In the recent debate about GOP Ivy League elitism, though, I have begun to suspect that the Court may be incompetent intellectually as well. Oh, I'm sure the justices are perfectly smart people who got good grades, but they are suffering from an intellectual provincialism. Most of the justices attended Harvard Law during the same ten-year period ('56-'66). Those who didn't include the late Rehnquist, who attended Harvard until 1950; O'Connor, who attended Stanford (at the same time as Rehnquist); Thomas, who attended Yale; and Stevens, who attended Northwestern. Roberts is another Harvard alumnus; he attended in the late '70s.
No, I'm not arguing that Harvard is a bad school. I've known some real idiots from Harvard, but I've also known some really fine minds. Let us, for the sake of argument, accept the premise that Harvard is not only the best school in the world, but the best school in the history of the world [Note: Stop objecting, you! It's just for the sake of argument!].
Even were it the best school ever, it seems to me harmful to have the SCotUS made up primarily of people from this one place. The result is, I think, a kind of intellectual provincialism, in which all ideas are absurdly framed by the very limited shared perspective of one school. The phrase "every educated person knows..." takes on the meaning of "this is something that was repeated every semester by one professor at my school." Since most of the justices went to the school at the same time, they were influenced by the same tiny handful of professors. Why are legal arguments today framed the way that they are? I suspect that the reason has little to do with the merits of those paradigms -- rather, it probably has more to do with who was a professor of law at Harvard in the late 50's and early 60's.
Let me offer you an example from Troy University. For some reason, every undergraduate here knows Plato's Allegory of the Cave. I've never taught it, but all my students seem to know it, regardless of major. Somehow it fills the intellectual air here. Every semester in every literature class, some student can be counted upon to raise her hand and say, "it's just like Plato's Allegory of the Cave!" Everyone here has read Homer, knows the basics of the Trojan War (for obvious reasons) and knows the Allegory of the Cave. Yet they almost never know anything else about Plato.
Now, imagine that these same Troy students went on to become Supreme Court justices, and in fact, constituted a majority on the court. They would naturally think that the most important thing Plato wrote was the Allegory of the Cave -- not because anyone told them it was the most important thing, but because "everyone else" only knows that same thing.
The Rehnquist court was very, very narrowly educated, with only five schools supplying the education for nine people. I've often wondered why the Court seemed so intellectually provincial, so incapable of thinking past certain obvious boundaries. The battles in the Court were always framed in such limited ways, such as Original Intent vs. the Living Document. Haven't they an inkling of simple textual scholarship? I would wonder. The answer, I think, is that they probably don't, because their educations are of such a limited field.
So, was Roberts all that brilliant when he appeared before the Court, or was he simply familiar with the shared Harvard provincialism of the justices?
Monday, October 10, 2005
What to Do with a Big Idea?
I was up until past 3AM last night, owing to the fact that I graded 40 freshman composition papers in one day (!) and every time I closed my eyes, that prose assaulted my mind. The good news is that they are getting better.
Sometime between 2AM and 3:30AM, I had a Big Idea. You know the kind of ideas I am talking about: the kind that strikes you as brilliant in the dim mind of the night, but melts under the harsh daylight of a well-rested mind. The funny thing is that when I woke up, the idea seemed even better.
When I say a Big Idea, I don't mean something like, "Hey, gang, let's have a bake sale!" or "Maybe I'll start eating right and exercising more." I mean an idea for changing the way we think about medieval literature -- a suggestion for an entire paradigm shift. The kind of idea that really cheeses people off, which, if poorly expressed, makes the thinker come off as a crackpot.
So, I've been trying to figure out what to do with this idea. My wife suggested I write an article and send it to Speculum (an old and important medieval journal). An early modernist friend of mine suggested I share the idea with a few like-minded friends and put together a panel at the Medieval Congress (yes, there is such a thing). Another friend suggested I write up the article and send it to one of my mentors for advice. Another suggestion was that I try to turn it into a book.
I'm not trying to be cutesy and teasing here ... I just don't want to reveal the idea before it either ripens to wine or turns to vinegar. The question is: what do you do with a really big idea? The truth is, I'm not sure how to proceed. Any help from the wise among you?
Sunday, October 09, 2005
Lest You Think a Blog Will Save Your Career
In a backlash against the Ivan Tribble nonsense, some academic bloggers have been trying to make the claim that blogging will help your career. I'm pretty skeptical of both claims -- that blogging hurts your career and that it helps; I suspect it is a wash.
As evidence that blogging doesn't help much, Dan Drezner was denied tenure last week. I can't claim to know the particulars, though looking at his CV, one would think a couple of books and two dozen papers would be good in Poli Sci. Regardless, since his blog is consistently in the number one position in the Truth Laid Bear Academy community, and in the top hundred blogs in the world, this would seem to be evidence that having a well-read and influential blog won't assure you tenure and promotion, at least at the University of Chicago.
Will U of C eventually regret its decision? It should ... but universities aren't really known for their introspection (we specialize in deep thinking about OTHER people). Should the untenured or unhired panic and delete their blogs? No ... but doubtless some will. Will Drezner land on his feet? You can bet the farm on it.
Saturday, October 08, 2005
In (partial) defense of binaries
Every time I vow I'm not going to comment on Drout's next post, lest this all become an absurd insider conversation between philograms, he posts something I've got to comment on. And so it goes...
Drout was complaining about the simplistic use of binaries in literary theory, particularly deconstructive types, and his complaints are justifiable. My favorite passage:
Likewise I think it is all too easy to go out and spear a few binary oppositions and then convince yourself that you've helped to expose the unworkable logic of whatever evil system that you're trying to undermine. The whole process now just makes me uncomfortable: I feel like we're waiting for someone to pop out of the bushes and yell "tautology!!!" (well, that's how we played it where I grew up).
I love that because it is absurd and an apt description of what graduate school can be like. If you draw from that comment the suspicion that graduate school can be absurd, well... *cough*
Let me just say this in partial defense of binaries, though. As Earthlings, we may be hardwired to think in binaries. When we look up in the sky, we see two lights, one bright, and one dim. Nevertheless, we see two. And, from the perspective of Earth, those two lights appear to be the exact same size. An eclipse only underscores the apparent equality of these two lights, suggesting that they are in equal opposition to one another, perfect celestial binaries.
Now, think about the other planets:
Mercury -- No moons, just one really huge, blazing sun.
Venus -- Ditto
Earth -- Well, you know.
Mars -- Two lumpy moons of different sizes
Jupiter -- Dozens of moons, though some are too small to be seen from the "surface," I think
Saturn -- Ditto
Uranus -- Ditto
Neptune -- 13 moons, 8 of which are significant enough to have names
Pluto -- Only one moon, but it is more like a double planet system
We Earthlings are in what appears to be a pretty unusual situation. One moon in stable orbit, of apparent relative size equal to the sun. Going waaaaay back to primitive man, should we be surprised that we have a tendency to divide our immediate reality in the same way?
Taking this one step further, consider medieval Christianity ... or any Christianity, for that matter. Think of how many heresies had, as part of their cosmology, a challenge to the ONE god with THREE aspects of Trinitarianism, preferring instead a dualistic system of one type or another: Albigenses, Gnosticism, Nestorianism, Manichaeism, and probably a bunch of other -isms that slip my mind at the moment. Even the Christian folklore regarding the Anti-Christ is part of this dualism, as the Anti-Christ is not generally regarded as the many anti-christs described in the epistles of John, but rather as an equal and opposite of Christ -- a moon versus the sun.
Us vs. Them, East vs. West, Alabama vs. Auburn, Jesus vs. Santa, God vs. Satan, Freedom vs. Tyranny, Rich vs. Poor, Male vs. Female, Dry vs. Wet, Tom vs. Jerry ... I'm not sure how many of these would stand under close scrutiny as useful binaries. We're steeped in them, though.
Are binaries particularly useful ways to think about literature? Generally not, I think. Let's cut people some slack, though -- maybe we just can't help ourselves.
Sir, the 21st Century Welcomes You
While this piece isn't all that interesting, it does have a remarkably anachronistic statement, in which it suggests that Iraq "could become South Korea, an authoritarian state on the road to future prosperity."
Was this piece written 20 years ago? He can't possibly be referring to the South Korea of the 21st century. The last three presidents, from Kim Young-sam to Roh Moo-hyun, have all been opposition figures. It is something like the 11th largest trading nation in the world. If South Korea is authoritarian and prosperity is only in its future, what then is North Korea? A land incapable of sustaining plant or animal life?
Stop watching the MASH re-runs (it was never that funny anyway), and pick up a newspaper printed since 1988. If Iraq became like South Korea, it would be success beyond our fantasies.
Insults, Evangelicals, and Catholics
Blogenspiel has a gripe about NPR saying that Harriet Miers was Catholic before she converted to Christianity, as if Catholicism isn't a form of Christianity. One of her commenters chalks it up to "falling into the evangelical rhetoric."
Perhaps the problem is that NPR reporters aren't familiar with Christianity in any form? I've noticed that they tend to mis-state or seriously mis-understand basic doctrines, tend to use the terminology of Christianity in awkward ways that imply a lack of familiarity, and tend to use the word "Christian" as a signal of disapproval.
So, though I didn't hear the story, given that it was on NPR, I suspect the reference to evangelicals as "Christians" was not intended to be a concession to evangelical rhetoric, but was instead a signal to the audience that evangelicals are the "bad guys" (which would put Catholics in the unusual position of being good guys, or at least the lesser evil on NPR).
Of course, they like to use "medieval" as an insult too, so whenever they refer to the Catholic Church as medieval, I always think of Merry's reaction when the Sackville-Bagginses accuse Frodo of being a Brandybuck rather than a Baggins:
"Did you hear that, Merry? That was an insult, if you like," said Frodo as he shut the door on her.
"It was a compliment," said Merry Brandybuck, " and so, of course, not true."
Wednesday, October 05, 2005
Translation awkwardness, or joke?
I'm not sure whether this is an intentional joke in the original language, or a problem of translation, but my medieval literature class found it quite funny when I inadvertently read aloud this double-entendre from the Romance of the Rose in class:
"Moreover, I do not consider you courteous when just now you named the testicles to me; they are not well thought of in the mouth of a courteous girl."
"Moreover, I do not consider you courteous when just now you named the testicles to me; they are not well thought of in the mouth of a courteous girl."
Tuesday, October 04, 2005
Academic Blogging: Officially Un-Cool
Well, it's official. Academic blogging is officially un-cool -- The Chronicle of Higher Education has published a favorable article on academic blogging. The effect is like when you were a kid and heard your parents use one of your slang terms; when even parents know the term, it has become un-cool, and wise kids stop using it.
Of all the reasons to blog, one cited by Henry Farrell is one I never considered:
Academic blogs should be especially attractive to younger scholars, to whom they give an unparalleled opportunity to make their voices heard. Cross-blog conversations can turn the traditional hierarchies of the academy topsy-turvy. An interesting viewpoint expressed by an adjunct professor (or, even more shocking, an "independent scholar") will almost certainly receive more attention than ponderous stodge regurgitated by the holder of an endowed chair at an Ivy League university. Prominent academics who start blogging do have an initial advantage; they're more likely to attract early attention than people without established reputations.
Reputation? Can a blog really help a younger scholar develop a reputation ... as anything other than a crank, I mean? To be honest, I've found that the attention I've received is primarily from non-academics. Am I going to start receiving unsolicited job offers and requests for me to develop blog entries into articles?
The other interesting aspect of the article is that it primarily deals with the professional aspects of blogging. I'm wondering if it isn't time to spend more time talking about the blog as a sub-genre of the essay or miscellany. Blogs are interesting in that they are sub-literary, but completely in the realm of the literate. Indeed, one has to be traditionally literate and techno-literate to even access them. Into this mix throw the rhetoric of orality. Now that thinking about the value of blogs per se has become un-cool, maybe it is time to start thinking about how they function as a genre.
Monday, October 03, 2005
Polygamous Love Literature
In medieval literature today, we talked in a very general manner about love, sex, and marriage in art. One interesting thing we realized is that even literature from polygamous societies does not have polygamous love stories. We have stories in which there is polygamy and conflict among wives, or stories in which a shrewish wife is a barrier to taking a new lover, or even stories (like David and Michal or Solomon and the Queen of Sheba) in which there is love between two lovers, but other lovers are curiously absent from the picture. This seemed a significant absence to me, but I'm not sure of its importance.
Some possible exceptions to the rule might be the Arthur-Guenevere-Lancelot triangle (though that becomes doomed as soon as it is made public), some of the stuff in the Tale of Genji (though, again, much of that seems to be behind closed doors), or maybe the Odyssey (in which the goddess appears to have a crush on Odysseus, yet helps him get home to his wife). Still, none of these seem quite right. Are there some obvious examples I am forgetting?
Cell phones in class
This really doesn't have anything to do with the life of the nous, but a friend insisted that I blog on this subject.
Naturally, I hate cell phones in class, so I have developed a couple of methods of dealing with them:
If a cell phone rings, I insist the student give it to me, then I answer it. I generally come up with a cock-and-bull story for the person on the other end of the line, such as claiming that I am the jealous lover of the student who wants to know who would be calling my "snookums," or that the student has been called away on a secret mission for the government, or that I am the personal secretary for the student and am now screening calls to keep the riff-raff out. Generally speaking, the student is so embarrassed that neither he nor his classmates risk letting a cell phone ring in class again.
If the student hangs up first, or if the student is annoyingly playing with the cell phone, distracting the class with a beeping Solitaire game, I have a couple of other methods. One is to use the confiscated phone to call one of my own friends or relatives and discuss some pointless issue. This method, however, cannot be used too often, as it is limited to the friends and relatives whose numbers I know by heart and who will be home in the middle of the day. The second method (and, up until now, my usual one) is to call some random person in that student's phone book and complain about how they are using their cell phone in class. This method works at its very best if there is a phone book entry that reads "Mom." Oh, yes ... they don't like it when I call their mothers on their own phones to complain that they are playing with their cell phone in class.
Last week, though, I hit upon a new method when I confiscated a camera phone. This time, I took a picture of myself pointing at the phone, shouting, "Turn off your phone in class!" Then I set it as her wallpaper. The next time she flipped open her phone, she saw a deranged professor pointing at her angrily. Maybe that'll keep the phone off.
Saturday, October 01, 2005
Leaves of Yggdrasill
On the Old English charms front, Edward Pettit, editor of Anglo-Saxon Remedies, Charms, and Prayers from British Library MS Harley 585: The 'Lacnunga', has a new website devoted to his work on the Elder Edda and the OE charms.
Except for the incorrect spelling of my name, it looks useful.