Archive for Culture
What’s with the Washington Post totally missing the smack-you-in-the-face obvious point of Atlas Shrugged, a novel routinely dismissed by its critics as stilted, didactic, and puerile?
I’m not trying to defend the book from these charges — it is, indeed, all of these things on one level or another. Nor am I defending Objectivism or Rand herself.
But the Post seems intent on allowing its writers to pen screeds against the book and its author while missing the underlying point.
Last week, film critic Mark Jenkins gave us a review with this leap of logic:
The bullet-train theme is somewhat ironic. A roaring locomotive is a dynamic image of American industrial power, but even in 1957 — when the book was published — the future of railroading was in Europe and Asia. And the right-of-center types who revere Rand tend to dismiss public funding for high-speed rail.
And then today, Bush speechwriter Michael Gerson gives us this gem:
But Rand’s distinctive mix of expressive egotism, free love and free-market metallurgy does not hold up very well on the screen. The emotional center of the movie is the success of high-speed rail — oddly similar to a proposal in Barack Obama’s last State of the Union address. All of the characters are ideological puppets. Visionary, comely capitalists are assaulted by sniveling government planners, smirking lobbyists, nagging wives, rented scientists and cynical humanitarians.
Setting aside the general inanity of Gerson’s entire column, ably dismissed here by Jesse Walker, the question remains: What do you people not get about the point of this book?
Hint: it’s not about high-speed rail.
It’s about human achievement and a couple of characters making happen something the rest of the world has deemed impossible.
It’s not about the benefits of funding rail transit from the public purse. It is in no way “ironic,” nor is it “oddly similar” to any proposal from any elected official anywhere in the world.
The critics who assail Atlas Shrugged for its over-the-top didacticism, one-dimensional characters, and simplistic philosophy (charges I won’t defend it against) should be sure they understand the underlying point. You know, the one that they criticize Rand for making too obvious.
And it’s not a meditation on the utilitarian benefits of different means of passenger transport.
This week marks Twitter’s 5th birthday, and Thursday, along with @adamthierer, I’m teaching a little introductory seminar at work on how to use the service. It’s a boon to anyone whose job revolves around consuming and producing ideas and information, so it should be a no-brainer that most people working in policy should be on it. But any time the subject comes up, skeptical rumblings resound.
Some of the folks we’ll be talking to are Twitter veterans who may be looking to share their own experience or pick up a pro tip. Others are new to Twitter: they’ve recently opened accounts, or don’t have accounts but are eager to learn. But then there are the folks, a bit older I have to say, who just aren’t interested.
Some are plain dismissive, in the “What do I care what someone had for breakfast” vein. Others seem overwhelmed and look at Twitter as one more damned thing they have to learn and manage. To them it’s a burden, not a benefit. Here’s a comment from an old post of Tyler’s on the same subject:
Personally, I dislike twitter because it becomes yet another thing that requires upkeep and saps attention from other projects.
There are only so many hours in the day, and I find social networks/e-mail/blackberries jarring and distracting. It outweighs any benefits I can imagine.
Looking back at what I first wrote about Twitter three years ago this month, I too was skeptical at first. Once I started using it, though, there was no looking back. It’s interesting that I wrote that I had “started to force myself to use Twitter to see if I can discover why people find it so compelling.” I guess only then did it seem obvious.
So how does one get skeptical folks to try it? Should one?
To the dismissive folks, I think the key is to explain that Twitter is a tool and it therefore can be used for good or ill. It can be used to only follow pop divas, or it can be used to follow the news, spread ideas, and have debates with other academics. I’m less sure what to say to the folks who answer, “Sure, but I already do that over email, research papers, op-eds, live debates, etc.” Simply answering that this is the new thing is not enough.
I think the immediacy of it is part of the answer, but that just further conjures up the image of another info-torrent one has to deal with. I think one way to answer is that just as Twitter has come on the stage as something new to deal with, mail, faxes, and even telephone calls have exited the stage. More importantly, though, is that Twitter is the kind of beast that doesn’t lend itself to an accurate personal cost-benefit analysis until one has used it. Its value is not easily understood from the outside.
So a little help, please. How do you take a 50-year-old who doesn’t use RSS feeds and get her to monitor a Twitter client? Is it even advisable?
Tyler points out another instance of the perceived-handicap-is-actually-an-advantage meme of which he is the master. In this case it’s artist Chuck Close, whose prosopagnosia may contribute to his amazing portraiture skills. (I saw the excellent Close exhibit at the Corcoran last weekend. No reason for you to know that, just signaling.)
Prosopagnosia is often referred to as “face blindness.” It is the inability to recognize or remember people’s faces. Now, if you’re like me, you don’t have prosopagnosia, but you have a terrible memory, especially for people you’ve met. In my line of work I meet lots and lots of people. I might meet someone at a reception, shake their hand, and chat for a long while, then a month later meet them again and have no recollection of who they are. This is a social faux pas, and because it often happens to me, I am likely considered a heel and inconsiderate.
If I told you, however, that I suffer from prosopagnosia, you would feel, if not exactly pity, understanding. In fact, this happened to me recently. At a party earlier this year I said hello to someone I had sat across from at a dinner party not long before. He apologized, told me he didn’t know who I was, and explained that he has prosopagnosia. A lightbulb went off in my head, and you can see where this is going.
If I told someone, “I’m sorry, but I don’t like social situations and I have a terrible memory, so I don’t remember you,” I would be a jerk. This is so even though I’m exhibiting the same symptoms as prosopagnosia. But if I said I had the condition, all would be excused and the person would gladly remind me of how we had met, which would be appreciated. So we excuse a behavior when its cause is some diagnosed brain damage, but condemn it when its cause is just the way we’re wired.
Question time: How ethical would it be if I started claiming that I had prosopagnosia? (Not even a little bit, huh?) Is there a name for some other lesser condition that describes general social anxiety and terrible memory? Can we coin one?
So the longest Wimbledon match in history is currently underway. As I type, Isner and Mahut are tied 56-56 in the last set; they’re tied 2-2 in previous sets.
According to reports (and common sense), the two men are exhausted and grimacing with each serve (though they seem to have rallied when they crossed the 10-game threshold). My guess is that their arms will be completely shot for the next few days. As a result, whoever wins this match will be at a severe disadvantage going into the next round, and will probably be so exhausted that he will lose. In other words, assuming winning the tournament is the goal, and one that’s much more prized than merely advancing to the second round, it’s a classic prisoners’ dilemma. The dominant strategy for each player is to keep playing, even though every game further diminishes his likelihood of winning the next round, and therefore takes him out of contention for an overall win. Every additional game that Isner and Mahut play decreases their chances of making it to the third round and thus reduces their combined expected utility.
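The expected-utility claim can be put into a toy model. This is a minimal sketch with invented numbers: the baseline win probability and the per-game fatigue penalty are illustrative assumptions, not real tennis statistics.

```python
# Toy model of the dilemma: the longer the final set runs, the worse the
# match winner's chances in the next round. All numbers here are invented
# for illustration, not drawn from real tennis data.

def next_round_win_prob(extra_games, base=0.6, fatigue_per_game=0.004):
    """Winner's chance of taking the next-round match, eroded by fatigue."""
    return max(0.0, base - fatigue_per_game * extra_games)

# Compare hypothetical capped final sets with an uncapped marathon.
for games in (20, 60, 120):
    p = next_round_win_prob(games)
    print(f"{games:>3} games in the final set -> next-round win prob ~ {p:.2f}")
```

Under any parameters of this shape, both players playing on is individually dominant while jointly lowering the winner’s tournament prospects, which is the structure of the dilemma described above.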
Would there have been a way for them to have reached an optimal solution earlier on? Obviously sportsmanship precludes any kind of (spoken) match fixing, but mightn’t there be a rule change allowing a set to stop before it becomes a death match? For instance, before a match begins, players could agree to opt out of the normal rule and set a limit on the maximum number of games they’ll play in a final, fifth set before the need to win by two games is eliminated. Setting this rule at the beginning of a match allows the players to circumvent the prisoners’ dilemma; whoever wins the match is more likely to win the tournament than in the absence of such a rule.
Efficiency concerns are seldom used in sports; we prefer the epic struggles and Pyrrhic battles. But that doesn’t obviate a little sideline thought experimentation.
I’m at a conference in Philadelphia today with about 100 people in an auditorium. Around two in the afternoon, someone’s watch made a “beep beep” sound, and it took me a minute to realize that this was a sound marking the hour and one that I hadn’t heard in years.
Do you remember in the 1980s and 90s when a chorus of digital wristwatches emitted perfunctory peals every hour on the hour? I realized today that this seems to have completely disappeared. Why is this? A few hypotheses, ranging from the blindingly obvious to the more subtle (and therefore less likely correct):
- People are less likely to wear watches. This is the most obvious theory. Some estimates show that watch sales have fallen off over the last few years, but not by the order of magnitude that would be required for the virtual elimination of the hourly watch chime. Even if 50 percent fewer watches were sold this decade, and stipulating for the moment that watches are not durable goods, that still doesn’t explain it. Anecdotal evidence suggests that cell phones and iPods have rendered the wristwatch obsolete (at least as a method for telling time), but they haven’t gone the way of the buggy whip yet.
- Preferences have changed towards analog wristwatches. This makes some sense; since we have the time in our pocket (plus calculators, contacts, appointments, memos, and all the other snazzy things our watches used to do before PDAs and cell phones), watches perform only two functions: telling the time and signaling status.
- People no longer want to be told when the hour strikes. What explains this change in preferences, however? Why would this have changed?
- People never wanted hourly chimes to begin with but watches came with them turned on by default. Call this the Sunstein and Thaler theory.
- People still want hourly chimes but don’t want to wear watches to get them. This makes little sense since presumably cell phones could be made to chime hourly, or developers would create an app. (Oh wait, they did.)
- My sample has changed. I’m in a professional environment now rather than school and college. Since I graduated from college about the time that cell phones became ubiquitous, I have a difficult time disaggregating a number of social trends from this other revolution.
Granted, this is a completely pedestrian observation. But it is remarkable that, at least from my perspective, something as ubiquitous as the hourly watch chime seems to have disappeared overnight, and without much fanfare.
I’m back from Disney and here is my verdict: it’s incredibly ordinary. I’m afraid I have no grand insights to offer, but I’ll take a stab at a few observations.
My last post inspired Jackson Kuhl to riff on how an ideal of cultural authenticity is generally unhelpful, and concluded: “I think perhaps Jerry didn’t want to go to Disney because, as a 30-something dude without kids, riding the Dumbo carousel doesn’t get his heart pumping.” I think that’s absolutely right. Disney is first and foremost for children, and it was for the benefit of my wife’s nephew that we went. It was only through his enjoyment that I could appreciate the place.
Now, two things that struck me. First, vacationing at Disney is like vacationing at a cross between a mall and a sports stadium. The entire experience is engineered to get you to buy stuff. At the stores, at the kiosks, at the food court. The vast majority of the stuff is the kind of completely useless garbage that in a previous life I founded Unclutterer to combat. The twist is that there is no competition inside Disney’s walls, so you pay incredibly inflated prices. The company, however, has mastered the art of making folks thankful for the privilege. I am seriously considering purchasing their stock.
The second thing that struck me is that Disney is one of the most massive experiments in privatization we have today. Walt Disney wanted to build more than an amusement park. The immersive experience he had in mind was not just for visitors, but for residents as well. The Magic Kingdom was to be just a small part of the Experimental Prototype Community of Tomorrow. According to Wikipedia:
Walt Disney’s original vision of EPCOT was for a model community, home to twenty thousand residents, which would be a test bed for city planning and organization. The community was to have been built in the shape of a circle, with businesses and commercial areas at its center, community buildings and schools and recreational complexes around it, and residential neighborhoods along the perimeter. Transportation would have been provided by monorails and PeopleMovers (like the one in the Magic Kingdom’s Tomorrowland). Automobile traffic would be kept underground, leaving pedestrians safe above-ground. Walt Disney said, “It will be a planned, controlled community, a showcase for American industry and research, schools, cultural and educational opportunities. In EPCOT, there will be no slum areas because we won’t let them develop. There will be no landowners and therefore no voting control. People will rent houses instead of buying them, and at modest rentals. There will be no retirees; everyone must be employed.”
Here is a film of Disney presenting the concept city. In one sense it’s a libertarian dream. A completely privatized city. In a law review article on the subject, Prof. Chad Emerson explains how it was made possible by the Florida legislature creating what amounts to a giant business improvement district the size of Manhattan. It ceded to the Disney Company traditionally governmental functions such as zoning, streets, drainage and even police and fire service. For example, in the elevators of the Disney hotel at which I stayed last week, the usual inspection certificates were posted. The issuing authority was the Reedy Creek Improvement District, which is wholly controlled by Disney. In essence, the company is certifying its own elevators. In theory, the district (read Disney) also has the power to set up its own municipal court, and it even has explicit authority to develop a nuclear power plant.
In another sense, though, it’s a libertarian nightmare. Planned by experts from top to bottom with a benevolent Uncle Walt at the head. As I’ve mentioned, there also doesn’t seem to be much room for competition inside the city walls. If Walt had had his way, alcohol would have been strictly controlled. And what exactly would have happened to the old people who wanted to retire? I guess it’s all OK, though, if you know what you’re signing up for and are free to leave at any time.
In the end, Disney died before even the Magic Kingdom opened, and the plan for greater EPCOT was reduced to the EPCOT Center park we know today. The top-down, controlled nature of Disney is still very present there, however, and I think that’s what gives me the willies about the place. There’s nothing nefarious about it; it’s simply like Walt’s vision for the dome that would have encapsulated EPCOT: climate-controlled to a perfect 72° at all times with no chance of weather. Even Las Vegas–Disney World for adults–as “synthetic” as it is, has an element of unpredictability to it.
The New York Times, always fresh to break a scoop, reports on the BPGlobalPR Twitter feed, which for the last several weeks has been offering up scathingly hilarious takes on what is quickly taking the mantle of America’s largest-ever environmental disaster:
The parody site is updated throughout the day, offering a combination of “everything is going exactly according to plan” P.R. speak, macabre humor and occasional glimpses of genuine outrage. Over the last week, BPGlobalPR boasted of a deal on “blackened shrimp” at BP gas stations, linked to the photographs of oil-soaked pelicans with the out-of-character postscript “warning: truly heartbreaking” and spoke of how “we’ve modestly made modest changes to this modest gulf.” Beyond its followers, BPGlobalPR benefits from retweeting, becoming grist for other Twitter feeds. On Saturday, this cynical packet — “Safety is our primary concern. Well, profits, then safety. Oh, no — profits, image, then safety, but still — it’s right up there” — was bounding its way across the Internet.
But, the Grey Lady warns you, just because something is on The Twitters doesn’t make it legitimate:
Knowing who’s who on Twitter has been a challenge since the beginning: the basketball great Shaquille O’Neal created his own Twitter feed, with the insistent handle The_Real_Shaq, after someone was pretending to be him. The impersonations had become so problematic that Twitter created “verified accounts” last year assuring followers that the person controlling the account was the real deal.
Far be it from me to cast aspersions on people who use Twitter for comedic ends. Having received an order to cease and desist from a foreign government for allegedly impersonating one of their ministers on Twitter, I am no citadel of righteousness when it comes to tweets.
But the Times buried the lede here. It’s well-known that on Twitter, as elsewhere on the internet, satire (in its better forms) and fraud (in its black hat variety) run rampant. Only in the final paragraphs does the article get to the transformative aspects of this:
While satire has always been with us, certainly longer than public relations executives have been, the Internet is democratizing the process, said Miriam Meckel, a professor of communications in Switzerland who is a fellow at the Berkman Center for the Internet and Society at Harvard studying the impact of Twitter and social media services on journalism.
And that is the real story here. Bursting the bubble of a pompous company is nothing new; being able to do it and have 11 times as many followers (that is, market share) as the object of your derision is what’s new. Blogs, social media, Twitter, et cetera provide myriad ways for normal folks to, if not comfort the afflicted, at least afflict the comfortable. And there are few better ways to hold power — whether in the form of political leaders, firms, or self-appointed social saviors — to account. No longer can a powerful, politically connected company like BP attempt to spin and manage its way out of wrecking hundreds of miles of coastline. This is changing brand management in a way we don’t, I think, fully understand.
It’s not that the facts are getting out. It’s that the Zeitgeist is being established independent of any entity with which BP can directly plead, cajole, or threaten. We are crowdsourcing the establishment of the snarky, ironic conventional wisdom. And in many ways, this is a much more powerful thing than the rise of mere fact-reporting bloggers.
It’s not just about reporting, which is how Web 2.0 (for lack of a better term) has largely been discussed. This isn’t the democratization of information. It’s the democratization of the takedown, the skewering, the needling. This is not the news media being disintermediated — it’s the professional satirists in the vein of Mencken and Rogers and Jon Stewart being replaced by amateurs, and lots of them. It makes it harder for any big entity or brand to remain hallowed and righteous for very long.
On a more prosaic level, we saw this as well with Helen Thomas over the last week. After declaring her wish for the Levant to be Judenfrei, she tried to back out and apologize. And in an earlier era, she might have been able to control the news cycle long enough for it to be buried. The facts here were never in dispute; she was caught on a Flip camera, so chalk that up as a victory for Web 2.0 as we understood it five years ago. But over the weekend she was so badly skewered by thousands of satirists (sample Twitter #helenthomasmovies titles: “10 Things I Hate About Jews,” “Goys Don’t Cry”) that today she was forced to resign from, well, whatever it was that she did.
The BP oil spill is the first major national event where the bad guy in question is subject to lampooning not just from a satirical elite but by anyone with the material and the gumption to set up a Twitter account, or hell, create a funny hashtag. Democratizing the news was a step forward. Democratizing our skepticism towards all forms of power is an even greater step.
Starting tonight, I’ll be in Disney World for a week. My father-in-law is turning 60 and he’s taking the whole family (not least his three-year-old grandson) to Florida. I’m very much looking forward to it, especially seeing my wife’s great family.
That said I have to admit that Disney World would not be my first choice of vacation destination. The reason, I tell myself, is that I don’t care for artificial experiences. At Disney World, “cast members” are never allowed to frown, for example. The smell of fresh-baked cookies is pumped into the air around “Main Street.” In fact, the very idea of a long lost American main street is fake.
But then I think, isn’t immersing ourselves in fantasy exactly what we do when we go to the theatre or read a book? Disney World is just intensely more immersive, that’s all. Why not just enjoy the ride? That’s true, and that’s what I intend to do. And I’ll make sure to report back here on what I learn.
Let me briefly tackle another objection to Disney World. One can conceive of two types of travel–for escapism and for enlightenment. To me, these don’t have to be mutually exclusive; they are probably on a spectrum. An all-inclusive beach resort is way toward the escapist end, but you can learn something if you try, and while tooling around souks in Turkey may be enlightening, there’s certainly an element of escapism or it wouldn’t be a vacation. So while I don’t think there’s anything wrong with escapism (quite the opposite), I’m thinking I’ll be able to learn a lot about America at this most escapist of destinations.
Over at my humble podcast, I interview “Is Google Making Us Stupid” author Nick Carr about his new book, The Shallows, and what the internet is doing to our brains. Nothing good, he argues.
Carr’s publicist deserves a gold medal because the NYT today is running a series of articles on the “trend” that Americans are coming to the conclusion that gadgets and always-on connectivity are turning their brains to mush (one, two, and three). What’s more, on its Bits blog, the NYT is asking for volunteers to unplug from the internet and then report on their experience. And Carr had op-eds in the WSJ on Saturday and the WaPo yesterday.
So this is all to say, listen to my podcast. But also to ask, do you feel more distracted, unfocused and forgetful since the rise of the internet? For some of us “before the internet” is a meaningless distinction. Do you find it hard to concentrate on deep reading? Do you read as much as you used to?
From Scientific American, another disability turned on its head:
By examining photographs of artists, Livingstone and her fellow researchers found that Andrew Wyeth, Edward Hopper, Marc Chagall, Jasper Johns, Frank Lloyd Wright, Robert Rauschenberg, Alexander Calder and others all had misaligned eyes. (And by studying the self-portraits and etchings of Rembrandt, she found he also seems to have had a strong lazy eye.) Why this pattern? She proposed that people who have less detailed three-dimensional vision of the world might have an easier time translating what they see onto the two-dimensional page—whether it was for a painting of a dinner scene, sketch for a mobile or plan for a building.
I usually find sports metaphors for non-sporting activities to be strained at best, trite at the mean, and misleading at the worst. Especially when discussing trade and economics, which are non-zero sum games, sports metaphors (where one team inevitably wins and one loses) are lacking.
But Steve Horwitz hits some dingers over at Coordination Problem with some observations about the lessons from last night’s terrible call that cost Armando Galarraga a perfect game (what would have been the third of the season) and what they can tell us about public relations and the rule of law. Steve’s a Michigander, so this has got to be especially painful for him since no Tiger has ever pitched a perfect game.
The reaction by almost everyone after the game was really classy. The Tigers bitched a bit, but didn’t go off the rails. The umpire admitted he just flat out blew the call. No excuses, just “I screwed up and cost the kid a perfect game.” He also apologized personally to Galarraga. This is an outstanding lesson in how to handle a huge public mistake: don’t try to cover your tracks, just admit you screwed up and apologize to those affected. As frustrated as I am by the mistake, I totally admire Joyce for the way he handled it, and Galarraga too, who gracefully accepted the apology.
A great lesson for PR and for life generally. Unfortunately, firms and individuals almost always do the opposite: run, dissemble, cast aspersions, cast blame, and hide. That’s why “getting out in front of it” is so important; it’s the right thing to do and it looks best in the long run. Steve continues:
If I were Bud Selig, the commissioner, here’s what I would do to try to serve some rough justice. He can’t overturn an umpire’s call on the field, as that sets an awful precedent. But what he could do, I believe, is have the official scorer for the game change the play from a hit to an error. That would still deny Galarraga the perfect game but at least give him a nearly-as-cool no-hitter, and it would do so without overriding the ump’s judgment call. Just because the batter was ruled safe doesn’t automatically mean it was a hit!
If I were Bud Selig, I’d drown myself in a bucket of my own spittle for two decades of unrelenting debasement of our national game, but that’s neither here nor there. Steve’s absolutely right on this: the rule of law and precedent are more important than correcting a one-time injustice. You can’t have baseball commissioners overriding calls on the field willy-nilly; baseball is a sport of rules and not of men (unlike some other sports). Steve’s remedy seems like the best available one.
Clay Shirky describes his media diet to the Atlantic. The whole thing’s a good read (and at the end there are links to media habits of other interesting people), but here’s the part that caught my eye:
In general, there’s no real breaking news that matters to me. I don’t have any alerts or notifications on any piece of software I use. My phone is on silent ring, nothing alerts me when I get a Tweet and my e-mail doesn’t tell me when messages arrive.
I also don’t read any of the big tech aggregators. Knowing that, for instance, Google just bought Blogger, isn’t that useful for me to hear today rather than tomorrow. Some of Michael Arrington’s stuff I think is an example of the worst kind of breaking news. The kind of Apple Insider stuff where they publish something every day to satisfy the news cycle. It’s gossip coverage like following movie stars and it distracts me from thinking longer form thoughts. …
For decades, I religiously read the op-ed pages of the New York Times but recently I’ve stopped because every op-ed is so closely tied to a newspeg that the thinking never gets very far from current events. So I’ve recently gotten away from the daily news cycle. I’ve got a weekly clock cycle and a monthly clock cycle. Time is a precious commodity. Increasingly, I’m trying to maximize it.
Several things strike me about this. First, I’m happy to find a kindred soul who doesn’t read news. People are surprised when I tell them that I don’t read newspapers and simply get my “news” from the ether. It’s a great way to make conversation: “So what happened with some baseball umpire yesterday?” Related to this is what I perceive as the increasing futility of the op-ed, or even blogging about current events, especially the latest policy turn in the tech or telecom sectors that I follow. It’s the same script, over and over, same arguments, slightly different sets of facts.
Finally, it seems like Shirky is accepting Nicholas Carr’s argument that the internet is distracting us and changing the way we think to the point where we can’t think deep thoughts any longer. At the same time, he’s offering a solution: turn it off. You don’t have to check it every five minutes. Unfortunately for most people, that’s easier said than done and requires lots of discipline. But being aware of the issue is the first step toward addressing it.
Aaron has a post on hipsters that cites Dan’s essay on irony, the point of which is that the line between sincerity and irony has disappeared. Dan, probably ironically, said, “I no longer distinguish between that which I do sincerely and that which I do ironically.” This got me to thinking, and I’d like to propose a litmus test for irony.
Irony requires an audience. If you listen to Taylor Swift when you’re by yourself, with no one watching you, you’re not doing it ironically. You’re doing it sincerely. This test does not work on the reverse case. If you’re dressed like this in front of your friends, you might be doing it sincerely. We can’t tell.
I think this is what bothers Aaron about hipsters, that everything becomes performance art. While most irony is about humor, the irony of hipsters is all about signaling. When Dan says we don’t know if we’re doing something ironically, I think he’s hinting at how hipsters, especially in their extreme incarnations, seem to have forgotten what it is they are signaling. It’s no longer a smart quip that signals their awareness of the absurdity of modern culture, it’s just signaling about signaling.
Whatever the case, you can’t signal to yourself, and that’s the bright line for irony.
Interesting report at the Chronicle about how Google Books is beginning to allow literature scholars to use data-mining techniques on novels:
New insights can be gleaned by shining a spotlight into the “cellars of culture” beneath the small portion of works that are typically studied, [Franco Moretti, a Stanford professor of English and comparative literature] believes. He has pointed out that the 19th-century British heyday of Dickens and Austen, for example, saw the publication of perhaps 20,000 or 30,000 novels—the huge majority of which are never studied. The problem with this “great unread” is that no human can sift through it all. “It just puts out of work most of the tools that we have developed in, what, 150 years of literary theory and criticism,” Mr. Moretti says. “We have to replace them with something else.” Something else, to him, means methods from linguistics and statistical analysis. His Stanford team takes the Hardys and the Austens, the Thackerays and the Trollopes, and tosses their masterpieces into a database that contains hundreds of lesser novels. Then they cast giant digital nets into that megapot of words, trawling around like intelligence agents hunting for patterns in the chatter of terrorists.
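To make the “giant digital nets” concrete, here is a minimal sketch of the simplest version of this kind of analysis: comparing word frequencies across two corpora to surface distinctive vocabulary. The two sample strings are invented stand-ins for canonical and “great unread” texts, and nothing here reflects the Stanford team’s actual pipeline.

```python
# A toy version of frequency-based corpus comparison. The sample "corpora"
# below are invented snippets, not actual novels from any database.
from collections import Counter
import re

canon = ("it is a truth universally acknowledged that a single man in "
         "possession of a good fortune must be in want of a wife")
unread = ("the moor was dark and the wind howled over the dark heath "
          "as the rider pressed on through the dark night")

def freqs(text):
    """Relative frequency of each word in a text."""
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words)
    return {w: c / total for w, c in Counter(words).items()}

f1, f2 = freqs(canon), freqs(unread)

# Words whose frequencies diverge most between the corpora hint at
# stylistic or thematic patterns a human reader might never tabulate.
distinctive = sorted(set(f1) | set(f2),
                     key=lambda w: abs(f1.get(w, 0) - f2.get(w, 0)),
                     reverse=True)
print(distinctive[:5])
```

Real literary data-mining scales this idea up to thousands of novels and far richer features (n-grams, syntax, topic models), but the basic move, counting instead of close reading, is the same.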
Unsurprisingly, this has sparked a methodological debate in the field:
Novels are deeply specific, [Katie Trumpener, a professor of comparative literature and English at Yale University] argues, and the field has traditionally valued brilliant interpreters who create complex arguments about how that specificity works. When you treat novels as statistics, she says, the results can be misleading, because the reality of what you might include as a novel or what constitutes a genre is more slippery than a crude numerical picture can portray.
Worth a read.
Christopher Hitchens’ long-awaited memoir Hitch-22 comes out tomorrow. (Fortunately, the publisher, Twelve, has seen fit to release it same-day on Kindle.) Here’s a partial list (below the fold) of what the reviewers are saying about it. I have it on pre-order; I only wish it had come out in time for the long Memorial Day reading weekend.