Tuesday, April 8, 2014

A Hothouse Manifesto: Does Stephen Ramsay Sell Literary Criticism Short?

I think so.

I first learned of Ramsay’s work from a Stanley Fish blog post back in 2012: “Mind Your P’s and B’s: The Digital Humanities and Interpretation.” Fish bills Ramsay as “perhaps the most sophisticated theorist of the burgeoning field” (digital humanities) and goes on to cite his essay: “Toward an Algorithmic Criticism,” Literary and Linguistic Computing, Vol. 18, No. 2, 2003, pp. 167–174. I downloaded the essay and found it uncongenial.

It’s not just that I disagree with him on various particulars. It’s that he seems to have arrived at his beliefs too easily, that things are too settled in his mind. That’s a subtler business to argue but also, I think, more important.

I propose, first, to present the abstract of his essay without comment and then to repeat the abstract, this time interspersing it with some comments.

Here’s Ramsay’s abstract (p. 167):
The inability of computing humanists to break into the mainstream of literary critical scholarship may be attributed to the prevalence of scientific methodologies and metaphors in humanities computing research – methodologies and metaphors that are wholly foreign not only to the language of literary criticism, but to its entire purpose. Breaking out of this unfortunate misalignment entails reaching for more appropriate paradigms. The 'algorithmic criticism' here proposed rejects the empiricist vision of the computer as a means by which critical interpretations may be verified, and instead seeks to locate computational processes within the rich tradition of interpretive endeavors (usually aligned more with art than criticism) which seek not to constrain meaning, but to guarantee its multiplicity. Computational processes, which are perhaps more conformable to this latter purpose, may be usefully viewed as providing the necessary conditions for interpretive insight. Algorithmic criticism seeks, therefore, in the narrowing forces of constraint embodied and instantiated in the strictures of algorithmic processing, an analogue to the liberating potentialities of art and the ludic values of humanistic inquiry. It proposes that we reconceive computer-assisted text analysis as an activity best employed not in the service of a heightened critical objectivity, but as one that embraces the possibilities of that deepened subjectivity upon which critical insight depends.
Let me preface my commentary by, of all things, a sentence from the publisher’s blurb of a recent book. The book: Rens Bod, A New History of the Humanities: The Search for Principles and Patterns from Antiquity to the Present, Oxford University Press 2013. Here’s the sentence:
Rens Bod contends that the hallowed opposition between the sciences (mathematical, experimental, dominated by universal laws) and the humanities (allegedly concerned with unique events and hermeneutic methods) is a mistake born of a myopic failure to appreciate the pattern-seeking that lies at the heart of this inquiry.
It seems to me that Ramsay takes that hallowed opposition at face value, as though he’s thinking in a world where things could not be otherwise, where one could not even imagine their being otherwise. If I think my way back to my early-career brush with the skeptical turn, I wonder whether Ramsay has his doubts and so is taking his strong position as a way of enforcing and redoubling that hallowed opposition.

Here’s Ramsay’s opening sentence:
The inability of computing humanists to break into the mainstream of literary critical scholarship may be attributed to the prevalence of scientific methodologies and metaphors in humanities computing research – methodologies and metaphors that are wholly foreign not only to the language of literary criticism, but to its entire purpose.
“Wholly foreign” – wholly? Completely and utterly? Beyond a shadow of a doubt? It’s one thing to assert a difference, but quite something else to insist that it is so stark. I wonder what I. A. Richards or Northrop Frye or the philologists would say to this. Ramsay continues:
Breaking out of this unfortunate misalignment entails reaching for more appropriate paradigms. The 'algorithmic criticism' here proposed rejects the empiricist vision of the computer as a means by which critical interpretations may be verified, and instead seeks to locate computational processes within the rich tradition of interpretive endeavors (usually aligned more with art than criticism) which seek not to constrain meaning, but to guarantee its multiplicity.
In a way I agree with his skepticism about using the computer as a means of verifying interpretations, but that’s because I see the business of explaining how texts work (in the mind, in society – naturalist criticism) as somewhat different from the business of finding hidden meanings (ethical criticism).

It’s this business of guaranteeing the multiplicity of meanings that I find most problematic. Frankly, that seems a bit like make-work for the literary professoriate, as though literary culture depends on an unending stream of critical texts exulting in the multiple meanings of the primary texts. It makes me wonder how literary culture managed to survive all these centuries until, after World War II, the academy at long last decided to move the production of interpretations into the center of its professional work.

I’m also reminded of a passage early in Geoffrey H. Hartman, The Fate of Reading and Other Essays (University of Chicago Press, 1975, p. 3):
Confession. I have a superiority complex vis-à-vis other critics, and an inferiority complex vis-à-vis art. The interpreter, molded on me, is an overgoer with pen-envy strong enough to compel him into the foolishness of print. His self-disgust is merely that of the artist, intensified. "Joe, throw my book away." Sometimes his discontent with the "secondary" act of writing—with living in the reflective or imitative sphere—makes him privilege some primary act at the expense of art or commentary on art. He turns into Mystic or Vitalist.
Is Ramsay suffering from the same anxiety? Is that why he likes these endeavors that are “usually aligned more with art than criticism”? He goes on:
Computational processes, which are perhaps more conformable to this latter purpose, may be usefully viewed as providing the necessary conditions for interpretive insight. Algorithmic criticism seeks, therefore, in the narrowing forces of constraint embodied and instantiated in the strictures of algorithmic processing, an analogue to the liberating potentialities of art and the ludic values of humanistic inquiry.
Ah, we critics, too, are interpreters! Long live art! But who is it that’s going to read this multiplicity of meanings that flows from the algorithmic yarrow stalks and marrow bones of Ramsay’s ludic machine?
It [algorithmic criticism] proposes that we reconceive computer-assisted text analysis as an activity best employed not in the service of a heightened critical objectivity, but as one that embraces the possibilities of that deepened subjectivity upon which critical insight depends.
Curious, isn’t it, that Ramsay turns to the computer to deepen (rather than heighten?) his subjectivity? What would Commander Data think of that?

Addendum (9 April 2014): Willard McCarty asks a question that puts the foregoing in perspective: “What can the digital humanities do for the humanities as a whole that helps these disciplines improve the well-being of us all?”* It’s obvious how Ramsay’s program serves the well-being of critics, but what does it do for everyone else? Who else will know and care about all those algorithmically prompted interpretations?

* Willard McCarty, “A Telescope for the Mind?” In Debates in the Digital Humanities, ed. Matthew K. Gold. Minneapolis, MN: University of Minnesota Press, 2012.
