Thursday, June 30, 2005
Why? What’s wrong with “close reading” (“explication de texte,” if you want to be snooty about it)? After all, it has been the mainstay of literary analysis for, oh, sixty-odd years now and has opened up the study of literature, allowing for marvelous new directions of inquiry.
Unfortunately, though, by the 1970s, it had become too much the rule. Even today, it is often called “the most important skill” for the study of literature and “the building block for larger analysis” (just to pick the first two things I found on a quick Google search). Many of us teaching now, in fact, were never presented with anything but “close reading” in our classes—certainly, we were never encouraged to move on to “larger analysis.” Sure, we still had a few grumpy historicists about us, trying to keep the clock from passing WWII, but we ignored them, for the most part.
Because of its predominance, by 1980 “close reading” had come to determine the outcome of analysis and to limit the scope of literary scholarship. If you limit the range of tools you are working with, you necessarily affect the outcome of your project, limiting it just as much. That frustrated many of us, leading us to follow the examples of people like Stephen Greenblatt into realms like New Historicism, where “close reading” has a more limited impact—becoming simply one of a number of different tools that can be used.
Thing is, in reaction to the dominance of one tool, we can easily forget that it still does have value. Let me give an example from a hobby of mine, photography:
When I first became seriously interested in taking pictures and developing them (in the 1980s), photography had already become quite sophisticated and automated. There were all sorts of ways of preparing for what would be the final print (and now, of course, there are many, many more). I had friends, however, who rejected many of these tools, believing that they reduced the “purity” of the result. Some would even file out the edges of their negative carriers so that a black border would surround their prints, showing that their framing was all in the camera—that no cropping had occurred in the darkroom.
Because of this, they limited themselves in more ways than, perhaps, they understood. They couldn’t really use rangefinder cameras, for example (it’s difficult to judge what the exact framing would be with a rangefinder, not to mention questions of parallax), and thus couldn’t take advantage of the lighter weight and smaller size of the rangefinder.
As a rangefinder aficionado (and as someone in love with darkroom work—where I did most of my framing through cropping), I thought they were silly. It didn’t matter where the composition occurred, in my view, only that it was reflected in the final print.
Still, that did not mean that my friends were wrong. Eventually, I learned what really was behind their decision: it forced them to work with the camera and the image itself, not simply with the image of the image that I was working with in the darkroom. In other words, their decision to limit themselves forced them to spend more time considering their subject at the time of the shoot—not a bad thing.
As time went on, I started to limit myself, too—though in other ways. I stopped using the light meters in the few cameras of mine that had them, turning instead to a hand-held meter that gave more flexibility and reduced my reliance on averaged readings. I also stopped focusing through the camera, relying instead on knowledge of the focal plane available at each different aperture. This allowed me to place my central subject in different areas of the focal plane, not always keeping it at the center.
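The zone-focusing habit described above rests on simple optics: the depth of field at any aperture can be computed from the focal length and subject distance. A minimal sketch in Python, using the standard thin-lens approximations (the 0.03 mm circle of confusion is the conventional assumption for 35 mm film, not anything from this post):

```python
def hyperfocal(f_mm, f_number, coc_mm=0.03):
    """Hyperfocal distance in mm for a lens of focal length f_mm."""
    return f_mm ** 2 / (f_number * coc_mm) + f_mm

def depth_of_field(f_mm, f_number, subject_mm, coc_mm=0.03):
    """Near and far limits of acceptable sharpness, in mm."""
    h = hyperfocal(f_mm, f_number, coc_mm)
    near = h * subject_mm / (h + (subject_mm - f_mm))
    # Beyond the hyperfocal distance, the far limit extends to infinity.
    far = (h * subject_mm / (h - (subject_mm - f_mm))
           if subject_mm - f_mm < h else float("inf"))
    return near, far

# A 50 mm lens at f/8, subject at 3 m: the zone of sharpness runs
# from roughly 2.3 m to 4.2 m, so the subject can sit anywhere in it.
near, far = depth_of_field(50, 8, 3000)
```

Knowing these limits is what lets a photographer prefocus by scale and place the subject off-center within the zone, rather than focusing through the camera for every frame.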
I did these things for my own pleasure, and so that I could learn more about the camera, myself, and even my subjects. And learn a lot I did.
But these things did not make me a better photographer. Still, I had to remind myself not to look down my nose at someone who hadn’t been so totally influenced by Ansel Adams as I. I had to remind myself to look at the result, not the method. If I started to say that the only good photographs possible were ones created the way I created mine, I would be limiting the field through tools, not through the aesthetics inherent in the outcome.
If photographers in general had been forced to work in as constricted a fashion as I do, photography would be much the worse. That doesn’t mean that what I like is bad, but that it is limiting. I can accept that limit (and I do), but I know I am not as likely to produce a great photograph as someone willing to learn a wider set of tools and work with them.
That, though, doesn’t mean that it is not worthwhile for a photographer to learn about depth of field through manipulation of the focal plane, or to come to an understanding of parallax through use of a rangefinder. It simply means that great photographs can be taken without this knowledge or skill.
The same can be said of “close reading.” It’s but a tool, and one that can be used for learning a great deal about literature. But it is not a necessary tool.
The last two decades have seen a break from dependence on “close reading,” but it has not been an easy break to make. Too many did (and many still do) consider it a necessary tool, and desired to keep limitations such as “the intentional fallacy” in place.
That I don’t find “close reading” necessary does not mean that I do not respect it as a tool—nor does it mean I don’t use it. Yesterday, I taught Sylvia Plath’s “Daddy” (a poem almost all of us use at one time or another). We did a “close reading” of it, for it is a poem quite open to “close reading.” But the poem can’t be understood, any longer, simply through its language (even in the broadest sense), for it is a poem that uses cultural actualities that have changed a great deal since it was written.
If we did not look at 1940, 1962, and 2005, our “close reading” of the poem would have left us with a paltry sense of the poem.
That’s not the fault of “close reading,” though, any more than it is the fault of my rangefinder that my picture does not capture all that it could.
Friday, June 24, 2005
That, however, doesn’t mean that there aren’t things needing reform in academia. One of these is the entire tenure system, which is casting a pall over our universities and allowing for attacks that may be at least partially justified.
When I have brought up this topic in the past, I’ve been attacked by tenured academics as if I were reacting simply from sour grapes. I’ve been told that I’m angry because I just have not been good enough to achieve tenure. So that won’t happen again, let me give a bit of information on my background: for fifteen years after earning my Ph.D. in 1988, I did not try for a full-time career in higher education. I had other things in mind: Peace Corps in West Africa and opening a store, among others. Only in the last two years have I actively considered higher education as a career (and am now teaching full time for my first time ever in the United States). By no standard now in place could I have even been considered for tenure so far—let alone be turned down for it. I have not been in the field long enough (and I am happy with my progress, to date). I have had a book published by a reputable press (and have a contract for another), have seen a number of articles reach print, have made academic presentations, watched my dissertation appear as a book in translation in Europe, and have fine recommendations from students and peers. I also was awarded (and accepted) a Senior Fulbright Lectureship. Certainly, I am not writing as a disgruntled outsider, but simply as someone turning to the profession later in life. So, please, don’t dismiss my arguments on an ad hominem basis. What I am saying is important, not who is speaking.
Most simply stated, tenure is “the right not to be fired without cause after an initial probationary period.” Usually, the probationary period is five or six years. This allows the institution to be sure that the professor is committed to it and to evaluate her/his performance over a long term. In the ideal, it means the professor has proven his/her worth and can be granted the right to take intellectual and professional chances without fear of retribution.
Sounds good? Sure, and it is—in the ideal. But watch out for unintended consequences!
The first of these is a wall that’s been built between those with tenure and those without. Even if everyone is on a tenure track (which is not the case any longer—but more on that later), the distinction between those who have made the grade and those who have not is ever-present. As a result of this, the pressure to conform on those wanting to cross the wall is great: no one wants to put in all those years of effort only to be rejected. So, while those with tenure may have more academic freedom, those without, paradoxically, actually have less than they might if tenure did not exist at all. Often, they are investing too much in the possibility of tenure to be willing to rock the boat.
So, wanting to ensure that their efforts are rewarded, many on the tenure track (consciously or unconsciously) adopt the views of those with tenure—those who will be sitting in judgment on them. This can (and does) lead to the perception that there are “party lines” that must be followed if one is to gain tenure. As a result, though there are other reasons why academia tends to be more liberal than much of American society, people are able to point to the tenure system as a means for keeping out those with more conservative viewpoints.
In the 1960s, in the wake of the Free Speech movement at Berkeley (and elsewhere), American universities did become high-profile magnets for the left—and certainly were focal points of the antiwar movement that followed. As a result, public perception of the universities (if not the fact) was that they were dominated by the left. The identity politics (and the perception of “political correctness”) of the seventies only ratified this belief. Today, many Americans see the universities as places where well-paid and comfortable leftists not only keep out those who might disagree with them but use their sinecures as platforms for brainwashing generation after generation of college students.
This perception (again, it is not the reality) has played into the hands of those who want to see the universities (at least the state run ones) come more under the control of the political system to make them reflect more accurately the mind-sets of those in power.
Sure, the perception is wrong, but our universities certainly do grow out of a humanist tradition that is increasingly antithetical to the political right in this country and that allows universities to be pegged as having an overt leftist agenda.
To make matters worse, because of expansion of the universities in the sixties and seventies, we now have both a top-heavy system of too many tenured professors (at many institutions) and many more qualified applicants for available positions than will ever be absorbed. Not only does this increase the pressure to conform, but it leads to resentments on the part of those not chosen—exactly the sort of resentments I have been accused of harboring. Even in a highly competitive job market, it’s easier to say that one didn’t get the job “because they don’t like my politics” than it is to accept that another applicant might be better suited.
Another unintended consequence of the tenure system has been the growth of means of bypassing it. With higher and higher percentages of tenured faculty (and growing costs, as a result), the universities moved to create cheaper, non-tenure tracks for staffing classrooms. These have generally been of two sorts: first, the limited-term appointment and, second, the use of adjunct (part-time) teachers (either graduate students, those still trying to find full-time jobs, those without terminal degrees, those who teach part-time at a number of schools, or those who have turned to another career but would like to keep their hands in). This has led to what is, effectively, a four-tier system in a great number of universities: tenure, tenure-track, temporary, and adjunct. One’s value and ability is perceived as increasing at each upward step—which is both unfortunate and unfair (some of the best teachers I know are adjuncts). One’s income for the same work, certainly, is quite different at each level (though less so between tenure-track and temporary).
It has become harder than ever before even to get onto the tenure track, and extremely difficult to get tenure. This has led to another misperception (but also with more than a grain of truth), that tenured professors think “well, I got mine” and start to coast. They’ve made it, after all, to the top—so why bother to do more than they have to? The people in the job market today sometimes find themselves interviewed by tenured faculty with fewer scholarly publications and credentials than they (the applicants) have… by people who, in today’s job market, would not even be asked to interview for the job. This is another source of resentment that the right, in its drive to gain control of the universities, can tap into. The pressure on young academics is so great today that they are publishing younger and more frequently than did the older generation—they have to, or must leave academia. The older, who came up without the same sort of pressure, can sometimes look like “dead wood” by comparison (more likely, they saw their careers as focused on teaching, not research—something young academics cannot do in today’s super-heated “publish or perish” environment). Once more, those who publish but cannot find academic jobs often feel they must find someone to blame—after all, they are more “qualified” than many of those passing judgment on them—so, if they are at all right of center, they can easily make disdain for their politics the culprit for their failure to make it to the “inside.”
Perception is the greater part of “truth.” And perception of the tenure system has helped lead to the calls for academic reform that have found voice in at least sixteen state legislatures. Tenure is (and should remain) an important part of academia, but that does not mean it cannot be modified. To me, its glaring problems are two: the probationary period, and tenure’s use (all too often) as simply a system of job security. Why should young academics have to go through a period of years during which they cannot express themselves freely, for fear of not attaining tenure? Shouldn’t this system, meant to protect academic freedom, protect all academics, not simply those willing to wait it out? It’s understandable that universities are reluctant to extend tenure that way, for then (given the way tenure is administered these days) they would never be able to get rid of anyone. That brings us to the other problem: sure, in principle, even a tenured professor can be fired “for cause.” But how often does this happen?
Tenure should be an expanded and more limited right: every academic should be covered, but only for specific academic (and political) activities. If it were so, many of the misperceptions that allow academia to be so easily attacked could be dispelled.
If it were so, our universities might even be improved.
Thursday, June 23, 2005
We regard the Scriptures as the records of God’s successive revelations to mankind, and particularly of the last and most perfect revelation of his will by Jesus Christ. Whatever doctrines seem to us to be clearly taught in the Scriptures, we receive without reserve or exception. We do not, however, attach equal importance to all the books in this collection. Our religion, we believe, lies chiefly in the New Testament….
We profess not to know a book, which demands a more frequent exercise of reason than the Bible. In addition to the remarks now made on its infinite connexions, we may observe, that its style nowhere affects the precision of science or the accuracy of definition. Its language is singularly glowing, bold, and figurative, demanding more frequent departures from the literal sense, than that of our own age and country, and consequently demanding more continual exercise of judgment.—We find, too, that the different portions of this book, instead of being confined to general truths, refer perpetually to the times when they were written, to states of society, to modes of thinking, to controversies in the church, to feelings and usages which have passed away, and without the knowledge of which we are constantly in danger of extending to all times, and places, what was of temporary and local application.—We find, too, that some of these books are strongly marked by the genius and character of their respective writers, that the Holy Spirit did not so guide the Apostles as to suspend the peculiarities of their minds, and that a knowledge of their feelings, and of the influences under which they were placed, is one of the preparations for understanding their writings. With these views of the Bible, we feel it our bounden duty to exercise our reason upon it perpetually, to compare, to infer, to look beyond the letter to the spirit, to seek in the nature of the subject, and the aim of the writer, his true meaning; and, in general, to make use of what is known, for explaining what is difficult, and for discovering new truths.
Reading something like this in 2005 is extraordinarily saddening, for this is one of those cases where, looking back, we see not how far we have come, but how far we have fallen.
The Christianity of Channing’s “we” is one of openness, exploration, and curiosity—as well as of belief. Their Bible is a living document, one which, though created by humans and showing their limitations, is imbued with the spirit and guidance of God.
Sure, there are plenty of Christians around now who will nod in complete agreement with Channing’s statements (and his words were not without controversy when delivered), but much of Christianity has been hijacked, today, and stolen away into a rigid, unforgiving place where the Bible is used to rule things out, not bring people in.
Too bad. So much beauty lost. So much compassion and understanding left by the wayside….
Monday, June 20, 2005
A great man has said that ignorance lies at both ends of knowledge. Perhaps it would have been truer to state that deep convictions lie at the two ends, with doubt in the middle. In fact, human understanding may be considered as having three distinct states which frequently follow one another.
Man has strong beliefs because he adopts them without looking deeply into them. Doubt arises when he is faced with objections. He often succeeds in resolving these doubts and thereupon he believes once again. This time he no longer seizes truth by accident or in the dark; he sees it face to face and walks straight toward the light.
--trans. Gerald Bevan (London: Penguin, 2003), 218
If de Tocqueville’s description is accurate, then doubt plays an essential role in the development of belief. He goes on a bit later:
It may be guaranteed that most men will halt in one or other of these two states, either believing without knowing why or ignorant of what precisely they ought to believe.
Only a very small number of men will ever be blessed with the attainment of this other kind of deliberate and self-confident conviction born of knowledge and arising from the very heart of agitation and doubt. (218)
Without doubt, we can never move forward, de Tocqueville (writing almost a century and three-quarters ago) seems to be arguing. Doubt, in other words, is an essential component of the American democracy he was describing with such detail and (I believe) accuracy.
And doubt, I sincerely believe, is one of our most important tools in the struggle to get that democracy back on track. Just because so many on the right have never moved forward from that initial state of simplistic belief does not mean we need to pretend to be in the same state. Just because they cannot comprehend the value of doubt does not mean we need to hide it.
A couple of notes: on the next page, de Tocqueville says:
When abstract opinions are in doubt, men end up by hanging on to their instincts and material interests alone which are more obvious, tangible and permanent than opinions. (219)
Though there seems a groundswell of belief in America these days, I suspect it is more accurately what de Tocqueville describes here. Look around: are these so-called Christians really following the teachings of Jesus, or are they merely using Christian writings to justify their protective and knee-jerk material interests?
At the end of the chapter I took the above quotes from, de Tocqueville writes:
Whether democracy or aristocracy is the better form of government constitutes a very difficult question. But, clearly, democracy inconveniences one person while aristocracy oppresses another.
That is a truth which establishes itself and precludes any discussion: you are rich and I am poor. (219)
The people in power now fought for that power because they had been inconvenienced by democracy (remember that clip of Bush in Fahrenheit 9/11, talking about his “base,” the “haves, and have mores”?).
Need I say more?
Wednesday, June 08, 2005
you, like the characters, deal with a situation that can't be fully understood because it cannot be interpreted in only one way.
In other words, Doubt, like most art, raises more questions than it answers. This is such an integral part of great art that it is rare to see art that has survived over the centuries that does not lead to speculation or open itself to a breadth of interpretation.
This is no accident. The strength of our secular humanist or liberal tradition is that it encourages us to face uncertainty not simply as a puzzle to be solved, but as one of the core elements of being. As, in fact, one of the beauties of existence.
In the sciences, it is doubt that has allowed for progress—even for the development of the scientific method, a process for narrowing doubt through reproducibility (for one thing). Recognition of doubt leads to care and precision in the laboratory. One of the greatest mathematical results of the 20th century, Kurt Gödel’s incompleteness theorem, establishes that doubt must remain even within the most rigorously constructed system.
Paradoxically, doubt and the challenging of belief have even been among the greatest pillars of faith. Those who believe, but have the wisdom to doubt, are able to sharpen their faith and deepen it. It is the willingness to doubt that allows faith to mature.
Yet, today, when faced by vigorous and growing fundamentalist movements based on certainties of belief, many of us whose philosophies incorporate the necessity for doubt and the beauty of doubt try to hide this value. It is as though we have become ashamed of it and want to prove that we, too, base our lives on rock-solid “truths.”
It’s as though we have entered a land of people with no teeth. Though we have retained our own, we pretend that we are like the others, that our food must come to us as pap. We like to chew, however, and can’t survive without something to sink our teeth into. The others, watching us pretending to ingest the goo they eat, recognize that we aren’t happy and even guess that we are hiding something. Finally, they decide, if we are hiding something, it must be bad. Teeth, then, become bad; anyone developing them is singled out and the teeth removed.
Wouldn’t it be better to show the value of teeth? To demonstrate how much more varied and healthy our diet is with teeth than theirs without? Wouldn’t it be better, even, to show that teeth can be replaced, that they don’t have to stay in their gums-only state?
The fundamentalist right has had us on the run for some years now, partly by taking advantage of our inclination to refuse absolutes, to consider, to question. They have been hitting us about the head with their certainty. We have been running away with our tails between our legs, stopping only now and again to look at them, to admire their success and wonder how we can imitate it.
Thing is, we can’t imitate it. Over the long haul, our system based on doubt has been much more successful than theirs. So, rather than seeking a certainty to counter theirs (which can only be a second-rate version of them), we should be accenting the system of careful and caring doubt that (for one thing) built this country.
Let me give an example: the right trumpets how the United States is built on religion, and pulls us into a debate on that. We end up arguing on their ground. Instead, we should be showing how the United States is really based on the question of authority that began even amongst the religious communities that first came from England. We need to argue for the success of doubt instead of, as we have been doing, arguing against certainty.
Let’s have some pride, folks, in what we have accomplished. Let’s stop hiding in the face of the onslaught.
Sunday, June 05, 2005
The first meaning of "doubt" given in the Oxford English Dictionary is "The (subjective) state of uncertainty with regard to the truth or reality of anything; undecidedness of belief or opinion." This is no lesser state, no step on the way to faith--it is a dynamic state of challenge, or continual questioning. It can keep us intellectually alive and vigorous.
Yet we hide it, many now saying that we on the left should emphasize our values and elide discussions of our doubts.
To me, however, doubt is a core value not only of the left, but of the intellectual forces that created the United States and that, yes, have been the force behind all progress in Western civilization for 2,500 years.
There's much to be said for serendipity. I've been thinking about doubt since responding to a dKos diary last night. The diarist is one I greatly respect, but he feels quite strongly that doubt should not be accented in discussions by members of the left with those leaning towards rightist fundamentalism. To me, that's hiding our light. So, this morning, as I was driving, I was pleasantly surprised by a show that came on NPR, Speaking of Faith, featuring Jennifer Michael Hecht. She is the author of Doubt: A History, a book that celebrates the long history and success of doubt. She speaks of doubt as a "magical quality of human experience." I agree completely and will celebrate, not hide, my doubts.
Let's take back enthusiasm for that great value, doubt.