Saturday, March 31, 2012

Theory and Knowledge

Too many of us in the Humanities have a sneaking suspicion that we are involved only with kitsch, that we are far removed from the avant-garde which, deep in our hearts, we worry may be reserved for scientists alone. I wrote something about this ten days ago, linking our worries to a misunderstanding of the two-cultures argument of C. P. Snow and (though I didn't say so) to the inferiority complex that grew in the wake of Sputnik.

One result was the peculiar devotion to "Theory" that grew up in the Humanities starting in the 1970s and that threatened to sidetrack almost any discussion into its arcane and scholastic mazes. Film Studies, David Bordwell and Noel Carroll's collection Post-Theory notwithstanding, is still mired in pointless arguments over the primacy of one theoretical point in relation to another--or in dancing on the head of a pin. So are many Literature departments.

But we in the Humanities aren't the only ones suffering from this peculiar disease. In today's New York Times, Political Science professors Kevin Clarke and David Primo point out that the Social Sciences struggle with the same malady. They call it "physics envy."

They point out a couple of significant and related things.

First, the 'scientific method' or, as they call it, 'hypothetico-deductivism,' isn't the only way to attain, prove, and present knowledge. Not everything, after all, can be reduced to quantifiable data--precisely the problem we are facing in our educational systems today as we rely more and more on testing, forgetting that the testable isn't the totality of either education or knowledge (Clarke and Primo don't go into this--I'm simply extrapolating from their piece).

Second, we rely on all sorts of 'information' that comes to us through paths other than 'hypothetico-deductivism.' Clarke and Primo use examples from their field, but almost anyone could find others. It's reductive and confining to look only to a single methodology, especially when we judge the others through the lens of that one.

And third, any method is useful as long as it helps one get somewhere. A map, for example, can do this, sure--but it doesn't do the actual driving. And asking at a gas station works, too. Both are only tools or methods, two of many--all of which are held together by thought, not theory.

Clarke and Primo end their article by saying that "social scientists would be better off doing what they do best: thinking deeply about what prompts human beings to behave the way they do."

The same could be said for the Humanities.

Thursday, March 29, 2012

Stealing a Copy?

In "Intellectual Property in a Digital Age," one of my chapters in Robert Leston's and my book Beyond the Blogosphere: Information and Its Children, I write:
The laws and economic models in use today do not reflect... changing realities of either IP [Intellectual Property] or of the possibilities of copying. At some point, in law at least, we are going to have to play catch-up by reexamining just what we mean when we say ownership. Our current assumptions and legal structures will no longer suffice. (67-68)
I thought of that this morning as I read Stuart Green's op-ed in today's New York Times. He writes:
[W]e should stop trying to shoehorn the 21st-century problem of illegal downloading into a moral and legal regime that was developed with a pre- or mid-20th-century economy in mind. Second, we should recognize that the criminal law is least effective — and least legitimate — when it is at odds with widely held moral intuitions.
It's nice to see that I am not alone in seeing our assumptions about theft of IP as insufficient in a digital milieu. My comment came on the heels of a discussion of the views of Marcus Boon's In Praise of Copying and a Las Vegas Times op-ed by Stephens Media CEO Sherman Frederick, who equates stealing (copying) of copyrighted material with stealing his sports car. But the two, as Green makes clear once more, are not at all the same. And they are not the same for reasons far beyond the false equivalency engendered by the use of the word "stealing" in both cases. Green writes:
The problem is that most people simply don’t buy the claim that illegally downloading a song or video from the Internet really is like stealing a car. According to a range of empirical studies, including one conducted by me and my social psychologist collaborator, Matthew Kugler, lay observers draw a sharp moral distinction between file sharing and genuine theft, even when the value of the property is the same.
Furthermore, as I write:
Questions of ownership--and more--of IP (creations having no distinct and necessary unique physical presence) change as soon as any particular work becomes public (or is utilized discretely but for public purpose), if for no other reason than it is then covered by copyright, patent, trademark, or trade-secret laws. It loses its simple property status and becomes part of the intellectual commons, though with a great deal of restriction. (68)
Not all types of property are treated the same in law now, but the distinctions are glossed over by many who want to protect the ownership of IP from any infringement.

Green traces current confusions back to the 1962 Model Penal Code, but I think the problem has much older antecedents (and I am sure Green would agree), stemming from the modern concepts of authorship and creativity that developed through the 18th century. By 1826, certainly, Noah Webster could (in a letter to his cousin Daniel Webster) make an argument for "placing this species of property, on the same footing, as all other property, as to exclusive right & permanence of possession."

In the 21st century, it is certainly time that we start questioning assumptions of ownership stemming from a time before the photograph, let alone the Xerox copier and the more recent and explosive digital possibilities.

Wednesday, March 28, 2012

The Naive and Hopeful Me

Yesterday, I shared with a class the Charles Simic article from The New York Review of Books that set me off so a few days ago. These particular students are enrolled in an early-college high school program that will lead them (if everything works out) to an Associate's degree within five or six years of entering 9th grade. As juniors, they are also taking their first college classes, including my First Year Composition course. We are concentrating on research skills as well as writing, using 19th-century New York as our focus. The students have just completed extensive revisions on their proposals for pieces to be included in an anthology we are creating, composed of class work and of public-domain writings from a century ago (and more). The revisions come atop a "meta-proposal," a paper about their thought processes during the initial research and writing for the proposals.

Much of our class time, recently, has been spent on the meaning and methodology of research in a digital age, the students coming to understand decisions they must make on such things as when and where to stop their library research: At what point do you decide to trust the authorities you have found--and why? They are also learning about Intellectual Property, the commons, and conventions of citation (including 'fair use'). I wanted them to understand just why I am pushing all of this on them, to understand (among other things) the importance of an educated populace in a democracy. The Simic piece seemed like an appropriate means for furthering this.

And it was.

I told the students that I have reservations about the article, but did not tell them what, and did not mention my own blog post. As I was reading it to them, I stopped at various points to explain a little about the relevance of their program to some of the things Simic complains about. That, and their discussion (we also stopped frequently to talk as a group), was eye-opening even for me, who was already familiar with the piece.

One of the places I stopped so I could talk about their class was after this passage:
At first it was shocking, but it no longer surprises any college instructor that the nice and eager young people enrolled in your classes have no ability to grasp most of the material being taught. Teaching American literature, as I have been doing, has become harder and harder in recent years, since the students read little literature before coming to college and often lack the most basic historical information about the period in which the novel or the poem was written, including what important ideas and issues occupied thinking people at the time.
I told the students that, yes, it is shocking, but that some of us are trying to do something rather than simply react in shock--and that something includes programs like the one they are in. Why do you think, I asked them, that I use 19th-century urban American literature in their writing class?

They got the point.

Simic goes on to complain that students don't even know the histories of their own homes--and so we talked about why New York City (and Brooklyn, in particular) is the focus of their research and writing. The people who created their program, I told the students, don't gripe but try to change things.

And, we discussed, it may be true that high-school curricula have been dumbed down, as Simic claims. Don't like that? Do something about it. "You," I said, addressing the class, "are already better writers and even thinkers than most of my entering students--and that's because of the intense program you've been subjected to over the past two-and-a-half years." And that's true.

Discussion was lively and on target, even when not directed by me. Early on, one student complained that Simic is arrogant and is making his own rather uninformed assumptions--and he did so without prompting from me (though that may be hard to believe--you had to be there; I expect the expression on my face as he talked was one of shock or surprise). Though most of them agreed with some of Simic's points, they all also saw his weaknesses--and were able to discuss the article without falling into simplistic either/or patterns.

Toward the end of his article, Simic lists a number of things many Americans believe without any attempt at verification. As we went through the list, I could see that the students were beginning to understand the importance of what we are doing in class. We've already talked about the importance of being able to research, and to write competently about that research, as part of almost any career path today, but we really had not focused so much on the role of the citizen in a democracy. Probably, that's because I did not want to seem naive and idealistic even to my students, most of whom already have a rather jaded view of the world.

But I am an idealist. And an optimist.

And, very likely, I am naive. But I was pleased yesterday.

Unlike Simic, I don't believe people today are more ignorant than they've ever been in America. At worst, they are simply just as ignorant as they were a century ago (we went through an educational boom after World War II--but that does not mean people were educated before that). At best? Well, that has to be up to us, each of us individually. We can't simply complain about the world going to hell in a hand-basket but have to find even a little way of helping reverse the trend.

I must admit, though, that even ranting in The New York Review of Books may help. After all, it proved useful to my students. Thanks, at least, for that, Professor Simic.

Tuesday, March 27, 2012

Take It On Faith

Thank goodness for Stanley Fish. Not 'thank God,' 'thank goodness.'

I never thought I'd say that. Fish has made me uneasy since the day I first saw him on a panel at a Modern Language Association convention some thirty years ago. He demolished an opponent--not by force of reason but by mere stage presence, by cockiness and ability to manipulate both audience and debate. Today, though, he has an op-ed in The New York Times in keeping with my own ranting "letter" to Charles Simic from last Saturday. In it, he uses the debate between "science" and "religion" to make a point about faith of all sorts (and about its necessity), a point that the 'liberal elite' tend to ignore as they pride themselves on their great knowledge base--as opposed to ideas that they see as based "merely" on belief.

Everyone has faith, and we are all biased. A belief in atheism is still a belief, as is a belief in science. There's a leap of faith behind all of our positions--we couldn't operate, otherwise. And we all accept authority. Fish talks of a Richard Dawkins defense of science as something you can look up, chapter and verse--Dawkins apparently unaware that he was stepping into a religious methodology.

Close to the end of his piece, Fish writes:
But the desire of classical liberals to think of themselves as above the fray, as facilitating inquiry rather than steering it in a favored direction, makes them unable to be content with just saying, You guys are wrong, we’re right, and we’re not going to listen to you or give you an even break. Instead they labor mightily to ground their judgments in impersonal standards and impartial procedures (there are none) so that they can pronounce their excommunications with clean hands and pure — non-partisan, and non-tribal — hearts. It’s quite a performance and it is on display every day in our most enlightened newspapers and on our most progressive political talk shows.
It's also a performance that drives the right in America crazy. They know it is smug and they know it is wrong-headed--but they have yet to find a way to combat it any more than the liberals can successfully counter religious belief.

It drives me crazy, too, for I believe the liberals are more often right than are the conservatives. But their attitude toward their belief, their unwillingness to admit that they rest on faith as much as Rick Santorum, say, does, makes me want to bang my head against the wall.

Monday, March 26, 2012

Breitbart and O'Keefe: The Legacy

Last week, News with a View, an anthology of essays edited by Burton St. John and Kristen Johnson, appeared. In it is an essay of mine, "The Pride and Reward of Falsification: Post-Objectivity as Post-Responsibility," an examination of some of the "journalism" of Andrew Breitbart and James O'Keefe. Two weeks and a day before the book's publication on March 15, Breitbart died unexpectedly. O'Keefe, though still alive, now seems completely marginalized, a 90-minute wonder destined only to be a reminder of the craziness of an earlier era.

As internet possibilities for publicity and for "citizen journalism" hit their stride about seven years ago, a number of people tried to use the web to muscle their way past the traditional gatekeepers and methodologies of journalism. Some of them saw, correctly, that the profession had calcified, was too self-satisfied to respond adequately to the new digital age. Others simply saw an opportunity not being taken, and reached in for their own benefit.

The result has been a changed media landscape, of course. One with new powerhouse "publications" (such as The Huffington Post) and an entirely new way of work for journalists--who, in response to the blogosphere, have had to adapt or leave the profession.

Many of the "new" journalists, like Brietbart and O'Keefe, had no real interest in the ethics of journalism--or even in the profession. Journalism, to them, was nothing but a means to a political end. No one, within the traditional elements of the profession, knew how to deal with them, for no one like them had managed to have a major impact on reporting since the days of Walter Winchell. The overtly political had been moved aside, given space on op-ed pages, on television talk shows, and on the radio--but it was not seen as active in day-to-day reporting and investigation.

The profession, also, had fallen into a lazy, self-protective pattern of pretended objectivity, of presenting two "sides" of an issue as a means of deflecting any accusation of advocacy. The danger of this attitude of 'false equivalency' was one of the things that opened the door for the Breitbarts and O'Keefes of the last five years. News organizations, however, have finally opened their eyes to this, the latest being National Public Radio.

Though the journalism profession was blindsided by people willing to take advantage of the quest for 'fairness' that 'false equivalency' fostered, it has now learned not to assume that people presenting stories are themselves trying to be honest and objective. Taking things out of context, setting up false but attractive narratives, and blustering accusations by these "new" journalists led more timid and less skeptical traditional journalists both to back down and to accept what the Breitbarts and O'Keefes were presenting. That no longer happens... or does, but much more rarely.

As, time and time again, the stories that Breitbart and O'Keefe created proved specious (even the Anthony Weiner "scandal" had little real substance--just a man behaving like an idiot) and the two acted more and more strangely (Breitbart yelling at protesters; O'Keefe trying to embarrass a CNN anchor), journalists were also waking up to the possibilities for abuse that these two--and others--had taken advantage of. The profession was learning to institute new safeguards, new vetting and fact-checking--not only internally for its own stories (see the Stephen Glass imbroglio at The New Republic), but in the profession as a whole. No statement, any longer, is taken at face value--even if made by someone else in journalism (that 'professional courtesy,' of necessity, has now disappeared).

Today, for example, Charles Seife, a professor of journalism at New York University, blogs about his own interactions with O'Keefe, with one of O'Keefe's workers, and with O'Keefe's organization, Project Veritas. Seife wanted verification of PV's non-profit status, but could not get it (because, it turns out, it did not exist). O'Keefe, in retaliation, targeted Seife, using a woman named Nadia Naffe to conduct the sting. It failed, and now Naffe and O'Keefe have fallen out, neither looking quite so good as a result. Summing up his experience, Seife writes:
We sometimes do things that are ethically gray; journalists don't live in a world where right and wrong are divided by bright lines. But O'Keefe seems scornful of numerous voices that are telling him he's crossed ethical boundaries that he shouldn't have crossed, including voices from within his organization.

We believe that it can be worth breaking the law for the greater good; many of us would gladly go to jail to protect a source, for example. But O'Keefe shows contempt for the rules of our society: he has broken the law in the past, and, even discounting my discovery about his nonprofit, it looks like he continues to break it by inducing others to commit illegal acts on his behalf. (Though the courts haven't been entirely clear on the matter, "Ashley," as well as Jay Rosen's visitor, "Lucas," appear to have broken the law when they lied to gain access to our offices.)

O'Keefe is a journalist reflected in a funhouse mirror. He appears to be grappling with the same moral and ethical issues that we journalists do all the time, but his actions are distorted and twisted by his own personal agenda into something unrecognizable.
It takes a clear ethical base to be a "real" journalist, something neither Breitbart nor O'Keefe ever had. Only now, however, is the profession beginning to demand this of its practitioners--even of the ones outside of the mainstream.

It's about time.

The legacy of Breitbart and O'Keefe? Probably a better and more self-conscious body of journalists--and one that will no longer allow its own lapses to create means for the likes of these to join in and pretend to be one and the same as the professionals. On the other hand, it may also teach the professionals to have a little more respect for amateur journalists and to take them seriously. If Breitbart and O'Keefe had been taken seriously from the beginning, instead of being scoffed at (as they often were) by "real" journalists, they would never have managed to create the controversies they were responsible for.

Saturday, March 24, 2012

Ages of Ignorance

Dear Charles Simic,

I've just read your piece "Age of Ignorance" in The New York Review of Books. My wife says it sounds a lot like me--arrogant, that is. She's right. And I do agree with much of what you say, and could easily find myself saying much the same thing. I count myself as a progressive as, I assume, you do. We are close together on many issues, I am sure. On the other hand, I know that you've got it terribly, terribly wrong.

This is a difficult piece for me to write, for not only are we on the same "side," but you are a much more experienced teacher than I am, have seen more of the horrors the world has to offer than I have, and are skilled as a writer in ways far beyond me. But, judging from your essay, you are also a great deal more ignorant than I--about a large part of America, that is... the very part you write of. The very people you call ignorant. And you are blind to your own conceit.

Though I no longer live among them, having been whisked to the other side by my own parents, themselves "refugees" from what is now called 'red-state America,' these people you write of as so ignorant are my extended family, and I grew up with them as much as away from them.

What you see as their ignorance, I see as something else indeed.

The people you are calling ignorant have been called that for generations. No, for centuries. Culturally, they are descended from groups of people who had long been pushed to the edges of what their oppressors called "civilization." They were used by the "civilized" as a barrier against the "barbarian," and sometimes have even been called "borderers" for their role. This role was first seen in the Scottish lowlands, then in the Ulster Plantation, then in the English colonies where they found themselves between the Native Americans and the coastal settlers, and then (finally) when they were manipulated into the role of oppressor of the African-Americans by an establishment that saw nothing wrong with oppressing and manipulating them, certainly not when it kept its own distrust of black folk somewhat hidden. They have fought and bled, always at someone's behest and, in their eyes, they have gained little from it.

Just as they hated the English who manipulated them into fighting the Highland Scots, the Irish, and even the Native Americans, they now hate what they see as the 'liberal establishment' that is, in their eyes, oppressing them... and that mocks them constantly (as you have done), giving them no respect and no real place in its universe of the worthy--but that uses them to keep its gated communities safe.

They know what will bug the establishment--me, you. They know what will enrage us, make us crazy. And, as they also know they can't please us, they are going to do their best to get the opposite response--especially as they feel powerless against us. In some ways, they are putting you on, just as slaves did their masters (see the volume of slave narratives Gilbert Osofsky edited, Puttin' On Ole Massa). They see your disdain for "their vast ignorance about things they should have already been familiar with" and realize they can never please you--so they don't try, but fool you by pretending to the stereotype that you have of them already.

You write:"In the past, if someone knew nothing and talked nonsense, no one paid any attention to him." That's not true, and you know it, for you have read Mark Twain, who points his finger constantly at nonsense paid attention to more than a century ago--and you have surely heard of Father Charles Couglin, the direct ancestor of Rush Limbaugh. And these people who, you claim, ignore the past know it, too. They also know that you, too, are ignoring the past or, at least, are twisting it to your benefit and in ways that make them look foolish.

They know that, by your rules, they aren't as smart as you are. But they also know that, because you make the rules, they are losers at this game unless they become Mini-Yous. Unless they become part of the liberal establishment--as you have become. As, to a much lesser degree, I have become.

Anyone can be made to look stupid. They believe you are going to make them look stupid, no matter what they do--so why should they try to appear otherwise? Want to understand Sarah Palin? That's it, in a nutshell.

Much of America doesn't want to be what you define as an "educated, well-informed population" for the simple reason that they don't want you defining them. They've been defined as dumb by you (and by me) for generations, for centuries. Failing to fight the stereotype successfully, they've embraced it. Do you think Appalachians would ever have embraced the 'hillbilly' image on their own? No. They embrace it for the same reason some black Americans now embrace what has come to be called 'the n-word.'

My wife is right: We are intellectual snobs, you and I... and much of the 'east coast liberal elite.' We think we know better than anyone else does.

But do we? Is our arrogance any more justified than that of Rick Santorum, whose beliefs are every bit as firm as our own?

Whose ignorance is it, anyway?

Maybe we ought to consider that it could be our own as much as anyone's.

I'm sorry. I didn't really want to turn this into a rant, but you hit a nerve. We are arrogant, we "intellectuals," but I chafe at mine, for I do recognize that I, as Bob Dylan writes in "My Back Pages," "become my enemy in the instant that I preach." Just like the Elmer Gantrys of the right, our own preaching can be just as sanctimonious, just as self-destructive.

The truth of that gnaws at me.

Sincerely,
Aaron Barlow

Thursday, March 22, 2012

Pathways to a Common Core

There is a great deal of controversy within the City University of New York over its "Pathways" initiative, an attempt to establish a system-wide flexible core of courses for students during their first two undergraduate years. Most of the controversy has to do with process--with how Pathways was conceived, structured, and introduced. Faculty see it as an imposition from the administration, when the appropriate way of approaching the possibility would have been through faculty governance structures. The result is a poorly thought-out structure that will be a nightmare to institute--something that could have been avoided had the administration enough confidence in the faculty simply to suggest a possibility and then to listen and respond as the faculty developed a workable program. That is something the faculty as a whole can do but that the administration, which hasn't the day-to-day connection with courses and academic program implementation, cannot.

Still, the idea behind Pathways is a good one--better, probably, than even its backers at the top of the CUNY central administration realize. It harkens back to attempts to establish an undergraduate foundation not governed by "disciplines," something that C. P. Snow pines for in The Two Cultures and that Robert Hutchins and Mortimer Adler suggest (and that is still found at St. John's College in Annapolis, MD), though without the slavish devotion to an antiquated body of knowledge.

It is just possible that someone behind the program recognized that devotion to the disciplines is so strong within the faculty that anything that could possibly be seen as a threat to the little kingdoms would be rejected out of hand--and so thought to bypass that possibility completely. The result, however, is a timid and tepid program that will probably do no one any good (in terms of their education) but that will throw the entire system into chaos. The faculty may be petty at times, but its input could certainly have averted what is now a looming disaster. The faculty could have made a good idea realizable instead of, as we face now, creating a weak reflection of a good idea that will be next to meaningless even if it succeeds.

One of the cornerstones of Pathways is a "Flexible Core" of five areas:
  • World Cultures and Global Issues;
  • U.S. Experience in its Diversity;
  • Creative Expression;
  • Individual and Society; and
  • Scientific World.
Students will be expected to take six courses to meet the core requirement, including one in each of these areas, with the proviso that they can take no more than two in any single discipline. Courses will be placed in particular areas at the request of individual colleges, with approval by a system-wide Pathways committee. Qualification will depend on compatibility with 'learning outcomes' specific to each area. The areas, taken together, are expected to provide a common foundation for students moving on into their specialized majors.
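For what it's worth, the requirement itself is simple enough to state as a rule. Here is a minimal sketch, in Python, of the constraint as I have just described it--a hypothetical illustration with invented names, not anything drawn from CUNY's actual systems:

from collections import Counter

FLEXIBLE_CORE_AREAS = {
    "World Cultures and Global Issues",
    "U.S. Experience in its Diversity",
    "Creative Expression",
    "Individual and Society",
    "Scientific World",
}

def meets_flexible_core(courses):
    # courses: a list of (area, discipline) pairs, one pair per course taken.
    if len(courses) != 6:  # six courses in all
        return False
    # One course in each of the five areas...
    if not FLEXIBLE_CORE_AREAS <= {area for area, _ in courses}:
        return False
    # ...and no more than two courses in any single discipline.
    per_discipline = Counter(discipline for _, discipline in courses)
    return max(per_discipline.values()) <= 2

The very simplicity of the rule is part of my point: the difficulty with Pathways lies not in the mechanics of the requirement but in how, and by whom, the areas and their 'learning outcomes' were defined.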

The areas, and the 'learning outcomes' that define them, were created by a committee formed and tasked by the administration, though it was made up of faculty members. According to one member, the committee didn't even really understand the purpose of what it was doing until far along in the process. In other words, the faculty did not create this but simply did the administration's bidding.

When I attended the first meeting of another system-wide committee, the one that is asked to decide if courses belong in the areas proposed for them by individual colleges (I am also on its "Individual and Society" subcommittee), a number of questions were raised by faculty to the administrators who were really running the meeting (though there was a bow to faculty governance, a chair of the whole from the faculty). One questioner used the example of an introductory Economics course, of the sort taught at almost all colleges. What would happen, the administrators were asked, if one college placed this course in "Individual and Society" and another in "World Cultures and Global Issues"? What would happen to a student transferring from one school to the other? The answer was that the course would continue to fill the area it was associated with at the first school. But how, if the student hadn't yet met all of the area requirements, would the student continue?

One of the things that didn't come up, but that should have been discussed openly and across the faculties of all the colleges before this was instituted (and should have been decided upon by the faculty), was this move to create new silos in place of (or in addition to) the older discipline silos--and just how placement within them should be defined. 'Learning outcomes' themselves are something of a sideways attack on faculty self-governance, for the very concept has not arisen from within the institutions but from outside organizations, including accrediting bodies. Perhaps it can be argued that these have nothing to do with the disciplines and faculties themselves but are concerned with more general educational goals, but I think that's a little bit of a red herring: 'learning outcomes' are being used as means of structuring courses in increasingly quantifiable ways, for purposes that have little to do with education itself, but with control of the process. They have arisen from a lack of trust in the faculty... at least, from a lack of faith that the faculty can determine for itself (and without outside guidelines) the value of a particular offering.

Using 'learning outcomes' as a way of sorting courses for meeting a set of requirements, especially when those requirements (both in terms of the five areas and in terms of the learning outcomes) were established at a remove from the faculty, further erodes self-governance--but it also erodes any sense of solid structure within our educational systems, the very thing Pathways was meant to provide. The areas are so generalized, and the 'learning outcomes' so amorphous ("Articulate and assess ethical views and their underlying premises": which of the five does this belong in? What course could this not be a goal of?) that the divisions start to seem random, almost capricious. Hammering out a new core structure among the faculty would have been difficult and time-consuming, but it would, of necessity, have ended up creating something with a great deal more clarity. The competing needs and ideas within the faculty would have forced negotiation and justification before the fact, creating something solid and defensible as a whole--not, as will now happen, defensible only case by case, as departments tailor their 'learning outcomes' to particular areas and my committee evaluates their success in doing so. That process creates something definable only through 'learning outcomes' themselves, a circularity of reasoning instead of a linear logic--a real pathway to a clear goal.

Though I am perfectly happy to serve on this committee, and to do the required work in hopes that Pathways succeeds, I see very little likelihood that it will. There are many more potential problems with Pathways than I have expressed here, any of which could derail the whole. Though the CUNY administration may have felt that side-stepping traditional governance structures and processes was the only way that its vision of Pathways could be instituted, it ignored the fact that those structures grew for a reason, one of which is that the individual vision (in this case, the administration's vision) rarely covers all contingencies and needs. Change, to really succeed, needs to be constructed by the whole of an institution, not simply by its head--especially in education, where structures are cultural and diverse of necessity.

Tuesday, March 20, 2012

Avant-Garde, Kitsch, the Two Cultures, and Academic Publishing


Over the past few days, I've been trying to gather a few thoughts on the inferiority complex many of us in the humanities feel when forced to look upon the sciences. For a number of reasons, scientists make some of us feel like we're not real intellectuals, and we've reacted in a number of ways, all of them a bit peculiar.

Yesterday, I cornered a few of my wayward strands in Clement Greenberg's old distinction between the avant-garde and kitsch, something I've used in a number of different ways in various books and articles. Though I don't much care for the distinction, I recognize that it has become central to many views of society. Among these are the view that there are "academic" audiences as opposed to "general" ones (and that the former have greater importance) and the notion that there's something about the humanities in general that, somehow, is more akin to kitsch than to science and the avant-garde.

Greenberg writes:
[A] part of Western bourgeois society has produced something unheard of heretofore: avant-garde culture. A superior consciousness of history--more precisely, the appearance of a new kind of criticism of society, an historical criticism--made this possible. This criticism has not confronted our present society with timeless utopias, but has soberly examined in terms of history and of cause and effect the antecedents, justifications, and functions of the forms that lie at the heart of every society. Thus, our present bourgeois social order was shown to be, not an eternal, "natural" condition of life, but simply the latest term in a succession of social orders. New perspectives of this kind, becoming a part of the advanced intellectual conscience of the fifth and sixth decades of the nineteenth century, soon were absorbed by artists and poets, even if unconsciously for the most part. It was no accident, therefore, that the birth of the avant-garde coincided chronologically--and geographically too--with the first bold development of scientific revolutionary thought in Europe. (Clement Greenberg, "Avant-Garde and Kitsch," The Partisan Reader, 1946)
The connection between science and the avant-garde, we see, is longstanding. And both are rarefied, not for everyone, but for the cognoscenti alone:
Retiring from public altogether, the avant-garde poet or artist sought to maintain the high level of his art by both narrowing and raising it to the expression of an absolute in which all relativities and contradictions would be either resolved or beside the point. "Art for art's sake" and "pure poetry" appear, and subject matter or content becomes something to be avoided like a plague. (Greenberg)
Subject matter and content aren't for "pure" arts and sciences at all. So, it's no wonder that:
The avant-garde's specialization of itself, the fact that its best artists are artists' artists, its best poets, poets' poets, has estranged a great many of those who were capable formerly of enjoying and appreciating ambitious art and literature, but who are now unwilling or unable to acquire an initiation into their craft secrets. (Greenberg)
Oh, but aren't we special, we who can speak our secret language! Wanting to partake in this, many of us in the humanities, which really should be accessible to everyone, begin to couch our commentaries and explorations in terms that only our small group of fellows can understand. We can't hitch our wagon to science, but we can imagine its path and can tail along with the avant-garde.

After all, everything beside "pure" art and "pure" science is easily ignored or put aside:
Kitsch, using for raw material the debased and academicized simulacra of genuine culture, welcomes and cultivates this insensibility. It is the source of its profits. Kitsch is mechanical and operates by formulas. Kitsch is vicarious experience and faked sensations. Kitsch changes according to style, but remains always the same. Kitsch is the epitome of all that is spurious in the life of our times. Kitsch pretends to demand nothing of its customers except their money -- not even their time. (Greenberg) 
It's all commercial, anyway, and not worth the time of those who turn to the heights, unattainable by the great unwashed and their money-grubbing mind-sets, of "pure" thought. The most recent example of the flight from kitsch was the great "Theory" bubble that so recently burst.

A little more than a decade after Greenberg, C. P. Snow unintentionally made matters worse through his The Two Cultures and the Scientific Revolution. Though he was trying to suggest that we get out of our intellectual silos and learn something about what others are doing (stating that "our fanatical belief in educational specialization" (18) is more than a bit wrong-headed), what he ended up doing, for the humanities, was convincing scholars that they had to be more like the scientists if they were to be taken seriously. That is, if someone outside of the specialty could understand them, something was wrong. Poor Snow. That is not what he meant at all; that is not it, at all.

Unfortunately, what he wrote does resonate with Greenberg. Or some of it does. He says "the scientists have the future in their bones" (12), that they:
have their own culture, intensive, rigorous, and constantly in action. This culture contains a great deal of argument, usually much more rigorous, and almost always at a higher conceptual level, than literary persons' arguments. (13)
This, all of Snow's arguments notwithstanding, makes people in the humanities want to rise to what they see as a challenge, to move towards providing something of their own just as rigorous, just as hard to understand. So, instead of taking Snow's words to heart, they have continued a process that he describes with sadness:
Somehow we have set ourselves the task of producing a tiny elite--far smaller proportionally than in any comparable country--educated in one academic skill....
It may well be that this process has gone too far to be reversible. I have given reasons why I think it is a disastrous process, for the purpose of a living culture. (21)
He does give reasons, but they are ignored. Instead, we pride ourselves on speaking only to those whose "training" is as specific and as overly defined as our own. Instead of two cultures, or three, we now have dozens, none of which can talk to the others.

Our process of academic publishing, with its specialized journals hidden behind paywalls worth breaching only by those within the minuscule specialties, makes matters even worse. No longer do we have to justify ourselves to anyone. We can simply tell ourselves that what we are doing is "pure" and ignore anything going on outside. After all, it is only kitsch if it is not with us, we who are the avant-garde.

Sunday, March 18, 2012

Academic Audiences

Just who should we--academics, that is--be talking to? Be writing for?

Sometimes, admittedly, our conversations assume a great deal of background. Sometimes, that's even necessary. In too many of these cases, however, that background itself narrows consideration of possibilities and angles outside of the "wisdom" passed down in graduate school or in conferences of narrow focus or through books and journals aimed so explicitly towards "the few" that their language itself keeps others out. In other words, speaking and writing only to those who share the background we have in a specialty restricts the conversation--and in more ways than one.

That's why I love the Ray Davis comment on a post of mine the other day:
 Whenever anyone asks me about academic publishing, I think of E. B. White's polite demurral: "Nothing would delight me more than to write exclusively about sheep, exclusively for shepherds. But...."
I mean, what's the point? If we all already have had the same experiences and, fundamentally, agree about the main theses of our fields, why are we talking to each other? Wouldn't you say we need to get out more?

The other day, I asked someone who is putting together a scholarly anthology if she might possibly be interested in a particular approach to the topic I could offer. It was rather a long shot, for what I proposed proposing (I wasn't going to write a proposal for a chapter if there were no interest--and I suspected there would not be) involved a re-examination of certain "fundamentals" relating to the Comp/Rhet field (one I am only tangentially associated with, making it an outsider re-examination, at that). I got a nice email back declining... nice, except that it included this statement: "scholars in rhetoric and composition have a pretty firm grasp on why these claims can be made."

Oh, my. I'd somehow missed that the book was to be for specialists only. Furthermore, whenever "scholars" feel they have a "firm grasp," it's past time that the "why" be looked into anew. The statement the editor made reeks, to me, of a self-congratulatory and sedentary field (which, actually, Comp/Rhet is not) and one that is satisfied that the insiders really have a handle on things, thank you very much. The editor is a good person, I am sure, and a fine scholar. She comes out of an excellent program and her own dissertation director is a nationally recognized figure. But she has narrowed her focus so (for the proposed book, at least) that the only audience will be the few approaching the topic from a narrow Comp/Rhet viewpoint. This is disappointing from any scholar, but from someone with a background in rhetoric and in composition, it is a particular letdown.

What's the use of writing for so few? In my broad field of cultural studies, certainly, there is diminished interest in speaking only to those within the specialty. In fact, most of us like to dive into other fields and to try to pull audiences from outside into our discussion. That's part of why I like being in cultural studies, for it keeps me in touch with all sorts of things I would miss, were I working in a field where only a narrow group of "experts" are welcome.

And what's the use of an "expert," anyway? I would say that an expert is only valuable insofar as she or he brings that expertise outside of the ivory tower... not a new claim, by the way, but one of the underpinnings of arguments for academic freedom made for a century now, as the AAUP's 1915 Declaration clearly shows.

An academic who only writes for other academics only writes for himself or for herself. Each one of us should really be looking for ways, in our writing certainly, but in many other venues as well, to expand knowledge of what we are doing within populations that might not already be parts of our conversations.

Thursday, March 15, 2012

"Lover Come Back"

By the spring of 1962, I had become addicted to the Emory Theater, a small movie house near the entrance to the university and just a ten-minute bike ride from home for ten-year-old me. Though it was an addlepated way to begin a film education, learning was the last thing on my mind anyway. I simply loved shoving my quarter under the glass at the ticket window and watching the late-afternoon matinee. I didn't care what the movie was.

Given the era, it's not surprising that I saw an awful lot of Rock Hudson, Elvis, and Jerry Lewis. Oh, and Doris Day. And Rock Hudson and Doris Day. When I watched Mr. Hobbs Takes a Vacation, it was probably the first time I'd seen James Stewart on the big screen--and Maureen O'Hara. For a while, I even confused Stewart with Fred MacMurray, who was ubiquitous on TV at the time (and who had starred in The Absent-Minded Professor, which I had seen with my father in Indiana, shortly before we moved back to Georgia).

Before discovering the Emory, all the films I had seen had been with a parent. Sometimes, it was at the drive-in, my brother and I snug amongst pillows and blankets in the back seat and the two of us expected to be asleep before anything steamy crossed the screen (my youngest brother, once he came around, already conked out in my mother's arms). The first feature, after all, was generally more "family," the second for the parents. In between, we got to exhaust our last energy on the small playground right under the screen. Like many kids of the time, I loved the way the speaker hung over the driver's half-rolled window.

I don't really remember my mother ever taking me to the movies, just my father. Poor guy. He not only had to sit through gobbledegook about flubber, but through Bambi, Tom Thumb, and goodness knows what else. Maybe my mother did take me to the movies once, with a group of friends for my 10th birthday. She scheduled it, at least, even if she wasn't the escort. Asked what I wanted for my birthday party, I'd told her I wanted to see a movie. I remember being shocked when she said she'd let me choose the film. Just to see how far this could go, I opted for the most "adult" film I could that was playing locally, The Devil at 4 O'Clock starring Spencer Tracy and Frank Sinatra (my first big-screen experience of either of them, too, I am sure). I don't remember much about it except that all the flowing lava made me feel hot and that Sinatra seemed rather too fragile for his part, especially in comparison to Tracy.

I thought of that time the other day, as I watched Lover Come Back on Turner Classic Movies. I saw it for the first time at the Emory, and retain vivid memories... all of which proved accurate during the repeat. There are a couple of scenes in particular that stood out, but I remember the whole of it pretty well, considering that it was 50 years ago that I watched it first. What I wish I also remembered is what I thought of it at the time. Though it really isn't much of a movie, it is certainly funny--but in ways far too adult, most certainly, for my ten-year-old self. I doubt I grasped the implications of a candy equivalent to a triple martini, and I know I knew little about sex, marriage, annulment, or childbirth. Let alone about Madison Avenue or advertising.

None of that mattered, not really. Not to me, back then. I can remember in detail almost every movie I saw at the Emory--none of which was any great shakes as a film (Kid Galahad, anyone?). After all, it was a special place for me, and each showing a special treat. They were my movies, the ones I saw there, and I loved them for that.

Understand them? Bah, I might have said, who cares? As any movie lover will tell you, that wasn't the point.

Tuesday, March 13, 2012

The Return of the Public Intellectual?

One of the biggest frustrations for me, as a scholar, is the continual denigration (by certain academics) of my work as addressing only a "general audience." I can't be a "real" intellectual, you see, unless I write so that only a few specialists can parse my sentences and unearth my meaning. [It's also frustrating that I am criticized on the other side, as being too difficult for some readers--but that's another story for a different post.] Each time it happens, my memory forces up John Collins Bossidy's bit of doggerel about Boston:
Where the Lowells talk only to Cabots,
And the Cabots talk only to God.
I don't want to live in that world, and don't want to write for it, either. Still, each time I read "general audience," I bristle--even though that's exactly who I am writing for. A couple of weeks ago, I received a comment back from the editor of a volume I am contributing an essay to, telling me that my piece would be perfect for sparking discussion in a college classroom or a synagogue group. I wanted to feel complimented, but I had a slight suspicion that I was being damned with faint praise (though the editor is including the chapter). Yesterday, Robert Leston, my co-author of Beyond the Blogosphere, pointed out that there is a new description of the book on The Free Library. It says the book is for "general readers"--and it is. Yet I can't help feeling vaguely insulted, though that is exactly what I want the book to be. The power of intellectual snobbery affects me, whether I want it to or not.

The culture of the academy is strong--and I feel its disdain even as I claim not to care. Though I pine for the days when college professors were regularly invited to speak to church groups and at public libraries (after all, if William James did it, why can't we?), I still feel the force of the intellectual snobbery that pervades contemporary academia.

I have felt this for a long time. In fact, it is one of the reasons I didn't enter the profession on completing my PhD. I didn't want to live as "removed" from the world as I saw my contemporaries in academia living--intellectually, at least.

Or culturally.

For my father's generation, moving into academia hadn't been a right or an expectation. They had come back from World War II to a future that swept them in totally unexpected directions (he became a professor of Psychology)--but they had behind them something other than experience within universities. Though I had seen no war, I was oddly proud, during my first, tentative semester as a grad student, of showing up to class in my blue work uniform with "VW" over one pocket and "Aaron" over the other, grease under my fingernails and a red rag dripping from my back pocket. I had, at least, experience of something other than school.

So it was that, on defending my dissertation, I joined the Peace Corps, where I worked in agriculture (teaching farmers techniques for using oxen for plowing in northern Togo, West Africa). Later, I spent more than a decade running the store/cafe I established in a brownstone Brooklyn neighborhood. It was only when I started writing again, when I realized I could use my "intellectual" skills to reach a "general" audience, that I began to consider entering academia as a new career. It was only when I realized I could teach in a school like City Tech, where the students bring in a world far beyond my experience, that I became enthusiastic about the idea.

Almost as soon as I began teaching full-time, I also started agitating, trying to change the culture, trying to move my colleagues beyond their complacency. This got me into a bit of trouble, but it also led me to a greater understanding of why the likes of David Horowitz (with whom I developed something of a contentious relationship) view American universities so negatively. Though Horowitz is as myopic as "the professors" he criticizes, he does have a point: the arrogance, both cultural and intellectual, of university faculty is both unwarranted and harmful. It further isolates the professoriate from the culture as a whole and provides an echo chamber that masquerades as affirmation.

We professors, I thought, need to get out more.

There are many reasons we don't, but the primary one may be our archaic and isolating processes of tenure, promotion, and funding. Each of these looks to small groups for affirmation, creating "ivory tower" hierarchies that serve as barriers to concentration on work addressed to people outside of the academy. Of course, there are needs for decision-making bodies, but they don't have to be insular and unquestionable--as many of these are.

In addition, as the "Boycott Elsevier" movement is now showing us quite dramatically, we have allowed other barriers to be raised between academia and the "general" public--in this case, making access to scholarly journals available only to those who can afford it or who have affiliation with academic institutions that have decided to afford it.

Though speaking to small groups in churches, film societies, and other interest groups may not have the same cachet as presenting a paper at MLA, it is just as valuable an exercise--even though our inward-looking peer committees may not think so. Though writing an op-ed or a feature for our local, small-circulation weekly doesn't look quite so fantastic on a CV as a piece in PMLA, it may get as many readers, and spark as much discussion. Then there's blogging (since starting to aggregate, a bit, "boycott Elsevier" on Facebook, I've been looking at more academic blogs than ever before--and am impressed by their range and intelligence) and online publications like Raging Chicken, edited by my friend and former colleague Kevin Mahoney. Even though the audiences for these venues may not be "intellectuals," contributing to their debates and discussions is an intellectually viable activity, and it needs to be considered so.

We rue the loss of the "public intellectual," but we do very little, actually, to return that figure to its place. It's time we do so--by acting the part ourselves and by rewarding, rather than disparaging, our colleagues who try to do the same.

Sunday, March 11, 2012

Remodeling Academic Journals

As David Gosser's comment on my post yesterday indicates, there are already a number of possibilities online that can be used by and for new types of academic journals--and people are taking advantage of them. The problem lies in finding an audience, in getting the necessary eyes and necessary responses, the two things that make an academic journal viable.

This is why I would like to see large research universities and university systems become their own aggregators, so to speak, taking control of dissemination of the scholarship generated within their walls, making it so that all of it is easily available and easily transferable. The universities are paying for this work, after all (supplemented by grants, in many cases, but it is with the universities that primary responsibility lies), and should want to see it made use of in the best possible manner.

Many schools already provide web pages for faculty, but these are generally rudimentary and are rarely seen as an integral part of research projects or of university activity. Taken more seriously as "housing" for promotion and tenure documentation, however, these could become key parts of a broader university structure and could serve functions far beyond what they do now. Professors could keep (at least partial) public listings of the books and articles (and more) that they are using in their current work--surveys of the literature, as it were--listings that could then be assessed as a whole, showing what, in a particular field, is proving most useful to ongoing research. Blogs, even ones like mine, could also be housed under the university umbrella, as could wikis, interactive journals, and online versions of print works, both journals and books. People could even "publish" scholarly work through their pages--it happens now, but generally only when the scholar has already reached a level of institutional security. The site's statistics could tell whose work is generating the most interest outside of the particular university or system and could lead people to look at things they might otherwise have missed.

All of this, and much more, could be done right now. Some of it is, in tepid ways, but it will need much more high-profile leadership and top-level university support for it to really have an impact. It will require universities to recognize that they are in control of resources that they are not adequately handling, but it will also require a trust in the faculty that is, quite frankly, not often seen these days. For, if the university decides to act as gatekeeper for such a site, the purpose will be defeated, and scholars will find themselves migrating away--just as is happening now with the commercial academic journals and with blind peer review.

There is a unique opportunity available right now for universities to re-situate themselves in terms of scholarship and the public--in terms of scholarly publishing. They can take their university presses, their journals, their "faculty commons," their department websites and make them into something cohesive and useful, something that facilitates scholarship in ways never before seen.

Will they? I don't know. The opportunity has been around for a number of years, now, but nothing has been done at an institutional level by any research university or university system that I know of.

I hope that will change.

Update: At City Tech, we are now encouraged to update evidence of scholarship in our files with digital documents. That's all well and good, but those documents often cannot be made public on our websites due to copyright considerations. Publishers are quite jealous of their rights, and don't even care much for authors presenting their own work, if that work falls under publisher rights. One of the changes that could occur, were a system such as the one I describe above instituted, is that copyright could be viewed a bit more leniently. The ownership of the Intellectual Property would rest with the scholar or, possibly, with the institution. It could be offered, then, depending on the circumstances, under Creative Commons licensing, something much more useful to future scholarship than ownership rights as presented under current copyright law (Creative Commons, in general, provides a way for giving blanket approvals for many different types of copying and usage of IP).


Instead of hiding what we are doing behind IP walls, putting all of our work together on a public web page would allow us to participate more readily in the broader conversations going on both within and beyond our disciplines. It would also make it easier for promotion and tenure committees to evaluate scholarship, for it would be easily compared to what others are doing.


Secrecy and protection of IP may have their place (or may not), but these should not be our main concerns as scholars. All of us, as professionals, should be working to enhance knowledge and its dissemination. We did not go into academia to make money, after all, but to learn, to teach, and to explore.

Saturday, March 10, 2012

Why Keep Academic Journals As They Have Been?

Over the past decade, newspapers have learned that they need to change to survive. The deaths of papers all over the United States made that quite apparent, and the journalism industry, though hating to do it, learned to adapt. Today's newspapers aren't merely print, but are intertwined with other media, including television, radio, and websites of many kinds, including blogs, online versions of the print edition, and quite a bit more. Flexibility and adaptability have become part of any newspaper model.

Academic journals, not having had to respond to either changing readership habits or advertising models, have not similarly expanded. With "captive" writers--scholars who need to publish in "name" academic venues in order to gain grants, tenure, and promotion--clamoring to provide them with content and libraries (not themselves beholden to any commercial model) trapped into paying almost anything the publisher demands, there has been little incentive for change. Even online, an academic journal (with a few significant exceptions, of course) looks little different today than it did a quarter of a century ago.

The newspapers had to adapt--or die. For academic journals, there has been no similar need.

So far, at least.

It may take an act of some bravery, but the mechanics of presenting a digital-age replacement to the traditional academic journal, one that can easily step into the "certification" role of the "top" journals, are not hard to imagine. The bravery will come when one large university or university system says, "Enough!" and offers its own replacement, challenging the rest of academia to show why the new entity is not as scholarly and relevant as any older venue.

The universities, after all, are paying for the work that the commercial academic-journal publishers are profiting from. They don't need to continue giving scholarship away for the commercial gain of others. On the other hand, they don't need to lay their own proprietary blanket over the work of their professors, as commercial enterprises do when their employees create. They can find ways of presenting and promoting the work their scholars do, ways that promote the university, the scholar, and subsequent work based on that presented.

Sure, there are plenty of academic journals housed in universities and even colleges. But these tend to be in individual "silos," each one standing on its own and not as part of a system-wide collective, well channeled, mapped, and linked. Such a system could provide a home for "traditional" academic journals, but also for blogs related to them, or aggregating sites relevant to particular topics. Some parts of the collective could be carefully structured, vetted, and edited, while other portions could be clearly informal. The trick would be to make the whole easily negotiated, both by visitors and by scholars, particularly by scholars wishing to contribute, or to update their contributions.

Problems, in terms of university administration, will come when the desire to protect reputation butts up against academic freedom, as will happen. This, probably, is one reason why nobody has yet offered a large, freewheeling site of this sort.

What we would have would be something akin, in part, to academia.edu, a blog, a wiki, an online academic journal, and much more--all structured so that they interact, so that a researcher can go back and forth between parts with ease, and can even organize documents, providing their own aggregation within the whole.

Once something like this appears--a CUNY Research Commons (to imagine one in my own system), for example--the commercial journals will begin to disappear and the non-profit journals will migrate into such sites. Scholarship will begin to be more accessible and usable. The universities will be able to boast of work that is there for all to see, and scholarship, in general, will be able to come out from behind the walls that have been built, too often, around it.