Wednesday, November 28, 2007

“Discussion” in a Faculty Office

Fiction? Yes. But….

Full Professor Irma Fayles has been teaching at the inner-city institution since its days as a community college a quarter of a century ago. Never having published an article, let alone a book, she became a full professor at a time when the college had not yet re-envisioned itself as a four-year school with scholarship as an important focus. Assistant Professor Sam Stamper is new to the college, but arrives with one book already out and another about to go to press. He doesn’t yet know the “traditions” of the school and has no preconceptions about its student population.

Fayles was recently assigned to observe Stamper and has decided that it is her job to put this tyro in his place, to teach him how things really are. He had asked her to attend his sophomore literature class where the students, over the semester, were reading four novels, four plays, and a number of works of poetry. The observation took place the day Doris Lessing was awarded the Nobel Prize. This conversation followed a week later:

Fayles: First of all, I must admit that you have a strong presence in the classroom. And you’re clearly a good and dynamic actor. But I must caution you: beginning with a mention of Lessing is going to do nothing for these students. You’ve got to consider who they are and what their backgrounds have been. They haven’t heard of Lessing, and aren’t likely to. This is just a bit of advice: work with things they know or will need to know; leave out irrelevancies.

Stamper keeps his mouth shut, though he cannot bring himself to nod any agreement. His chances of promotion and even retention, he knows, could be affected by Fayles’ evaluation of his class. Behind his straight lips, however, he bristles: the students, as he is well aware after even his short time at the school, know a lot more than many of his older colleagues believe, and have experienced a good deal in their short lives. Their world is no more limited than that of their professors. Doris Lessing might or might not ever again appear as a name before them, but now they could make some connection if she did. Furthermore, he thinks, there is something essentially classist, if not racist, about what Fayles is saying. The implication is that, at a “better” school, one with fewer blacks and immigrants, speaking of Lessing might be OK. But not here. He silently rejects Fayles’ implied thesis that, because the students come from what seems to her to be a limited background, their teachers cannot expect them to move beyond it, and should not even encourage them to try.

Fayles: I saw a lot of teaching in your class, a lot of pyrotechnics, but little learning was going on. Too much performance by you and too little activity on the part of the students. As a result, much of your class was wasted. You need to have different tasks, each an activity for the students, each lasting fifteen minutes or so. Otherwise you will lose them. Maybe have them read aloud, a paragraph each, and then write for a few minutes.

Here again, Stamper keeps his mouth shut, and once more can’t bring himself to nod. He simply sits and waits, looking at Fayles. Did Fayles see no difference between the needs of a remedial (or even first-year composition) classroom and a more advanced literature one? Or does she really believe that the only sort of learning possible for these students lies in mastery of a series of small tasks? In the context of this course, he is not interested in developing skills, but in encouraging students to think and to develop enthusiasm for reading and ideas—and he does not feel that reading a paragraph aloud or writing short paragraphs would contribute to that. He wants to bring his students into a more sophisticated dialog rather than falling into the simplistic thinking fostered by the sort of program Fayles had described. His class is themed around questions of generation of knowledge and his students are beginning to grasp and argue about the distinction between the believed and the demonstrable. Neither five nor fifteen minutes of writing—or of small-group discussion—is going to further that. In fact, any success that he has achieved would be lost.

And little learning? No, he had seen a great deal exhibited in the papers he had just returned to the students—book proposals, following the standard professional model, for works of fiction exhibiting some aspect of the problem of belief. Some of these students, whom Fayles believed couldn’t manage a task exceeding a quarter of an hour, had turned in creative and sophisticated ten-page proposals, some of which would actually make intriguing novels.

Fayles, he thinks to himself, mistakes activity for learning, one of the side effects of the “student centered” pedagogies of the 1970s. Though there is much to be said for Paulo Freire and his Pedagogy of the Oppressed, it is essentially a political statement, and one that moves classroom pedagogy in one particular direction for reasons that have as much to do with desire for cultural change as with the real needs of teaching and the learning implied by the act of teaching. It has led to the confusion of learning and doing, placing (for example) an undue emphasis on small groups, short writings, and other in-class exercises. These have their place, of course, but they need not dominate every classroom. In fact, they should not. What Fayles is promoting, furthermore, is actually a perversion of Freire, for she is turning his methodology into a means for furthering oppression rather than stopping it.

Fayles: Really, I don’t see why you should do the reading aloud, though you are very good at it. Have the students do it. It’s good for them.

How patronizing, thinks Stamper, still silent. Anyway, I am not teaching reading, but am trying to show a group of students who have never seen it the beauty prose can rise to. The passage I read was short, no more than a page or two, and my purpose was for the students to hear the skill of the author and the beauty of the phrasing without my telling them. There are times when it is appropriate and useful for students to read aloud (I use play cuttings read by students, for example, when teaching drama), but this was not one of them. Fayles, why have you put such a blanket rule in place, stating categorically that, if text is to be read, students should read it? I prefer a much more expansive and flexible view of the classroom. There are, one might say, more arrows than one in my quiver—and I choose the one best for the situation.

Now that I think on it, I wonder if she has actually read Freire, or has simply heard tell of his describing and debunking what he called the banking model of education, where passive students just give back interest on what they had been given—or worse, simply regurgitate what they have taken in. This was part of a simplistic concept of audience present from the 1950s through the 1980s, and not only in regard to the classroom. Watchers of television and movies were also considered passive receptacles. However, readers of books, for some reason, were not. It was rarely recognized that watching could be just as active and intellectually stimulating as reading. We are beyond that now, most of us, and realize that lack of physical motion is not lack of intellectual activity. Fayles wants me to go backwards; it’s not going to happen.

The movement towards “student-centered” classrooms was a response, in part, to what was seen as a patronizing, paternal system of education that, in many eyes, amounted to indoctrination, not education. The irony is that, today, it is people like Fayles, insisting on the Freire-influenced classroom, who are being accused (by critics on the political right) of indoctrinating rather than educating. The accusers, though I hate to admit it, are right to this extent: any attempt to enforce a cookie-cutter model does lead to conformity and not to thought. And the older models of education were not nearly as indoctrinating as many, in the heat of a political moment, came to believe.

Teaching by example of knowledge and enthusiasm, as the best lecturers have always done, amounts to something quite different from indoctrination. And it is a necessary element in a good education—though never the only one. Not every course should be a lecture/discussion of the sort Fayles observed, just as her model, while admittedly useful in certain contexts, should not be universal. Many of us, when we think back to the teachers who influenced us most, find that they were the ones who lectured and discussed—with fervor and finesse. We weren’t indoctrinated by these teachers, but were led by their passion to explore on our own. When we decide that such leaps into their own learning are not possible for our own students, we demean those students and block access to an important element of education. It was good enough for us, we are showing, but is beyond what they can handle. That is unacceptable.

Since I began teaching, Stamper thinks, trying to be patient, observations have been my bane. The checklist of small groups, exercises, and constant shifts in activity that has become the observation staple (to the point where students make jokes about their professors adding these things to the class only when being observed) has become quite stale. I will not lower myself any longer—as well I could—to playing this particular game, certainly not for an observer who does not recognize that both times and students have changed.

Small groups were new and unusual in the 1970s, and students saw them as a refreshing shift from the teacher-centric classroom they had been familiar with. To many of today's students, however, the small group is something they “suffered” all the way through high school. In fact, Stamper knows, all of the parts of that checklist are things now more common to high school than to college. Today, if students are to move beyond their high-school behavior, they have to be treated as something other than high-school students, utilizing methodology other than what they earlier experienced, methodology more demanding upon them. Methodology moving them forward in their education, not simply providing the same thing over and over again, class after class.

There's that other factor, of course: the computer. In a year or two, more than a quarter of students nationally will be taking classes that are at least partially online. Such classes are necessarily task oriented, and many of them have to follow the Fayles model. Simply to survive, on-campus classes are going to have to offer things online classes cannot. We need, Stamper believes, to accent the instructor in the classroom today, not further reduce her or him to the "facilitator" that some online programs actively promote. The leadership, the broad knowledge, and the enthusiasm that a professor can show in the classroom do not come through so well (at least, not in the same way) in online situations, so they should be emphasized if the "real" classroom is to survive.

Yet we must be careful, Stamper warns himself, in what we “say” to students in our classrooms in other ways. Tasks of the sort Fayles wants utilized in the classroom are seen, more often than not, as onerous by the students. These do not engender a love of the art being studied, but can even lead in the opposite direction. In a course where a skill (such as writing) is the central focus, task-oriented classrooms are essential. In a course where the goal is much broader, tasks of the sort Fayles insists on can deaden student enthusiasm for, and appreciation of, the art. Certainly, they do little to enhance it.

Fayles: One of the basic rules of teaching is that students must be engaged at all times. I saw students drifting in and out of the discussion. You need to draw them all in.

What, she wants me to work down to the lowest rather than challenging the highest, doing so in a way allowing the lowest to rise as well? Either way we do it, we risk losing some of our students. I’d rather, in this class, that risk be at the bottom than at the top. Yes, I like teaching remedial classes, too, bringing the struggling students to the point where they can attempt college work… but not every class should be like that, focusing on the lowest common denominator.

Fayles: You need to be careful with the things you say, or you will lose the students. You should have explained the “butterfly effect” when that came up. Some of the students probably don’t know what it is.

More dumbing down, she wants? No thanks. What she is asking, again, is that I stoop to what she believes is the level of my students, not demanding more from them than they are used to giving—any of them can find out what the “butterfly effect” is quite easily by asking others in the class or looking it up online. After all, the mention wasn't mine, nor was understanding of it essential to the point being made. The students don't need to be spoon-fed such things, anyway. My feeling, again, is that more can be gained by demanding the students rise than by lowering myself. Sure, a few students will be lost—but as many (if not more) will disappear if I dumb things down—and all of them will be poorly served.

As she talks and Stamper does not respond, Fayles becomes angry, more and more so with each stony lack of response to each comment. When her officemate comes in and starts puttering around, she stops, waiting for the other to leave. Stamper finally speaks, telling her it is OK if the other overhears. Fayles, trying to smile, says it is not OK with her—and asks her colleague to withdraw. Once they are again alone, she continues, her frustration with Stamper clearer than before.

Fayles: During the class, you brought up World War II a number of times. That was a mistake. Our students have little knowledge of history; some confuse the Civil War and World War II. It’s best to avoid history unless you are going to teach it.

Understanding history, Stamper believes, is necessary for understanding literature. He has been laying out the basics necessary for the texts being covered since the beginning of the semester. If he were to follow Fayles’ advice, he would have to teach different texts, probably much simpler ones. And that would not suit his purpose. We serve our students poorly, he believes, when we don’t open up the unknown country.

This, he tells himself, is getting ridiculous. But, boy, is she steamed!

Fayles: And bringing in 9/11? That was gratuitous, facile, and unnecessary. There is no reason to talk about something like that in a literature class.

What are you talking about, thinks Stamper, forcing himself not to respond. 9/11 was the most significant common event in the lives of today’s students. I wonder if she would have said the same in 1969 about the assassination of JFK, the same number of years earlier. 9/11 needs to be a part of our teaching for quite a number of reasons, including the simple fact that it can be used to open all sorts of doors. Our students, quite naturally, are interested in it; they perk up and listen, making 9/11 an effective entry into any number of topics.

This isn’t ridiculous… it’s stupid. He stifles a sigh.

Fayles’ anger is now clear in just about everything she says, her words becoming more and more accusatory in face of Stamper’s determined lack of response. This young man just isn't listening, she realizes, isn't respecting the experience that she brings to interaction with these students. But she goes on anyway.

Fayles: Another problem was that you didn’t ask the students enough questions, and did not call on specific ones. You need to drag them into the conversation, sometimes! And you should never answer the questions yourself. You did that at least twice.

My goodness, more high school? It becomes like high school if I force students to squirm under my eye as I put them on the spot. And I don't believe that is effective pedagogy. All it does is embarrass the student. The last thing I want is for the classroom experience to be actively painful. I try to build a comfort zone into the classroom… which is one of the reasons my students show up. Maybe you didn’t notice, Fayles, but all 30 of them were there.

What time is it? Five minutes to the hour. Ah, good! I’ve an excuse for getting out of here and I had better use it—or I’ll end up saying what I think, and that won’t get us anywhere. Though it has to be said, this is not the place.

Stamper: I’ve got to go teach.

He stands and leaves without another word.

Tuesday, November 20, 2007

Look What They've Done to My Song, Ma

In one of those bits of serendipity that, when you examine them, really have more to do with a greater cohesion, David Brooks' column today deals with American music—just the topic of discussion in my composition classes, yesterday and today. Brooks, however, bemoans the splintering of the music. I see things differently.

Brooks has been speaking with 'Little Steven' Van Zandt, guitarist with the E Street Band:

He argues that if the Rolling Stones came along now, they wouldn’t be able to get mass airtime because there is no broadcast vehicle for all-purpose rock. And he says that most young musicians don’t know the roots and traditions of their music. They don’t have broad musical vocabularies to draw on when they are writing songs.

As a result, much of their music (and here I’m bowdlerizing his language) stinks.

He describes a musical culture that has lost touch with its common roots.

Later, Brooks says Van Zandt has “drawn up a high school music curriculum that tells American history through music. It would introduce students to Muddy Waters, the Mississippi Sheiks, Bob Dylan and the Allman Brothers. He’s trying to use music to motivate and engage students, but most of all, he is trying to establish a canon, a common tradition that reminds students that they are inheritors of a long conversation.” I did something similar in 1993, the last time I taught high school. In fact, I actually taught the course, doing more than simply concocting a curriculum. And I use music in my writing classroom frequently, doing things like introducing students to the backgrounds of rap music in poetry and song going back (believe it or not) to Beowulf, where the 'sprung rhythms' (to use the phrase that Gerard Manley Hopkins created for his own nineteenth-century poetry) and alliterations show an accent on the beat that was later masked by the stricter cadences of the poetic forms introduced with the French of the Norman invasion. I recite for them Woody Guthrie's “Talkin' Dust Bowl,” Bob Dylan's “Subterranean Homesick Blues,” among other things, and tell them about the story-telling of country music. And I talk about Amiri Baraka's Blues People, written when he was still known as LeRoi Jones. And much more.

In other words, I know a little bit about the history of American music--and love to share it. The so-called 'American Songbook' is familiar to me, as is the jazz of the likes of Carla Bley and Paul Motian, as is the boogie piano of Roosevelt Sykes. Not only that, but my mother is a classical musician and I grew up surrounded by her music (it was first hearing Leadbelly's 12-string guitar that shocked me into another listening direction). I have heard the mermaids singing, in other words, each to each—and have paid attention.

One thing I have learned from my obsession with music is that its history is one of constant flux, of melding and separation. It always has been thus. There has never really been the kind of cohesion or national musical 'language' that Van Zandt laments in the quote from Brooks above. When I was a kid in the 1960s, there were few I could talk to who would know both Professor Longhair and Phil Ochs... or either one, for that matter. Rare was a songwriter like Bob Dylan, who soaked up influences like a sponge. Most people—and even most musicians—knew little beyond the particular genre they worked within.

How many American musicians of the 1960s were familiar with Ska? Or John Cage? Or Johnny Gilmore of Sun Ra's Arkestra?

Very few.

Every generation laments the passing of a golden age that was somehow more nuanced than what we see “today.” In 30 years, David Brooks' replacement will be interviewing Kanye West about how musical knowledge has dwindled, about how much broader the influences were, back in the day, that first decade of the century.

Brooks calls what we have today “the segmented society.” As an amateur historian of American music, I can only laugh. He conveniently ignores the fact of “race” records, an enforced segmenting with a power greater than anything around today. He forgets (or never knew) that many classical musicians resolutely refused to listen to jazz (Leonard Bernstein had much to do with changing that—in the 1940s). If anything, though the average person knows no more today about the history of music than she or he did fifty years ago, the possibility of exposure to a wide range of music is greater today than it has ever been. Rappers sample Buffalo Springfield as well as Bootsy Collins—and some of them can talk in detail about Monk or even Mozart.

Yes, it is true that there are few today who can speak broadly of “Bessie, bop, or Bach”--but, Mr. Brooks and Mr. Van Zandt, it was ever so.

Monday, November 19, 2007

Race: Still the Base in American Politics

The recent endorsement of Rudy Giuliani by Pat Robertson is called “unlikely” by some—but, make no mistake about it, it is not. For all the talk of “family values,” “protecting America,” abortion, and whatever else, the real dividing line in American politics falls along questions of race. It has been so throughout my lifetime (I was born in 1951), the only change being in alliances. Giuliani became mayor of New York City through race (David Dinkins, the black mayor he replaced, probably did more than Giuliani to redirect NYC, but he was black—and a nice guy, but that's another story). He is so popular in much of America not simply because of 9/11 (what exactly did he do to stop terrorism, anyway?) but because he “tamed” New York. He put the blacks and other non-whites in their places, that's what he did.

Look at Robertson's career: though rarely making African-Americans the focus of his attacks, his ministry parallels the rise of the “southern strategy” used by Republicans in the wake of Democrat-led successes in civil rights in the mid-1960s. Before that time, southern Democrats were reliably racist and it was a coalition of liberal Republicans and northern Democrats that made the civil-rights legislation possible. The result of the “southern strategy” was a shift that saw the disappearance of liberal Republicans and a move of southern Democrats to the Republican party—and the rise of what is seen as a newly vigorous conservative movement. It should be no surprise that Robertson feels comfortable with Giuliani, for their differences are ultimately peripheral. Robertson, using the coded terms of the “southern strategy,” was a part of the movement to use race to bring the Republican power to dominance—just as Giuliani will prove to be, if his campaign succeeds.

As was Ronald Reagan. In fact, he was probably its ablest practitioner.

There has been a bit of a donnybrook recently on the op-ed page of The New York Times. It started with another of those attempts to pretend that the whole “southern strategy” is simply a “slur” on the Republicans and, in the instance under consideration, on Ronald Reagan. The columnist, David Brooks, writes:

The distortion concerns a speech Ronald Reagan gave during the 1980 campaign in Philadelphia, Miss., which is where three civil rights workers had been murdered 16 years earlier. An increasing number of left-wing commentators assert that Reagan kicked off his 1980 presidential campaign with a states' rights speech in Philadelphia to send a signal to white racists that he was on their side. The speech is taken as proof that the Republican majority was built on racism.

Like many contemporary apologists for Reagan, Brooks attempts to shift the blame to his “strategists” but then tries to exonerate them as well. The wink and nod Reagan delivered by appearing at the Neshoba County Fair and using the term “states' rights” were, in the vision Brooks tries to build, purely accidental. He fails to mention (though he surely knows this) that such coded signals about which side of the racial divide one stands on were a major part of a strategy that, by 1980, was already a decade and a half old.

Four days after Brooks' November 9 attempt at revisionist history, fellow columnist Bob Herbert took him to task, though without naming him. He wrote:

Reagan was the first presidential candidate ever to appear at the fair, and he knew exactly what he was doing when he told that crowd, “I believe in states’ rights.”
Reagan apologists have every right to be ashamed of that appearance by their hero, but they have no right to change the meaning of it, which was unmistakable. Commentators have been trying of late to put this appearance by Reagan into a racially benign context.
That won’t wash.

Unfortunately, it does wash—in the eyes of the millions of racist Americans who have subsumed their hatred under a patina of coded red-herrings and internal misdirection. Whether Herbert (or I) like it or not, the justifications and explanations those like Brooks provide allow people to hide their racism while continuing to support essentially racist policies—which, I suspect, is what led Paul Krugman to jump into the fray today.

Krugman took Herbert's column, which dealt with the murders that made Philadelphia, Mississippi iconic in the American racial debate, and brought the issue smack into the current political climate:

More than 40 years have passed since the Voting Rights Act, which Reagan described in 1980 as “humiliating to the South.” Yet Southern white voting behavior remains distinctive. Democrats decisively won the popular vote in last year’s House elections, but Southern whites voted Republican by almost two to one.
The G.O.P.’s own leaders admit that the great Southern white shift was the result of a deliberate political strategy. “Some Republicans gave up on winning the African-American vote, looking the other way or trying to benefit politically from racial polarization.” So declared Ken Mehlman, the former chairman of the Republican National Committee, speaking in 2005.
And Ronald Reagan was among the “some” who tried to benefit from racial polarization.

Krugman, who must have been feeling rather optimistic when he wrote this column, sees the racist base beneath the rise of conservative political power as containing the seeds of its own destruction: the power of racism is receding in America, he believes, though slowly (receding, principally, through demographic changes that will soon reduce the white majority to a plurality).

Make no mistake about it: the rise of conservative power in American politics is tied inextricably to race, as any honest examination of Reagan's appearance at that 1980 county fair (among thousands of similar incidents) makes clear. The continuing potential of appealing to the racial divide will be obvious as the 2008 presidential election continues to make what seem to be strange bedfellows—such as Pat Robertson and Rudy Giuliani.

For it's really not so strange. Robertson and Giuliani are both appealing to a racist base that will come out to vote in response to their coded appeals—just as it did for Richard Nixon, Ronald Reagan, and George W. Bush. Whether or not this strategy can still succeed remains to be seen.

Wednesday, November 14, 2007

Conspiracy in the Classroom: Oh My!

This semester, I am teaching a course under the generalized rubric “Perspectives on Literature” that I call “Alternatives.” It’s a cobbled-together course, taken on at the last minute and soon after I had sworn never, ever to teach literature again. Not surprisingly, it is proving to be the most successful course I have taught in years.

As anyone who has organized a class in a hurry can attest, it is best, in such circumstances, to begin with what one knows. As any successful teacher understands, it is also necessary to challenge yourself, to make sure that you are doing something new or in a different way, so that you won’t get stale and so you can have a little fun—and maybe even learn something. So, I started the course off with Philip K. Dick’s The Man in the High Castle, an alternative-history novel by the author I wrote on for my doctoral dissertation. I followed that with another novel in the genre, Ward Moore’s Bring the Jubilee, a somewhat more straightforward work meant to be something of a breather before Philip Roth’s The Plot Against America, which I had first assigned in an independent study a couple of years earlier. Taking alternatives into the future, I followed that with Alan Moore and David Lloyd’s graphic novel V for Vendetta, a change both of pace and direction and a set-up (believe it or not) for Shakespeare’s The Tempest—where we are now. Next will come Thomas Pynchon’s The Crying of Lot 49; we will end the semester with Vladimir Nabokov’s Pale Fire. I also initially planned to bring in poetry, and have done so, including such works as Andrew Marvell’s “To His Coy Mistress,” e. e. cummings’ “pity this busy monster, manunkind,” W. B. Yeats’ “Sailing to Byzantium,” and much more.

The first paper assigned was a book proposal, each student having to present the scenario for an alternative-history novel that they might write. For the second paper, each student will explicate a part of The Crying of Lot 49, their work together forming something of a reader’s guide to the book. The midterm required them to write on two of the first three novels; for the final, they will have to find parallels between V, Prospero, and Kinbote.

The students are loving the class. Oh, not all, but I rarely have an absence—and we meet at six o’clock in the evening. The midterms were good, and some of the novels proposed in their first papers really should be written.

One of the most delightful things about this course is that it is affirming for me that the students here at New York City College of Technology really are better than some of my “older” (in terms of tenure here) colleagues give them credit for being. I’m treating them as intelligent and able college students, and they are rising to the challenge—no matter that dreadful educational mills have ground up and spat out quite a number of them.

More interesting, however, is the fact that I seem to have stumbled onto an organizing topic particularly suited to students today.

Most of us professors have been out of school for some time, and long ago went through our own conspiracy-theory phases, many of us doing so in the 1980s, when The Crying of Lot 49, in particular, was something of a rage for lower-level literature courses and curiosity about Watergate and the Kennedy assassination lay behind much of our political discussion. Today, we tend to find such topics passé and cliché, scoffing at the conspiracy theories concerning 9/11 and the moon landing—let alone all that blather about Area 51 or Roswell, New Mexico.

Our students today, though, approach conspiracy from a perspective different from that of earlier times—or from the “nuts,” be they apocalyptic visionaries or refugees from images of bureaucratic intrusion. They know that just about all of the information they are given is questionable, and so approach everything with caution. What they don’t have is an apparatus for determining the validity of what they are shown.

It is this they seek, a way to evaluate data—not the hidden data that was so much the focus when “we” were young.

To many contemporary college students, the existence of conspiracies of some sort or another is a given. They read Dan Brown’s The Da Vinci Code without being convinced or dissuaded. Strangely enough—or maybe not—they have a strong understanding of ambiguity and uncertainty. Over the past few years, they’ve seen a war sold to the nation through definitive statements about WMDs and mushroom clouds—only to watch the rationale unravel. They’ve become cynical about the information they are provided—but they don’t want to be. Rather, they would love to be able to unravel the tangled web of deception themselves.

Here, once more, it is the process that is proving more important than the product. The “fact” of conspiracy is, to many contemporary students, a given—even to the point where they get bored when another conspiracy theory pops up. They don’t want to hear about it.

But, when they come across tools they can use for debunking or—more fundamentally—for understanding the thinking behind the theories, they perk up.

There’s a new book out by Christine Borgman called Scholarship in the Digital Age: Information, Infrastructure, and the Internet (an interview with her appeared today). Borgman is writing to and about professional scholars, but I think much of what she is saying applies to student scholars as well. In both cases, right now, it is the how, and not the what, that’s becoming increasingly interesting.

Yeah, we do care about the subject matter but, in the knowledge explosion we are experiencing right now, the manner of evaluating the thing becomes as important as the thing in itself. Conspiracy theories, then, become particularly interesting, for they are based on differing ways of viewing and understanding information.

Is Charles Kinbote really just Professor Botkin gone mad, or an escaped and hidden deposed king? Or is he some other sort of madman—or sane man—entirely? Or a figment of John Shade’s imagination? Certainly, he comes from Nabokov’s imagination, but so what? The answers to these questions, my students are coming to understand, are far less interesting than the process of discovery that might or might not lead to an answer.

In a sophisticated information age, my students are rising to steal the bait from the hook, taking away the knowledge about the process of gaining knowledge without getting caught up in the conspiracy theories that raise questions of the validity of knowledge in the first place.

More power to them!

Monday, November 12, 2007

Academic 2.0: Moving Web Skills into the Classroom

The following is the draft [updated 11/13/07, with thanks to Sherman Dorn and Tim Barrow] of a talk I'll be giving at 1:30 this Thursday, November 15, 2007, for a session at the annual convention of the National Council of Teachers of English at the Javits Center here in New York City. The session, which I created and will chair, is entitled "Up from the Streets: Melding Diversity Through Technology in the Writing Classroom." Four of my colleagues, Julian Williams, Annie Seaton, Mark Noonan, and Charles Hirsch, will also be presenting papers:

That student looking down at her hands in her lap? Texting her sister in Sheboygan. That other one, so diligent behind his notebook computer? Emailing mom in Mumbai.

So, what do we do? We get angry. “Turn off all electronic devices in the classroom!”

But what have we done? Focused the class, perhaps. But what else?
Two things: first, we’ve reinforced that the classroom is a special zone cut off from the rest of the world in the way a movie theater or a playhouse is. Second, we’ve reinforced the idea that communication is a distraction in a writing classroom dedicated to… ah… communication.

Let me tell you a little story, related to that first point: on 9/11/2001 I was teaching just off the Brooklyn side of that stone-towered bridge they make such a fuss about. Cell phones were off.

Sirens started. From the classroom window (which looks northeast), I could see emergency vehicles congregating over by the entrance to the Manhattan Bridge; I thought maybe something had happened at the McDonald's there. Sirens not being particularly unusual, I kept class going.

It wasn’t until our break that phones went on and the beeping started.

So, we’d been there talking about, I don’t know, topic sentences or elaboration or something else that supposedly had to do with communication while people had been desperately trying to communicate with us. So focused was I on teaching communication skills that I had cut off communication—on the day when human contact mattered most.

Ah, the irony!

Last year, the EDUCAUSE Center for Applied Research published a Study of Undergraduate Students and Information Technology. Among the findings:

Today’s students spend a lot of time using a raft of gadgets and a lot of time online. While the average respondent reports spending 23 hours per week using various technologies, more than one-quarter of male respondents report using electronics more than 30 hours per week….

Undergraduates are communicators. Nearly all (99.9 percent) create, read, and send e-mail and more than 80 percent send instant messages, most of them doing it daily. They use their arsenals of electronics to write documents for coursework (98.8 percent), search the Web and institutional library (94.0 percent), and create presentations (90.8 percent). (ECAR)

They are using these things… and their devices have become an integral part of our world. Cut the devices off from our classrooms, and we cut our classrooms off from the world—not good, as I found in that classroom on 9/11, a mile from Ground Zero but with no idea what was happening.

Now, to be fair, I am simply trying to make a point about technology and the classroom, not to argue for cell phone use there. My point has more to do with babies and bathwater than with the need for a certain amount of decorum in classroom behavior. We tend to see the negative side of the use of technology by our students, forgetting that there’s a positive in there somewhere—if we’re willing to reach in, pull it out, and dry it off.

Strangely enough, when we’re not looking at technology as disruptive, we often lean the other way, working to make technology the centerpiece of the classroom. Even in our “smart” classrooms the apparatus looms, its presence dominating any event in the room. Computer classrooms are worse: row after row of desktops presents what amounts to a moral imperative to use them, and use them often. Yet our new technological devices are not supposed to be our classrooms any more than their smaller cousins should be allowed to disrupt them.

Our students, who increasingly have grown up with the technologies that we teachers find so new and fascinating (who are already, as I like to say, “neterate”), understand this, perhaps better than their elders:

As a whole, younger respondents and female respondents to the ECAR survey prefer less technology in their courses than others. This finding suggests that while younger students arrive on campus with a lot of IT tools and self-described skills in IT-mediated communication and recreation, they are comparatively unskilled in IT to support academic purposes. (ECAR)

I would word that somewhat differently: This finding suggests that the students are recognizing that their instructors have a different, and much less sophisticated, view of technology—and are bored by the way old fogey teachers use it in the classroom.

Our students are onto something we've only now started to recognize: we are in the midst of a new technological leap, one shown in their current attitude toward that staple for older students and, now, practically everyone else, the notebook computer:

While respondents appear to resoundingly prefer laptop computers and some are loading up on PDAs and smart phones, they are largely not bringing computers to class. Most respondents (70.3 percent) never bring their laptop computers to class, and only 14.5 percent do so weekly or more often. Even 16.2 percent of responding students who are enrolled in courses that require a laptop fail to bring these devices to class. The weight of laptop computers and the risk of their theft are cited frequently as reasons students do not bring laptops to class. (ECAR)

Plus, there's little status in the laptop or notebook anymore. Once it was a sign of the technically savvy. Now, it's the albatross of the passé techie bore.

Consider that, in many respects, the laptop is being left in the dust—right now.

Just look at what is happening to Nicholas Negroponte's One Laptop Per Child project: it is now slated to become a techno-cultural curiosity, superseded by the developing world's embrace of the cell phone or by whatever develops out of Asus's Eee PC, the commercial version of the OLPC computer. Though I think the future will arrive through today's hand-held devices and not through the laptop, it could easily come the other way. In either case, the computers we use now may become antiques even faster than my old Osborne portable did, twenty-five years ago.

Yes, the cell phone. Even here in the United States, it will soon replace the laptop. Just hear this, from The New York Times ten days ago:

Google Makes Its Entry Into the Wireless World
Google took its long-awaited plunge into the wireless world today, announcing that it is leading a broad industry alliance to transform mobile phones into powerful mobile computers that could accelerate the convergence of computing and communications….
The technology is expected to provide cellular handset manufacturers and wireless operators with capabilities that match and potentially surpass those using smartphone software.... In contrast to the existing competitors, Google’s software will be offered freely under “open source” licensing terms, meaning that handset manufacturers will be able to use it at no cost and be free to add new features to differentiate their products….
[A spokesperson] said the [Google] open-source strategy would encourage rapid innovation and lower the bar to entry in the highly competitive handset market, where software accounts for an increasing share of the cost of making a phone….
[The spokesperson] also said that in the future, the Google technology could be used in other portable devices, including small hand-held computers and car navigation systems.
Google’s phone software is named Android.

Hear that? “Including small hand-held computers.” The cell phone is going to become those computers. An Associated Press article from November 4th says the process has already begun, in Japan, at least:

The PC's role in Japanese homes is diminishing, as its once-awesome monopoly on processing power is encroached by gadgets such as smart phones that act like pocket-size computers, advanced Internet-connected game consoles, digital video recorders with terabytes of memory.

Our students and the hundred million people with cell phones in Africa (just to pick another place) understand what is happening, even if they aren't yet verbalizing it.

In a November 7 editorial, The New York Times states that:

What Google seems to be envisioning — apart from a greatly expanded market for ad sales, of course — is software for mobile devices that will be more flexible and innovative than most of us are used to seeing on our cellphones.
The impact on the phone market could be enormous. The mobile world is currently shaped mainly by the carriers — the networks that provide the connections. Google aims to turn this around, to put the software first, and to open the development of the software and the phones themselves to third parties. The end result could well be a more richly and fully integrated universe of mobile devices — “smartphones” that in many ways resemble hand-held computers — and greater choice for consumers....
Another winner is likely to be innovation. Google’s new model is betting that more minds are better than fewer, and that the future of the cellphone lies less in the phone itself than in its role as a tiny computer capable of connecting in any number of ways to the world — real and virtual — around it.

Innovation. While we're still thinking in terms of smart classrooms and computer labs, and envisioning new ways of keeping personal devices out of the classroom, we are being passed by. The innovations that marked the last 25 years were the PC and the Internet. Even those of us who embraced these dynamic changes early on have now slacked off, seeing them as the immutable center. We are stuck inside of Academic 1.0 while a whole new Academic 2.0 rises around us.

The conductors of the ECAR survey:

speculate that communicating—socially, recreationally, professionally, and academically—via a wide variety of communication technologies may be so interlaced with the student experience as to be increasingly inseparable by [survey] respondents as an educational outcome. (ECAR, 85)

To the students, the Web 2.0 experience encompasses even their academic experiences. Yet we in the academy still cling to the idea that our uses of technology are somehow removed from the rest of the world—witness the popularity of Blackboard, a stand-alone application resolutely going against the philosophical grain of Web 2.0. The reasons for this are simple: we want to control the academic experience (and, for a number of legal reasons, have to control some of it), and we have grown comfortable in an ivory-tower existence that keeps us remote from the “real” world.

Institutionally, Brad DeLong's idea of the Internet “invisible college” extending well beyond ivied walls scares us.

However, as developmental focus turns more and more to hand-held devices, we are going to have to relinquish control (to some degree, at least) over technology in our classrooms, entering into a Web 2.0 experience for our Academic 2.0 purposes, making use of what the students are doing and finding instead of finding things for them to do ourselves. This is going to force us to enter into our students' worlds in a manner never before necessary.

The title of this panel, “Up from the Streets: Melding Diversity through Technology in the Writing Classroom,” refers, in part, to the fact that what we should now be looking to make use of increasingly comes from the students—both in terms of technology and experience. If we don't recognize this, we are going to lose the attention of the students. They, for example, are already involved in new kinds of code-switching, using different “voices” for texting, for email, and for academic writing. Even texting itself, which many of us teachers see as a debasement of written English, is developing its own codes, markers that indicate the writer's position vis-à-vis a particular group, ethnic, racial, religious, or otherwise. Our students may be quickly becoming more "neterate" than we!

We need to see that and use it.

Thursday, November 01, 2007

Blogging America: The New Public Sphere

Below is a press release relating to my forthcoming book:

Blogging's Role in Resurrection of 'Citizen Journalism'
Explored in New Book by City Tech's Aaron Barlow

Brooklyn, NY -- November 1, 2007 -- Cyberspace is a far cry from plowing with oxen in West Africa, but the Internet is where Aaron Barlow -- formerly a Fulbright Scholar at the University of Ouagadougou, Burkina Faso, and a Peace Corps volunteer in neighboring Togo -- spends a lot of time these days, and he brings a powerful message from out there.

Dr. Barlow, an assistant professor of English at New York City College of Technology (City Tech), is the author of a book due out at the end of this month, Blogging America: The New Public Sphere (Praeger), a study of blogs as they are now. (For those unfamiliar with Internet jargon, a “blog” is a Web log -- a personal site to post information for all the world to see and comment on.)

Barlow’s current research centers on what he calls "neteracy" (the ability to effectively negotiate the Web) and "massed media" (media coming from the people). Influenced by the work of thinkers Jürgen Habermas, B. F. Skinner and Walter Ong, Barlow talks about the growing need for people to become “neterate” as well as literate, to take full advantage of the Web. “I also consider changing attitudes toward communications technology, especially as technology increasingly falls into the hands of end users; this was not expected at all 50 years ago,” he says.

His previous book on the subject, The Rise of the Blogosphere: American Backgrounds (Praeger Publishers/Greenwood Publishing Group, Inc.), published this past March, was the first to provide readers with a cultural/historical account of the blog. It traces the evolution of American journalism from colonial times to the present, and examines factors leading to the blog explosion. In it, Barlow refers to Benjamin Franklin as "the patron saint of blogs," and dubs Thomas Paine a revolutionary “citizen journalist.”

Barlow, a Lefferts Manor, Brooklyn, resident, off and on, for 37 years, has criticized media trends such as news presented as entertainment, while emphasizing journalism’s role of contributing to informed public debate on issues. Though mainstream media may now shirk that role, Barlow sees Web journals and blogs as helping to form public opinion and to give individuals a voice on critical issues by "broadening the public sphere, bringing popular opinion back into our national debates.”

Raised a Quaker, Barlow was "always aware of political activism. The civil rights movement and the anti-Vietnam War movement colored my youth.” He was a citizen journalist even before that label existed. At 11, he began learning printing methods. In high school, he launched an “underground” paper, and a year later printed his first book on an old letterpress. As a graduate student, he edited and wrote for a monthly environmental publication. His doctoral dissertation at the University of Iowa focused on science-fiction writer Philip K. Dick, whose work has been made into numerous films.

In 2005, his book The DVD Revolution: Movies, Culture, & Technology was published. It was when he began exploring online film culture for that project that he first considered studying the blogosphere.

Barlow says that his background -- which includes living in 11 states and three foreign countries, working with the previously mentioned oxen in the Peace Corps, and founding and running a store and café -- "has allowed me to interact with all kinds of people and has heightened my understanding of the importance of grassroots expression."

It helped develop his commitment to the volunteer citizen journalist organization ePluribus Media, first as a founding member and now as a writer and editor for the ePluribus Media Journal. This year he represented the group in the prestigious Punch Sulzberger News Media Executive Leadership Program at the Columbia School of Journalism.

Barlow says he is honored to be involved in an effort that puts him in contact with people in journalism active today in changing the profession. That involvement has also proved useful in his teaching. “I am seeing first-hand the changes that commercial news media are going through as they try to adapt to a world that includes the Internet. This has been one of the most intensive learning experiences I’ve had in years!”

This past spring, Barlow delivered a presentation on the theme of "Orality and Literacy: The Next 25 Years," at the annual Computers and Writing Conference, held in Detroit. Because of new technologies, he noted, the teaching of writing can no longer be based on working solely within the literacy tradition. "Students must be able to negotiate the virtual world, and teachers must use technology as an aid to student writing," he says.

In his writing courses at City Tech, where he has taught since fall 2006, Barlow finds that students are quite adept at social blogging on MySpace, etc., "but they don't yet see blogs as an extension of their interests. What a student needs in approaching writing through the Web is a sense of confidence, of being part of the conversation.”

He encourages writing instructors to focus on citizen journalism when teaching about research as an aspect of writing. "We need to bring writing and research to life for our students," he says. "We are developing the methodology of tomorrow’s journalism.”

The role of blogs and the future of journalism are still open-ended, though, and even Barlow can’t predict the next phase. “I wish I knew what impact blogs will have on American society! Changes in technology are coming fast and furious, and I don't know what the results will be,” he notes.

New York City College of Technology (City Tech) of The City University of New York is the largest public college of technology in New York State. The College enrolls more than 13,500 students in 57 baccalaureate, associate and specialized certificate programs. Another 15,000 students enroll annually in adult education and workforce development programs, many of which lead to licensure and certification. Located at 300 Jay Street in Downtown Brooklyn, City Tech is at the MetroTech Center academic and commercial complex, convenient to public transportation.