Thursday, June 28, 2012

The Changes Around Us

One of the things I have noticed over the past few years is how much easier the type of research I like to do is getting. No longer do I have to go to one of the massive university libraries; no longer do I have to wait for inter-library loan. No longer do I have to put one avenue of investigation on hold while waiting to view hard-to-find sources. Because much of what I want to read first was published before 1923, almost everything I need is available on the web, generally from Gutenberg or Bartleby, if not from a university website.

Not only that, but the footnotes and bibliographies in my newer books often lead me to books I can buy for almost nothing, getting them much more quickly than I ever could before and removing the danger (for me) of forgetting to return them to their homes.

Last night, I lay down on the bed to do a little preliminary reading, my Kindle Fire and a stack of books beside me. As I read, I was able to look up additional information at my ease, finding and ordering--and downloading--other books. I was happier than a pig in... well....

None of this is new, of course, just easier. My laptop can do all the Kindle can--and more. But it is bulky and gets rather hot. Libraries, though I have to travel to them, offer almost everything I can find on the web.

Ease, though, is changing research in the humanities. The things that can be found quickly and easily on the internet are those that are going to get most of the attention from scholars--who are as lazy as anyone else. We are as bad as anyone about not refining our searches when we get a million hits, about looking at only the first five Google results.

Things behind paywalls are going to start to suffer from this--if they haven't already. With the amount of information available for free, fewer and fewer are going to be willing to pay for an article--certainly not the $30 or so that is often charged by the Elseviers of the world. The top journals remain profit centers, but they are beginning to lose their scholarly centrality--especially in the humanities (though, from what I hear, there's quite a big rebellion beginning in the sciences). For a long time, they have been the playpen of the old guard, the places where those in academic power can express themselves (though rather tepidly and within constraints established by past generations of scholars). Now, they are being bypassed by open online journals and even by blogs. Significant information can almost always be found elsewhere.

Unless they (and their back issues) are easily available on the web, humanities journals run the risk of becoming backwaters. This will certainly be true as alternative venues for scholarship become acceptable to hiring, promotion, and tenure committees--as is now happening. The Modern Language Association (MLA), seeing this, has taken an important step towards making sure that the work it publishes will not simply disappear by giving control of copyright to its authors. This will allow essays to be posted elsewhere by the scholars, allowing them to keep their articles within the greater conversations evolving through the internet.

The journals and publishers that don't follow suit will soon find authors reluctant to contribute. I would love to have any number of book chapters of mine available on the web, but am reluctant to post them, for I don't control the copyright. From now on, I will inquire before agreeing to submit, and may reconsider if I do not retain that control. After all, I am interested in readers and impact, not in the tiny bit of money an essay of mine might earn for anyone.

Things are changing. Not in any cataclysmic way, but changing nonetheless. We'll see if companies and copyright follow.

Wednesday, June 27, 2012

Pathways: What More Can One Say?


Statement to the CUNY Board of Trustees June 18/25 2012 

A Path Forward 

Sandi Cooper, chair University Faculty Senate 

One year ago in June 2011 you chose to ignore a range of protests from CUNY faculty regarding the proposed Pathways resolution, dismissing the faculty comments either as foot dragging, self serving, time wasting or trivial. One example was ridiculing a request from senior colleges for at least a four credit increment in the senior college option on the grounds that four credits were inconsequential.

Today you are faced with a crescendo of protests – NONE claims to dismiss the transfer issues but all assert that the proposed common core substitution for our existing well crafted general education is an embarrassing dumbing down of education to a sub par high school level. Indeed there are many high schools in this country which would not have tolerated this curriculum for an instant. It would not get their students into a first rate state university, never mind a private liberal arts college. Your determination to move forward – to support your central administration which forced this proposition on over 7000 full time faculty by selecting a few hundred from our midst – has produced the worst demoralization in CUNY since the 1976 retrenchment. You have pitted campus administration against elected faculty governance – with provosts sending in college proposals that never passed faculty curricula or senate votes; you have pitted departments and disciplines against each other; you have endorsed a resolution which now permits a student to take one semester of a foreign language but impossibly claims that the course will teach students to “Speak, read, and write a language other than English, and use that language to respond to cultures other than one’s own.”

The rigid insistence on 3 credit courses assaults the carefully designed programs at the College of Staten Island, which were authorized by the Board of Higher Education in 1967 – but you care little for history or precedent. Worse, this insistence mocks national standards of science and English teaching – largely eliminating laboratory work in the former and focused writing instruction in the latter. For a student body with highly mixed skills and next to no knowledge of history – US or other – you propose a 3 credit course in “world cultures” which can come from any one of the following disciplines: anthropology, communications, cultural studies, economics, ethnic studies, foreign languages (building upon previous language acquisition), geography, history, political science, sociology, and world literature. This course, in addition to the ludicrous objective of language fluency in one semester (which can be based on some distant high school experience) promises to “analyze the historical development of one or more non-U.S. societies.” Such an objective could only be included by someone who knows nothing about historical methodology, nor about the woeful national ignorance of American students regarding US or global history, or by someone who hated history in school. This muddled list – ONE course from any of those areas – insults by ignoring the variegated methodological underpinnings of each. Worse, it blocks students from experiencing an area which they may never have touched before and which might have become their major.

The common core math requirement which equates quantitative reasoning with college level math is obviously designed to push through that large group of students who struggle with mathematical concepts. Clearly this is an effort to expand graduation rates but at what cost to the student? Several years ago a previous vice chancellor identified math as one of the “killer courses” in CUNY; the common core expands that concept to include most of our disciplines.

One of the most respected members of the UFS Executive Committee gave voice to a whispered murmur – calling this curriculum an example of “soft racism” – lowering expectations for students to provide them with a paper credential instead of an education equal to accepted standards.

I find it interesting that the central administration has ordered all IT systems to move forward to create the platforms for pathways. Last year when we proposed adapting our IT to support a rational transfer system devised by the faculty, we were told that it would cost far too much to do so ... to reconfigure CUNYfirst so that a student could easily see what catalog requirements at a different college were. Now evidently the IT can be adapted. For me this is further evidence that faculty engagement was not wanted IF it was not controlled from the top. Any of us with independent ideas or data were to be marginalized ... thanked for our interest and dismissed.

In defending pathways I have heard some of you refer to your own negative college memories or to how you smoothly moved from two to four year schools. Anecdotes are amusing – and I can insert my own which recall 64 credits of general education at CCNY with tremendous fondness – but they should not be the basis of policy. Because a vice chancellor did not have general education requirements at an Ivy League and a chancellor regrets not having had more liberal arts at CCNY, because a previous trustee flunked a course 3 decades ago at Brooklyn College, producing our CUNY policy that allows a student to balance an F grade with another stab at the course to negate that F; because a few dozen students wave handmade placards at trustee meetings – none of this is evidence sufficient to undermine and insult the combined experience and training of over 7000 faculty who – I assert – know more collectively about higher education than an entire battalion of administrators and trustees. Moreover the gratuitous destruction of unique historical qualities of each campus is too high a price to pay for a problem which could have been solved more rationally, with more buy in and support, less top down micro management and propaganda and produce a result that did not embarrass us all. It is no benefit to students to send them forth into a very competitive world where they face college graduates from elite schools whose training and skills, whose facility with the English language and familiarity with global realities, clearly beat and defeat most of our students.

You have a simple solution to this needless and heedless policy. Let us salvage what is good about it, return it to the college faculties, to the elected senates, to work on as the faculty at SUNY and Cal State have and are doing. It needs at least two years; it cannot be implemented by fiat from the top if the bottom does not believe in it. We have senates, curriculum committees -- all authorized in the charters which you have voted on -- we have discipline councils; we have young, enthusiastic faculty who are afraid to speak out now just as some provosts are -- because of fear of retribution. The negativity prevailing now can be undone. The faculty are ready to move on.

Tuesday, June 26, 2012

The Physical College

In a New York Times opinion piece today, Jeff Selingo of The Chronicle of Higher Education lays out 'urgent needs' for American colleges and universities--but completely ignores the physical changes that would be necessary for successfully meeting those needs.

Selingo's 'needs':
  1. Improve usage of technology in the classroom;
  2. Offer more online instruction;
  3. Make 'academics' the top priority;
  4. Cut back on the quest for 'research' status;
  5. Make sure all courses a student takes count for the degree.
The last three I agree with completely. The first two? Well, they are laden with assumptions that I am not sure I can accept. First, they are built upon current visions of the structures of education, structures that center on the traditional classroom and its sage-on-the-stage extension into the digital world (what is a Massive Open Online Course, or MOOC, without the concept of the lecture?). Second, they assume that technology in the classroom and online instruction are two different things, treating the classroom walls as barriers.

In "Good-Bye, Teacher," Fred Keller describes a much more flexible system, one not bound by traditional concepts of the classroom:
[On Tuesday] John receives... instructions and some words of advice from his professor.... He is... advised that, in addition to the regular classroom hours on Tuesday and Thursday, readiness tests may be taken on Saturday forenoons and Wednesday afternoons of each week - periods in which he can catch up with, or move ahead of, the rest of the class.
He then receives his first assignment [with] "study questions", about 30 in number. He is told to seek out the answers to these questions in his reading, so as to prepare himself for the questions he will be asked in his readiness tests. He is free to study wherever he pleases, but he is strongly encouraged to use the study hall for at least part of the time. Conditions for work are optimal there, with other students doing the same thing and with an assistant or proctor on hand to clarify a confusing passage or a difficult concept....
On Thursday, John... decided to finish his study in the classroom, where he cannot but feel that the instructor really expects him. An assistant is in charge, about half the class is there, and some late registrants are reading the course description.... 
On the following Tuesday, he appears in study hall again, ready for testing... He reports to the assistant, who sends him... to the testing room.... The test is composed of 10 fill-in questions and one short-answer essay question.... 
[John's student proctor] runs through John's answers quickly, checking two of them as incorrect and placing a question mark after his answer to the essay question. Then she asks him why he answered these three as he did. His replies show two misinterpretations of the question and one failure in written expression. A restatement of the fill-in questions and some probing with respect to the essay leads Anne to write an O.K. alongside each challenged answer....
As he leaves the room, John notices the announcement of a 20-min lecture by his instructor, for all students who have passed Unit 3 by the following Friday, and he resolves that he will be there.
Rather than a structure bound by walls and hours, Keller's flexible suite of need-determined rooms makes for a learning environment that can make use of our new technologies indeed--and without removing what is so important in face-to-face instruction (face-to-face not just with instructors, but with fellow students).

If we are going to improve education, we can't just imagine technology as the way, the answer. We also need to re-examine our very ideas of "classroom," of "meeting," and of process (and more). What I would like to see is a jettisoning of the formula of place-bound and time-based empires presided over by solo teachers. In its place, imagine a suite including a small lecture hall (for lectures, films, performances, etc.), a seminar room, a technology center, a laboratory, a lounge (set up for comfortable reading and talking), and study space that can be used by individuals, pairs, or small groups. Oh, and offices for the faculty and for student proctors, offices physically open to all. Within each suite, flexible schedules could be created by the group (say, five faculty members, each from a different but related department), allowing for oversight and involvement.

A suite of this nature could become a locus for learning, a real learning community, with faculty put together because students taking the course from one would likely be taking one from another. It would extend outward through digital devices connecting students with each other, with proctors, with instructors, and with events taking place in the suite.

We can't improve the use of technology in the classroom until we improve our idea of the classroom. We can't create really effective online instruction until we can connect it to the classroom. Until we re-envision the classroom itself, meeting Selingo's first two points will ultimately prove to be nothing more than additional smoke and mirrors. Without changes to the structures of the physical college, the virtual college will never have the anchor it needs for real stability and success.

Saturday, June 23, 2012

The Misuse of "Research" or Don't Always Trust What You Read

I am always looking for new examples of sloppy research, of reliance on the first three Google hits, of the assumption that if something is in print (or online) it must be true. In Robert Leston's and my Beyond the Blogosphere: Information and Its Children, I use the example of Joy Masoff, author of Our Virginia: Past and Present, a history text for elementary-school students. The book claims that large numbers of African-Americans fought for the Confederacy during the Civil War. Masoff said:
she found the information about black Confederate soldiers primarily through Internet research, which turned up work by members of the Sons of Confederate Veterans.
 Her defense?
"As controversial as it is, I stand by what I write," she said. "I am a fairly respected writer."
Yesterday, I came across an even better example, though one from pre-internet years. It appears in the Preface to The Lost Life of Horatio Alger, Jr.  by Gary Scharnhorst with Jack Bales. If any incident can teach us the dangers of shoddy research and trust, this one can.

Apparently, Herbert Mayes, later a respected editor, perpetrated a hoax when he was a young man, producing a "biography" of Horatio Alger, Jr. in 1927 that he had made up completely (shades of Stephen Glass). By default (and because nobody bothered to check the research or sources), this became the standard for biographies of Alger, information from it appearing in biographical dictionaries and becoming the basis for later biographies. Mayes hadn't been able to find much out about Alger so, either as a parody or as deliberate fraud, he simply made the man up, attaching the famous name. Scharnhorst and Bales write:
All Alger biographers to date have grappled with the same problem of meager sources that first beset Mayes. At best, they have cursorily sketched a life on the basis of skimpy evidence. Most have cited without question Mayes's fabricated documents. At worst, they have borrowed and embellished those sources with their own fabrications. Writing an authoritative biography of Alger now is a task akin to disproving a conspiracy theory. (xiv)
Fraud is all around us. The only way to be an effective researcher is to also be an effective investigator of fraud. Even corroborating evidence can turn out to be doubtful, as in the Masoff case, as with Alger--as with "Binjamin Wilkomirski" who claimed to have known "Laura Grabowski" in a WWII concentration camp (and she him), only to have it shown that they both, for reasons independent of each other, were lying. Blake Eskin's neat little book on Wilkomirski, A Life in Pieces: The Making and Unmaking of Binjamin Wilkomirski, is another good object-lesson on the dangers of too generous belief.

Though I've never much liked Ronald Reagan, I still love one of his favorite phrases, "trust, but verify."

Thursday, June 21, 2012

Just Who Won the Civil War, Anyhow?

Last summer, waiting to visit the submarine Growler at the Intrepid museum on the Hudson River, I watched a video on the development of the submarine. Included was discussion of Confederate submarines during the Civil War with a reference to one of the developers as a "patriot." I thought that odd: where else would someone in open rebellion against a country be later called a "patriot" by that country? I mean, would George Washington ever be considered a "patriot" by Great Britain?

The meaning of "patriot" has always been of concern to me. I didn't like it when, as a youth, I was called "unpatriotic" for opposing the Vietnam War for I thought my action the most patriotic thing I could do. As a southerner born often living in the north while growing up, I was constantly reminded of the question of allegiance. This was during the 1950s, and the questions were quite real, especially to those of us with southern roots--and I certainly did feel a divide.

My great-grandfather Marion Stephen Barlow served under General Phil Sheridan (his company is now the inspiration for a group of re-enactors) in the chase of Jubal Early's Confederate army, the battle of Opequon, and the razing of the Shenandoah Valley. The army he served in is still hated by many Virginians for the destruction it caused, destruction as horrible as anything found in Sherman's 'march to the sea.'

After the war, great-grandfather Marion reminisced with an uncle of his (whose name I don't know, but who was likely a little further removed, probably a cousin, a child of Marion's uncle Aaron, who had moved across the Ohio River to Virginia--what is now West Virginia--many years before the Civil War) who had fought in the Confederate Army. They determined that they had both been involved in a skirmish at a place called Gauley Bridge, and could even have been shooting at each other.

The other side of my family was determinedly Confederate. Three of my great-great-grandfathers fought, one of them captured during the breakout at Petersburg in 1865, spending the final months of the war in a prisoner-of-war camp in Maryland. All three were men from western North Carolina.


The Civil War, of course, was a horrible experience. Its aftermath in Reconstruction and then in Jim Crow was sometimes almost as bad. Sometimes as bad. Yet, by the end of the 1960s, we seemed ready to put that behind us and to move forward as a unified nation devoted to ideals of equality and possibility. Belief in that made me proud and, yes, patriotic.


When my family moved back south in 1961, it was both the time of the Civil War centennial and the Civil Rights Movement. Watching both, I was confirmed in my faith in the union, faith that had developed over my earlier young life. There may have been a certain romanticism attached to the lost cause of the south, but the reality of it was that the United States was better off without dominant states' rights and with a federal government committed to the protection of the rights of all. I had become a real patriot, proud of my country, fundamentally and permanently... and I thought all other southerners were moving that way, as well.

Now, fifty years later, I am beginning to think I was wrong. Rick Perry thinking Texas could secede from the union; Grover Norquist wanting to drown the Federal government in a bathtub. Even the hatred of Obama as "not one of us" (as non-white) is making me think that, after all this time, the victory of the north is receding, with the revenge of the south at hand. Making me think that the "patriotism" of much of America isn't patriotism for the United States at all, but for a Confederacy hiding in US clothing (much as, in the mountains after the Civil War, a U and S compressed together--or so the story goes--was a sign of lingering Confederate sympathies, as in the sign from my great-great-grandfather Joel Dimmette's post office in the picture here). Our Supreme Court seems to be handing the states the rights the southern ones once thought should be theirs but that they lost through war. Attitudes of hatred toward Washington are often hiding attitudes of hatred toward the once-victorious north.

So, who really won the Civil War?

Right now, I'm beginning to believe that no one did, though the old south may be growing in dominance. The south, as the old saying had it, has certainly risen again. Today's patriots aren't proud of a union of fifty states dedicated to liberty and justice for all, but are advocates of an ersatz "America" that is a stand-in for what had once seemed a defeated racist, classist, and economically oppressive system, a system now on the verge of being re-instituted over the country as a whole, not even just over the region where it began. That's bad for all of us, especially the real patriots, north and south.


Wednesday, June 20, 2012

The Judgment of Writing

For the CUNY Assessment Test in Writing (CATW), a scoring rubric breaks analysis of student writings (responses to prompts that include short texts) into five categories. These are:
  1. Critical Response to the Writing Task and the Text;
  2. Development of Writer's Ideas;
  3. Structure of the Response;
  4. Language Use: Sentences and Word Choice;
  5. Language Use: Grammar, Usage, Mechanics.
Each of these is scored on a scale of 1-6, with the scores of the first three categories doubled for the final result.
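For concreteness, here is a minimal sketch of that weighting, assuming a single reader's category scores. The function name and the example scores are mine, not part of the exam's materials; the actual CATW protocol involves multiple readers and its own conversion tables, so this illustrates only the arithmetic described above.

```python
# Hypothetical illustration of the CATW-style weighting described above:
# five categories scored 1-6, with the first three doubled in the total.
# This shows only the arithmetic, not the official scoring procedure.

CATEGORIES = [
    "Critical Response to the Writing Task and the Text",
    "Development of Writer's Ideas",
    "Structure of the Response",
    "Language Use: Sentences and Word Choice",
    "Language Use: Grammar, Usage, Mechanics",
]

def weighted_total(scores):
    """scores: five integers, each 1-6, in the order of CATEGORIES."""
    if len(scores) != len(CATEGORIES) or not all(1 <= s <= 6 for s in scores):
        raise ValueError("expected five scores, each between 1 and 6")
    # Double the first three categories; leave the last two as they are.
    return sum(2 * s for s in scores[:3]) + sum(scores[3:])

# Example: 4, 3, 4 on the doubled categories and 3, 3 on the language-use pair
print(weighted_total([4, 3, 4, 3, 3]))  # 2*(4+3+4) + 3 + 3 = 28 of a possible 48
```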

At the start of scoring sessions, a round of "norming" is standard procedure. This brings scorers together, making sure that each understands the ratings in approximately the same way so that deviation in scores will be at a minimum.

From a listserv I subscribe to, I find that the norming concept is now being taken a step further (though not with CATW but with another scoring rubric). Setting up a bar graph of the results in each of four (rather than five) categories, the researcher hopes to establish a standard grade for the putative essay associated with each particular configuration, doing so through a norming process called "Social Judgment Analysis" (SJA), a process that was developed in the 1970s for policy-conflict resolution and that centers around the work of Kenneth R. Hammond.

In Leonard Adelman, Thomas R. Stewart, and Hammond's “Application of Social Judgment Theory to Policy Formulation” (Policy Sciences 6, 1975, 137-159), the authors state that:
social judgment theorists have developed computer graphics technology as a means of resolving policy differences. Such devices can provide (1) immediate statistical analysis of the judgment process, in terms of weights, function forms, and consistency…, and (2) immediate pictorial description of these parameters. (141)
the primary advantage of the present computer graphics system to policy-makers is that it makes explicit, both statistically and pictorially, where agreement and disagreement lie; or in other words, the cognitive differences that result in disagreement. In short, it serves a clarifying function. (142)
They go on to describe how they put what would come to be SJA to use:
Since the participants had different policies concerning the relative importance of the various functions…, the first step in the study was to describe each participant’s policy, in terms of weights, function forms, and consistency, i.e., do policy-capturing. Such action would permit (1) the pictorial representation of the participants’ policies and thereby aid them in understanding their similarities and differences, and (2) the grouping or clustering of the participants in terms of the homogeneity of their individual policies. (147)
Their conclusions are:
specifically, (a) social judgment theory asserts that policy quarrels are often cognitive in origin…. The theory also asserts that (b) computer graphics technology makes explicit, both statistically and pictorially, the cognitive differences that result in disagreement, and (c) that such clarification should result in the understanding and subsequent resolution of such differences. (156)
Applying this as a norming process, rather than as one for problem solving, can have, the researcher implies, certain benefits. An "expert" reader of the bar graphs developed through the process can learn a great deal about the writing and even the writer--and about the opinions of the scorers (which can then be discussed and even adjusted through the analysis). If it were applied to CATW (it won't be: it's not needed there... I just show that as an example of a rubric and so that I can talk about norming), the "expert" might even be able to tell quickly where to place a student on a needs spectrum.
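To make the policy-capturing idea concrete, here is a minimal sketch of what the SJA move might look like in code. The profiles, the grades, and everything about the layout are my own toy illustration, not the researcher's actual instrument; the underlying technique (regressing a judge's holistic grades on the rubric-category scores to expose that judge's implicit weights) is the standard one described in the Adelman, Stewart, and Hammond passages quoted above.

```python
# A toy sketch of SJA-style "policy capturing" (my illustration, not the
# researcher's actual instrument). Each profile is a set of four
# rubric-category scores (1-4); a judge assigns each profile a holistic
# grade. A least-squares fit of grades on scores exposes the implicit
# weight the judge gives each category.
import numpy as np

# One row per bar-graph profile shown to the judge (four categories, scored 1-4).
profiles = np.array([
    [4, 4, 3, 3],
    [2, 3, 2, 4],
    [1, 2, 2, 1],
    [3, 1, 4, 2],
    [4, 3, 4, 4],
    [2, 2, 3, 3],
])

# The judge's grades for those profiles, on a 4.0 scale (A = 4.0, B = 3.0, ...).
grades = np.array([3.7, 2.3, 1.0, 2.7, 4.0, 2.3])

# Fit: grade ~ w1*cat1 + w2*cat2 + w3*cat3 + w4*cat4 + intercept
X = np.column_stack([profiles, np.ones(len(profiles))])
solution, residuals, rank, sv = np.linalg.lstsq(X, grades, rcond=None)

print("implicit category weights:", solution[:4].round(2))
print("intercept:", round(solution[4], 2))
```

Comparing the recovered weights across judges (or across teaching populations) is exactly the "clarifying function" the authors describe: it makes explicit where scorers agree and where they diverge.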

Before I go further, I should say that I have no problem with SJA. In fact, it relates to Robert Leston's excellent chapter "Smart Mobs or Mobs Rule?" in our book Beyond the Blogosphere: Information and Its Children. The problem is that this (as norming does in general, but to a much greater and much more troubling extent) can strip writing of its primary purpose, communication (or effectiveness), and can strip writing evaluation of its necessary close link to the act of communication itself. If used as a structure for evaluating writing, it will encourage the type of writing that George Orwell warned against just after the end of World War II in "Politics and the English Language," rewarding use of 'dying metaphors,' 'operators or false verbal limbs,' 'pretentious diction,' and 'meaningless words.' Though proponents might try to argue differently, an SJA norming process must completely ignore the rules for writing Orwell offers:
(i) Never use a metaphor, simile, or other figure of speech which you are used to seeing in print.
(ii) Never use a long word where a short one will do.
(iii) If it is possible to cut a word out, always cut it out.
(iv) Never use the passive where you can use the active.
(v) Never use a foreign phrase, a scientific word, or a jargon word if you can think of an everyday English equivalent.
(vi) Break any of these rules sooner than say anything outright barbarous.
It's almost impossible for any norming process to take these rules into account, for none of them is quantifiable.

Any scoring rubric is problematic, anyway: reducing writing to numbers always removes it from the dynamic of communication. Though the rubric is created in order to remove the subjective element, it also removes the written document from the stimulus/response/reinforcement active paradigm that is real communication--and makes possible the 'barbarisms' Orwell details. Any rubric removes the writing under consideration one step from what should be the real purpose of evaluation, consideration of the effectiveness of the writing. Adding an additional step of creating a visual representation (the bar graph) further removes evaluation from the writing task itself.

A rubric is an artificial device for a specific purpose and should not be considered beyond that purpose. I suspect the researcher knows that and doesn't want to use it for anything but that one specific purpose. The problem is, it probably will be. In CATW, the rubric is used with the exam for placement in First Year Composition. The developers of the exam and the rubric understand that there is limited value in what they have constructed and make no claims for it beyond the single purpose. Even that is troubling, for it does not stop others from imagining that writing can be assessed comprehensively and effectively through rubrics, leading to arguments that 'machine grading' of essays can be effective. Yet that doesn't even work in situations like the CATW: there are just too many judgment calls that a machine can't make. But there are plenty of people who want to believe that it can.

The researcher in question uses SJA to explore the differing attitudes towards written work within different teaching populations by asking teachers to match bar-graph instances with specific letter grades. The idea is to then see what the differences are in the answers and to use that to better understand the needs perceived in the different environments.

In addition to my concern that this can become an assessment tool, especially for 'machine grading,' I worry that the information the researcher will gather is flawed, and for one quite specific reason: Writing teachers like me cannot complete the survey, so the information will be lopsided, at best. And necessarily incomplete.

When I tried to complete the survey, I was faced with a series of bar graphs of four bars each, the bars representing different areas of judgment of writing on a scale of 1 to 4 (similar enough to the CATW rubric for me to understand what was being done). I had to match each configuration to a letter grade.

As I started, I felt a strange sensation, almost a physical vertigo. Something was wrong. As I tried to make a selection, I realized I could not; I felt as though there were a wall between me and what I was trying to evaluate.

Upon reflection, I realized that, indeed, there was: I was being asked to evaluate writing without the ability to see the writing. And I could not do it, not even in the most abstract fashion. I could not withdraw myself from the focus on actual communication that is at the heart of my teaching. What I was being asked to do had nothing to do with actual writing, but it claimed it had. The cognitive dissonance was so great that it paralyzed me. As the researcher said, it would be too much work to actually ask people to read all of the essays. This is more efficient. But I could not do it.

Maybe it is more efficient. But it is not evaluation of writing. Even by participating in such a survey, I would be tacitly agreeing that writing can be evaluated, to some degree at least, through graphic representation of previous evaluations of defined parts of the physical artifact. By participating, I would be abetting those who would like to see essay assessment done by machines, who believe that writing can be stripped of all of its dynamism and reduced to formalized squiggles alone, leaving communication completely out of the picture. I could not.

We are at a dangerous crossroads, already turning towards unwarranted reliance on assessment tools rather than on teachers and readers. For a nation that claims to have faith in the individual, this is peculiar--but it is also changing learning into the mastering of form. Real content gets stripped away, as it does in this particular usage of SJA (it doesn't in others, where the content remains central because the scorers are also the contributors). We cannot afford more of that, so should be extremely careful about feeding our mania for data, no matter how well intended.

Tuesday, June 19, 2012

The End of Power... or the End of Her Party?

It has taken me almost two weeks to get my breath back, so shocked was I.

On June 7, as most who follow the Washington punditocracy know, Sally Quinn (wife of former Washington Post editor Ben Bradlee) declared that power (in Washington, at least) is over. Done. Kaput. Money has replaced it:
Now, at a party, if you find people staring over your shoulder to see who’s more important in the room, they’re usually looking at someone rich, rather than someone powerful.
Maybe Quinn can be excused. She grew up, after all, within Washington's chattering class and probably has always assumed the party is the center of the earth. In fact, I think she mistakes "party" for "power." Anyhow, she was hired to write about that class by the man who would later become her husband....

What she does not seem to have realized, as she penned this piece, is that she isn't lamenting the end of power in general, but the end of her own party, though she did once write an article about the 'traditional' Washington hostess's demise called "The Party's Over." She, and the people she has adored since the sixties, are being muscled aside once more, this time by a different group... and each new group seems raw, uncouth, and interested only in money to those it replaces.

Quinn makes me remember Irving Berlin's song "The Hostess with the Mostest on the Ball" as an indication that things, no matter what Quinn may think (and has thought, for twenty-five years, at least), haven't changed:

I've a great big bar and good caviar
Yes, the best that can be found
And a large amount in my bank account
When election time comes round
If you're feeling presidential
You can make it, yes indeed
There are just three things essential
Let me tell you, all you need
Is an ounce of wisdom and a pound of gall
And the hostess with the mostest on the ball.
That's from 1950, sung by Ethel Merman in Call Me Madam. It's about Perle Mesta--remember her? Thought not. She came to Washington on the back of Oklahoma oil money. She's not that different from the yahoos (as Quinn probably sees them) who are arriving in Washington today. She's not that different from Quinn herself--even though Quinn grew up in the DC environment and, therefore, did not have to learn its ways as an adult.

At the start of her article, Quinn writes:
In April, at the White House Correspondents’ Association dinner, my husband, Ben Bradlee, and I found ourselves sandwiched between the Kardashians and Newt and Callista Gingrich. Heavily made up and smiling for the cameras, the reality TV family and the political couple were swarmed over by the paparazzi.
There seems to be a certain jealousy here, or nostalgia. Not to worry, Ms. Quinn. You are not alone. In a few years, a Kardashian will write a similar lament about the next generation of Washington darlings.

Monday, June 18, 2012

Tailgates and Substitutes

Back in the 1960s, when I was even more naive and gullible than today, I would get incensed by differing versions of Bob Dylan songs. Which was the "right" one, I would demand of myself. I had a suspicion that he had written the raw versions that turned up on bootlegs, in songbooks, and elsewhere... and that the ones recorded by Joan Baez (in particular) had been gussied up by someone else.

I didn't understand what had been going on with the songs.

One of my favorites from that time, and one with major changes in lyrics between versions, is "You Ain't Goin' Nowhere." I think I knew the Baez version first, which may have been part of my problem. She sings one verse as:
Genghis Khan, he could not keep
All his kings supplied with sleep;
We’ll climb that hill no matter how steep
When we get up to it.
Later, I heard Dylan singing this on one of his own recordings:
Genghis Khan and his brother Don
Could not keep on keepin' on;
We'll cross that ridge after it's gone,
After we're way past it. 
There was something as whimsical about the latter as there was beautiful about the former. At that time, whimsy was more to my liking.

Now, looking back, I see that I completely missed an example of how the song-writing process works. Dylan had a tune and an idea of his lyrics, but they were not fleshed out when he first wrote the song. Many of the words of that earlier version that he recorded, then, were simply placeholders.

Somehow, I like that a great deal. Nothing is ever really finished, anyhow. Everything can keep evolving, keep growing.

Not that that makes anything better. Just different. And that's the beauty of it.

Constricting Art Through "Ownership"

In yesterday's New York Times, a lawyer named Michael Rips presented a piece entitled "Fair Use, Art, Swiss Cheese and Me" about the artist David Salle's use of a picture of him. He ends:
It is David Salle and the artistic movement with which he is associated, and to which he has greatly contributed, that give value to his paintings — and not, sadly, my image. For those who believe otherwise, I have boxes of old photographs I would be happy to sell.
This is an important point not just about fair use but about the artistic commons: something of ours may be used in a work, but that doesn't mean that we have contributed to the work. That something of ours, also, should (for the most part, and certainly after a limited period of time) be in the commons anyway, making it all of ours and so open to the use of any one of us. This helps all of us.

The specter of copyright violation, as "copyright" is interpreted increasingly liberally and extensively by both courts and legislative bodies, threatens to put a damper on creativity in numerous areas. Museums have:
argued that they would be forced to hire lawyers to investigate their collections for works containing borrowed images, and given the ubiquity of such images in 20th-century art, the cost to the museums would be unsustainable. The more likely, though no less troubling, alternative is for museums to censor what they exhibit.
If museums feel they must censor, artists will censor themselves... as musicians are already doing from fear of infringement.

This does not bode well. The need for a commons is enshrined in the US Constitution in Article 1, Section 8, where Congress is empowered:
To promote the Progress of Science and useful Arts, by securing for limited Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries.
The time limit is there to ensure that a growing commons exists for creative artists of all sorts to draw upon while, at the same time, making it possible for writers and discoverers to make a reasonable profit from their activities before the results become part of the commons.

Today, the commons is being strangled. There are rays of hope, though. Rips' understanding of the real source of value in art is one.

Saturday, June 16, 2012

Sense and Sensibility

Last fall, soon after I had taken on the role of NYCCT campus coordinator for the CUNY-BA program, a young woman came to talk with me about it. I wanted to pull her record up on the computer and asked for her Social Security Number.

"I don't have one," she said, in perfect American English. "I was born in Mexico and was brought here before I was two, but I have never had legal status."

She made a point of it. Soon, once I was able to see her record, I understood why. Her grade-point average was above 3.5. Many of the classes she had taken were in math and the sciences where she clearly excelled.

"I want to go to medical school, but I don't know if I would be able to, given my situation."

I didn't know if she could, or not.

"What about financial aid?" I asked her.

"I get none," she answered. "I work, illegally of course, as well as going to school."

This young woman will now get the chance to contribute to the American society she is a part of (her legal status is an irrelevancy--she is an American).

She should have always had it.

Friday, June 15, 2012

Biometric "Hysteria"?

Diane Ravitch may be adamant and forceful... but hysterical?

In a post today, she refers to another, one by Gary Houchens, a professor of "Educational, Leadership, & Research" at Western Kentucky University. His is entitled "Biometric hysteria: the anti-research mentality of the educational status quo." In a postscript Ravitch writes:
I do not like to refer to gender and I seldom do. But I can’t help but mention that there is a long history of men asserting their superiority by calling women “hysterical.” Why is it that men never are “hysterical,” only women?
Good question. But let's leave that aside.

Houchens' use of language doesn't merely play on sexual stereotypes; it creates assumptions where questioning might be more appropriate. "Anti-research mentality"? "Educational status quo"? These are just the types of phrases that the so-called educational "reformers" have been using for years to derail any questioning of what they are doing.

Let's look at the latter first: The assumption behind Houchens' title, as the article shows, is that the Gates Foundation is not, somehow, part of the "educational status quo." That's a ridiculous assumption on the face of it: Gates money is moving into a central position within American education. It has reshaped the status quo: just go to Common Core and you will find this:
The Common Core Curriculum Mapping Project, created and operated by Common Core, is funded in part by the Bill & Melinda Gates Foundation.
Right now, there is nothing having a greater impact (nothing new, that is) than the Common Core standards. It is establishment all the way.

The "reformers" who, like Houchens, still like to style themselves as outsiders trying to correct a corrupt system, became the insiders with the establishment of No Child Left Behind a decade ago.

What galls me more, however, is that "anti-research mentality" idea. Those of us who really do care to research education know that there's much more to it than the sort of lab testing of new methods of quantifying educational results that the Gates biometrics project (or any of the others like it) can encompass. The likes of Houchens (or so I assume, given his choices of phrases) see "research" in much the way they see "assessment," as something leading to numbers. They love to test.

They don't seem to like to study.

Real research encompasses the past and what people learned in the past, most of which is not quantifiable. One cannot look ahead adequately without looking back substantially.

When I was a kid (this was during the summer of 1961), I was part of a programmed-instruction ("teaching machine") test group at Harvard, where my father was spending the summer as one of the researchers. They tested me (and others) for much the same purpose the Gates-funded project wants. It was all new and exciting.

But, by the end of the decade, it had all been abandoned (except as tangential tools).

Why?

Because unmediated interaction between teacher and student (or student and student, or student and mentor) had been shown to be much more effective. Even B. F. Skinner, in The Technology of Teaching, had reached the conclusion that reliance on technology alone will never bring about improvement in education. Only reliance on teachers (and improving their performance) will do that.

Has Houchens even read Skinner? Have any of the "reformers" studied what was tried, and rejected, in the past?

Just askin'....

Houchens writes that he's:
willing to give researchers the benefit of the doubt and see what they can find by doing some basic exploratory studies.
Why bother? We already know that the earth isn't flat.

Wednesday, June 13, 2012

More on Blogging as a Research Component

I've posted twice (here and here) on "The Transformative Potential of Blogs for Research in Higher Education" by Jana Bouwma-Gearhart and James Bess, but don't feel I'm nearly done with the topic. After all, it is something I've been interested in for years.

As I argue in the first of my books on blogging, The Rise of the Blogosphere, blogs have the potential of re-establishing a vibrant and unproscribed public sphere (as Jurgen Habermas describes it). This can only happen, however, as long as blogs are really and truly open--accessible to all and easily created by anyone. In fact, Bouwma-Gearhart and Bess echo Habermas, who sees the public sphere as, essentially, an 18th-century phenomenon undergoing constriction ever since, arguing (as I do, more generally) that blogs can open up academic discussion to a free-wheeling nature that has not been seen (except in unusual circumstances) for a long time and that can happen in real time with immediate collaborative response, debate, explanation, and even change:
In a blog setting, collaboration has the potential to happen in real time; the give and take of idea sharing and discovery give immediacy to progress in the research and writing. Veteran bloggers, in fact, claim appreciation for the promise of quick self-publishing and related reaction afforded by blogs (Lasica, 2001). Thus, the use of the blog allows the vast resources embedded in the research community to be brought to bear in the formative stages of the research with the summative stages represented subsequently in published material. For Fleishman (2001), blogs promise greater reader interaction with writers (more so than traditional modes of communicating ideas such as publishing in periodicals, etc.). Blogs can have multiple categories and some interfaces allow users to choose categories of interest and filter pertinent postings (Rhodes, 1999). According to Halavais (2006), “Blogs seem to be particularly good at establishing and exchanging what Merton calls ‘specified ignorance’ … [or] a new awareness of what is not yet known or understood and rationale for its being worth knowing” (p. 119). (260)

This affording of social value in relation to new offerings is a norm in academic communities, of course. In many ways, blogs mimic these communities’ strengths and have been shown to have the potential to evolve into true virtual communities, with a distinct community culture and affording of specific social value to their respective participants (Blanchard, 2004; Efimova, Hendrick, & Anjewierden, 2005b; Granovetter, 1973). We contend that blogs may, in fact, provide a new space for capitalizing on pre-existing and beneficial norms and forms of formal academic collaboration, including what Halavais terms the invisible college, or “the collective creation of a school of thought by a distributed group of scholars, often using both formal and informal channels to communicate their ideas” (p. 123). Blogs may additionally support the informal contacts and communication required for effective contribution, survival really, in the academic environment. Halavais (2006) likens the blog to “archetypal scholarly communication settings: the [research] notebook, the coffee house, and the editorial page” (p. 117). The blog, like these other settings, “is an effort to move thought into the social realm, by presenting facts, ideas, and requests for assistance-and ultimately build knowledge” (p. 120).
They can do even more. Walt Whitman, in "When I Heard the Learn'd Astronomer," contrasts the "applause in the lecture-room" with looking "up in perfect silence at the stars." What blogs can do is allow us to move seamlessly back and forth from one to the other--not having to, as Whitman did, leave one for the other.

'I Don't Have Time for This Nonsense'

Via Diane Ravitch's blog I've been learning more about the Gates Foundation's desire to explore galvanic skin-response monitors as a classroom tool. Ravitch links to an article providing a hint of the rationale:
Gates officials hope the devices, known as Q Sensors, can become a common classroom tool, enabling teachers to see, in real time, which kids are tuned in and which are zoned out.
Any good teacher in a classroom with a reasonable number of students can already do this, of course. Any sort of an 'assist' of this nature is simply an insertion between teacher and student, removing the teacher one step further from the necessary personal interaction that is the heart of teaching. Focusing directly on the students as people takes almost all of the teacher's classroom time. Teachers don't have time for something that will distract them from this primary task. However:
To Sandi Jacobs, the promise of such technology outweighs the vague fear that it might be used in the future to punish teachers who fail to engage their students' Q Sensors.

Any device that helps a teacher identify and meet student needs "is a good thing," said Jacobs, vice president of the National Council on Teacher Quality, an advocacy group that receives funding from the Gates Foundation. "We have to be really open to what technology can bring."
Even when it distracts us from our task? Come on!

I am all for using technology when it broadens our possibilities, when it adds tools without taking away from procedures that are effective and necessary. When technology makes teachers pay more attention to the technology than to the student, as would happen (to some degree, at least) with monitoring the skin-response bracelets, I am much more hesitant.

In his book When Students Have Power: Negotiating Authority in a Critical Pedagogy, Ira Shor writes of 'Siberia,' the place in the classroom furthest from the teacher, where certain students go to zone out. Shor's strategy (one of them, actually) is to go to Siberia, to sit next to the students and coordinate the class from there. He couldn't do this were he tied to any sort of monitoring station. He wouldn't have time for it--the nonsense would have taken over.

Tuesday, June 12, 2012

Using Blogs for Research and Writing in the Humanities

In their article "The Transformative Potential of Blogs for Research in Higher Education,"  Jana Bouwma-Gearhart and James Bess write:
Blogging recognizes the message of social constructionism and the possibilities for new collaborative, real-time modes of information exchange that permit contributions from a vast number of potential expert collaborators from around the world. Blogging may allow for more egalitarian involvement of qualified academics on problems of interest, the establishment of more extensive-boundaried and more communicative communities of scholars, and more effective community involvement in the formative stages of research and presentation of research findings. Additionally, we would predict that with the opening up of information giving and receiving (with due recognition of needs for personal upward mobility), the academic community writ large will become, in a relatively short time, much more collaborative and collegial. (268)
Today, the blog retains the sullied image of the isolate in the basement and the 'concern-troll' attitude that, if anyone can do it, its value is suspect. But that will change, especially in academic situations as scholars in the humanities (and elsewhere, but it is the humanities I know best) begin to discover that better work can be done through a base in the network of personal, independent blogs than through academic journals or managed websites from universities or even think-tanks. Blogs act as aggregators, and scholars look to them for sources and possibilities they might otherwise miss. I think I have discovered more through blogging over the past eight years than through any other information source--including libraries (though, I should note, much of my writing has been on the blogosphere, so that shouldn't be a complete surprise).

There are real problems blogs must overcome if they are to reach their full academic potential. The first comes from that image of the blog as a playpen, as a place for those who can't make it in the serious world of academic publishing. We still revere Oxford University Press; no blog can compete. The fallout from this is that blogging today has an unpleasant odor to some scholarly nostrils, especially to nostrils on hiring, re-appointment, tenure, and promotion committees. Only scholars already secure in their careers (or with nothing to lose for other reasons) can turn to blogging with confidence.

What I imagine for the future of academic blogging--in the best of all possible worlds--is something breaking down disciplinary, institutional, and ideological barriers. This goes in the opposite direction to a lot of what I am seeing these days where some scholars are trying to establish their own new bailiwicks in an unfolding digital environment, erecting barriers rather than tearing them down. Among these is the "movement" of "digital humanities," a group that wants to define (and define things out) rather than explore. What I'm seeing as the future of blogging, instead, moves beyond the digital, subsuming it as a given, to a much wider utilization of myriad possibilities. It is a world where even academic journals have atrophied, replaced by the blogs of individual scholars, each judged by how they are used by other scholars (and even by those beyond academic communities).

Already, I find I am using a wider range of tools than ever in preparation for my next book. I have my own library, personal contacts and discussions with colleagues, the electronic possibilities I have downloaded to my Kindle Fire and carry with me, the journal articles behind paywalls that I can access through CUNY libraries, more widely available journals and articles, websites dedicated to particular topics, and blogs that not only provide new insights themselves but that also lead me to sources I might never have come across on my own--and that are considerably more recent than anything available on my bookshelves. Over the years, I have gained confidence in my ability to differentiate between the lead and the dross, no longer retreating quite so much to the safe academic imprimaturs. I can ask questions of all sorts of people, even learn from them... as I recently did from an Australian musician, back in school for an undergraduate degree, who put an essay of his online.

Over the next few years, especially as those of us more comfortable researching within the new and expanded environment (I don't want to call it "digital," not wanting to be confined to that) reach positions where we are serving on promotion committees, etc., it will become safer and even easier for scholars to work openly and with the broader palette. At that time, as Bouwma-Gearhart and Bess predict, our academic pursuits really will have become "more collaborative and collegial." And better scholarship than ever will have resulted.

Monday, June 11, 2012

"The Transformative Potential of Blogs for Research in Higher Education"

In the March/April issue of The Journal of Higher Education there's an article, "The Transformative Potential of Blogs for Research in Higher Education" by Jana Bouwma-Gearhart and James Bess, that makes me think that the very potential of the title may be on the point of being realized. Bouwma-Gearhart and Bess write:
It is our contention that, as a result of the interplay of evolving cultural norms and communication technologies, as well as the changing demographics of academic research communities, it is now propitious to consider the utility and advantages of an even broader, more inclusive community of scholars (Goodman, 1962). Instead of involving only a few collaborators inside one's proximate and accustomed academic research circle, ongoing research can now be informed by the contemporaneous inputs of a wider cadre of experts from around the world. These tangential researchers, whose traditional role has been as post-publication reviewers and revisionists, can both access and contribute the freshest ideas and knowledge in real time, even as those very ideas are being developed and refined, rather than having to wait until they are formally published. We call this anticipatory participation in research projects and concept development, or as Tapscott and Williams (2006) define it, prosumption--collaboration in advance with contemporaneous and eventual users of the ideas. In the case of a research community, these users are researchers themselves. If this collaborative, contributory practice becomes the norm, academic knowledge will thus be much more broadly socially constructed (Berger & Luckmann, 1966; Hibberd, 2005). (249-250)
Oh, yes! This is exactly what I want to see and want to be more involved in: academic knowledge broadly socially constructed. This is even why I published a Call For Papers on this blog yesterday instead of just posting where specialists go: I see no future for scholarship narrowly defined, with scholars "characterized as isolates, most commonly toiling alone in their work places" (249). I want to be in discussion with as many people involved as possible, and from many differing perspectives.

This is why I have been so interested in blogging. Not only does it carry possibilities for expanded conversations of all sorts and a re-emergence of a real public sphere, but it can make the kind of scholarship I enjoy most even more fun.

Yet, almost as important to me as the paper as a whole with its galvanizing presentation of possibilities is this:
This medium, with its potential to provide universal access to and exchange of ideas almost as they are being created, has been greeted with great acclaim in both the popular and academic domains (Kline, Burstein, de Keijzer, & Berger, 2005). As Barlow (2008) notes, however, innovations such as blogs “are not simply a function or result of the technology that distributes them. … Blogs are also a new and original cultural phenomenon, reflecting more the changes and needs in society than simple realization of technological possibility” (p. 1). We build on Barlow’s notion in this paper, arguing that blogs will reflect and meet modern needs of the higher education research community.
When you've been pushing an idea, trying to get others to try it on and move it in their own directions, it's extremely gratifying when that actually happens, when you see that you haven't been alone, crying in the wilderness.

Thanks, Bouwma-Gearhart and Bess. You've made my day in more ways than one.

Why Take Composition Courses?

The brouhaha about automated grading of essay exams came to mind the other day when I was asked, as an English teacher, to comment on whether or not a student might be able to submit a series of lab reports to fulfill a composition requirement. If automated grading catches on, we are going to see more of this, and may eventually see it leading to the end of composition-as-we-know-it. Perhaps that is not completely a bad thing (I am not really fond of how we teach writing or even of how it is integrated into the curriculum), though it would be a setback for our students. Of course, that's a pendulum that would, of necessity, swing back once we see a generation of writers coming out of college even less able to communicate in writing than those graduating now. Here's a part of how I responded:
As you have requested, I have examined the lab reports. Unfortunately, I do not see how they can possibly be construed as meeting the requirements of a writing course, be it the basic composition course all students must take or even Advanced Technical Writing, the course covering material closest to the lab-report writing of the sort provided. Both of these are courses I teach, and I would not accept these lab reports as fulfilling anything more than a small part of what I expect of my students.

There are quite a number of problems with any consideration of this type of work as substitution for a writing course in English, but I will focus on only two as prime examples of why these lab reports do not show the type of work required in an English writing course. Either problem would be a sufficient basis for rejecting a request to use these reports in place of doing the much more extensive and nuanced work of a real composition course.

First, the lab reports follow what is, basically, a single formula. As a result, they indicate no command of the variety of written communication covered within any English writing course. In addition, the nature of the formula is not one that necessitates the types of revision and reworking that are part of the process of writing that can be required of a college graduate. Nor does use of the formula allow the student to face and overcome critical questions of audience and effectiveness, areas of importance in any writing class--and in writing in the world one enters on graduation. Finally, the use of a report formula allows for the completion of a document without the student ever having to deal with questions of transition, tone, or style--all significant elements of writing as taught in an English writing course. For these reasons and more, no English composition class teacher would ever accept such lab reports as fulfilling more than one assignment--if they (or one of them) would even be accepted for that.

Second, the actual writing by the student in these reports is quite slim. Because of their structure, lab reports require little original composition, allowing boilerplate prose to carry most of the burden. At most, there is but a paragraph or two of what an English professor would accept as original writing in any of these lab reports. Writing is an attempt at communication, not simply filling in categories or listing results. Students learn to do that by concentrating on writing as the primary activity in a composition class. Here, the writing is simply a coda to another process and another learning exercise.

As a result, I cannot recommend that these lab reports be considered in any way as the equivalent of what a student is expected to do and learn in a college English composition course.
My objections would be rendered meaningless, were automated grading to move first into standardized testing and then (by logical extension) into the classroom. We would no longer be able to teach writing, but would be teaching the filling of forms, the establishment of a formula, and the arrangement of squiggles on a page or screen. No one really needs to learn all of that for any purpose but passing a test. Eventually, schools and colleges would realize this and would dispense with composition courses altogether.

They would dispense with them... until the day would come, necessarily, when someone would point out that new graduates were emerging with no ability to communicate in writing about anything they had learned or that they might discover.

Sunday, June 10, 2012

Assessing a Dynamic

Written communication is no static thing, no set of squiggles on a sheet of paper or a computer screen. It's a dynamic involving at least two individuals (or, at the very minimum, the idea of a second individual, even if that second one is simply an extension of the self). Assessing written communication through the squiggles alone, then, is a doomed exercise. It's about as useful as judging an automobile without ever starting the engine, let alone seeing it run.

Yet this is what many would like us to do. It is the bottom of a slippery slope that began when it was realized that different readers come up with different evaluations of the same written work--and do so even when both readers are teachers or experts. The first step taken by those so alarmed was the development of grading rubrics, guidelines meant to make sure everyone "read" a paper in the same way. This forced a new concentration on the squiggles, of course, and a move away from consideration of the dynamic. It also presupposed something of a Platonic form for the written "essay" (whatever that is), an ideal that all essays can aim for. All of this is nonsense, developed simply for ease of assessment. That is, it was developed so that writing "success" could be boiled down to a number that could be judged against another.

Todd Farley, whose book Making the Grade: My Misadventures in the Standardized Testing Industry details just how fraught with errors (to say the least) American educational assessment is, wrote recently for The Huffington Post about the newest phase of the mania for machine grading of written work. He shows just how limited a view of writing is being proposed for assessment:
Provocative thoughts in those essays? The automated scoring programs failed to recognize them. Factual inaccuracies? The scoring engines didn't realize they were there. Witty asides? Over the scoring engines' heads they flew. Clichés on top of clichés? Unbothered by them the scoring systems were. A catchy turn-of-phrase? Not caught. A joke about a nitwit? Not laughed at. Irony or subtlety? Not seen. Emotion or repetition, depth or simplicity, sentiment or stupidity? Nope, the automated essay scoring engines missed 'em all. Humanity? Please.
And that's just the start of it. The machine has no way of knowing if the essay is doing the job of communication that is its putative goal. It can only assess if the squiggles conform to a particular set of patterns set for the page.
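To make the point concrete, here is a minimal sketch (in Python, and purely hypothetical--no scoring vendor's actual engine is being described, and every feature and weight below is my own invention for illustration) of the kind of surface-feature scoring such engines rely on. Notice that nothing in it asks whether the essay communicates anything at all:

# A hypothetical surface-feature essay "scorer": it rewards conformance to
# patterns on the page--length, vocabulary, stock transition words--while
# knowing nothing about truth, wit, irony, or coherence.
import re

TRANSITIONS = {"however", "moreover", "therefore", "furthermore", "consequently"}

def score_essay(text: str) -> float:
    """Return a 0-6 'score' built only from countable surface features."""
    words = re.findall(r"[a-zA-Z']+", text.lower())
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    if not words or not sentences:
        return 0.0

    avg_sentence_len = len(words) / len(sentences)          # longer sentences score higher
    vocab_richness = len(set(words)) / len(words)           # more unique words score higher
    transition_hits = sum(w in TRANSITIONS for w in words)  # stock connectives score higher

    raw = 0.02 * len(words) + 0.1 * avg_sentence_len + 3 * vocab_richness + 0.5 * transition_hits
    return round(min(raw, 6.0), 1)

# A factually absurd, cliche-ridden paragraph does fine, so long as the
# squiggles fit the pattern.
print(score_essay("However, the moon is made of cheese. Moreover, cheese is, "
                  "therefore, the foundation of all lunar economics."))

Fed a nonsense paragraph stuffed with transition words, a scorer like this hands back a respectable number; fed a spare, witty, true one, it may not. That, in a few lines, is the whole of Farley's complaint.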

In Verbal Behavior, B. F. Skinner attempted to develop a system for considering speech (and, by extension, writing) as a dynamic instead of as a 'thing.' Even in the 1950s, he was able to recognize that we were missing something when we evaluated language use simply through sets of formal rules. We've come a long way since then--or so we like to believe. Why, then, do we constantly regress to a view of language that, even fifty years ago, was recognized as insufficient?

We can only assess writing (and speech) acts through their impact and the resulting dialogues or actions. We cannot successfully assess them through examination of only a part of them. Through, in the case of writing, squiggles.

Saturday, June 09, 2012

CFP: "Star Power: Celebrity Rule in New Hollywood"


I am looking for brief proposals (title and synopsis, 250 to 500 words) for essays that could be included in Star Power: Celebrity Rule in New Hollywood, an upcoming two-volume collection to be published by Praeger Publishers. The essays will focus on individuals important to contemporary Hollywood. They should be about 7,500 words (plus notes and bibliographical material) and will be due by January 31, 2013 (proposals by August 31, 2012). Contributors will receive a free copy of the set. Though I am not necessarily looking for essays by scholars, the essays should be scholarly. Email proposals to me at abarlowatcitytechdotcunydotedu.

Overview:
Today's top media stars parlay their star power into much more than money and immediate control. Many of them develop what amount to mini-empires where they can take on different roles in different projects, sometimes acting, sometimes directing, sometimes producing. This isn't new: building on the successes of Charlie Chaplin, Orson Welles, and a few others of the Studio Era, people like Woody Allen and Clint Eastwood began to expand their own activities a generation ago. Today, it is commonplace, but it has never been seriously examined. This set will look into how individual artists take success in one area of entertainment and use it to move into other entertainment activities.
Possible subjects (though there are many more who could be included):

Volume 1: The Power of Youth, Beauty and On-Screen Talent

 From the Disney Factory
1. Christina Aguilera
2. Miley Cyrus
3. Jonas Brothers
4. Britney Spears
5. Justin Timberlake

Star Power:
6. George Clooney
7. Matt Damon
8. Johnny Depp
9. Will Ferrell
10. Tina Fey
11. Mel Gibson
12. Angelina Jolie
13. Sean Penn
14. Robert Redford
15. Adam Sandler
16. Ben Stiller
17. Stanley Tucci
18. Denzel Washington
19. Keenen Ivory Wayans
20. Oprah Winfrey

Volume 2: The Powers behind the Camera

21. J. J. Abrams
22. Judd Apatow
23. Michael Bay
24. Kathryn Bigelow
25. Tim Burton
26. Coen Brothers
27. Matt Groening
28. Peter Jackson
29. Spike Lee
30. Seth MacFarlane
31. Tyler Perry
32. Chris Rock
33. Jason Reitman
34. Robert Rodriguez
35. Martin Scorsese
36. Ben Silverman
37. Kevin Smith
38. Steven Spielberg
39. Quentin Tarantino
40. Joss Whedon

Copyright Reversion

After enactment of the first copyright law in England in 1710 (the Statute of Anne), rights to works reverted to the author after a period of 14 years--if the author were alive. If not, the works entered the public domain. A living writer could renew copyright for an additional 14 years. Reversion to authorial control, in other words, was built into the law. This level of ownership of the work by the author (not the copyright holder, often a different entity) disappeared after the 18th century, all rights now resting with the copyright holder.

A step towards moving copyright control back to authors--in scholarly situations, at least--was taken by the Modern Language Association (MLA) the other day. The new agreements:
leave copyright with the authors and explicitly permit authors to deposit in open-access repositories and post on personal or departmental Web sites the versions of their manuscripts accepted for publication.
In the past, control of copyright (as in most other instances) has remained with the publication. Now it is beginning to be returned to the scholars, setting a new standard for ownership of scholarly work and constituting a frontal assault on the barriers to dissemination of scholarship set by academic publishers, especially the commercial ones.

According to InsideHigherEd:
The new MLA policy appears to move beyond those of other humanities organizations -- although some of them have created ways to work with authors who want their scholarship in open access repositories. The American Historical Association, for example, holds copyright on articles that appear in its journals, but its author agreement tells authors that -- if they ask -- they will be granted permission to post articles in repositories and on personal websites. The Organization of American Historians -- which publishes The Journal of American History with the Oxford University Press -- gives authors a link that can be used for open access repositories. But Nancy Croker, director of operations for the OAH, said that "we do hope that an author would not circulate their article in such a way that it jeopardizes the integrity of the publication as a whole."
In other words, Croker is saying that publications come before authors and come before the needs of the community of scholars, that access is some sort of gift, not a right. The publication still wants control. With its new policy, the MLA is arguing otherwise, that the publications will be better served by openness than by any semblance of control of work that was, of course, done by others in the first place (and, for the most part, done unpaid by the publications).

This is a great advance--even if it is really only a reversion. Let's hope that other publishers of scholarly work will be willing to follow suit. Of course, they may eventually have to: fewer and fewer of us are willing to pay to go behind publisher firewalls, especially when we are paying only to fill the coffers of commercial enterprises, not to support the work itself.

Friday, June 08, 2012

Just What Is "Breitbarting"?

Well, for one thing, it is not journalism.

When I wrote "The Pride and Reward of Falsification: Post-Objectivity as Post-Responsibility"--an essay centering on Andrew Breitbart and James O'Keefe--for the recent book News with a View (edited by Burton St. John and Kirsten Johnson), I wasn't thinking in terms of a specific, definable strategy, certainly not one that would come to be known as "breitbarting." I should have been.

Though O'Keefe styles himself as following in the tradition of the muckrakers of a century ago, the purpose of breitbarting is quite different. Instead of exposing corruption, the purpose of breitbarting is to present the appearance of corruption. This is what O'Keefe, with diminishing success, tries to do. This was what made Breitbart famous before his untimely death.

Breitbarting, in fact, is all about appearance--and about self-promotion (in that, it does go back to the muckrakers, but that's for another time).

The prototype breitbarter was "Jeff Gannon" (James Guckert), whose shenanigans within the White House press corps led, among other things, to the founding of ePluribus Media. Guckert created the persona "Jeff Gannon," a putative journalist, to gain access to the White House press room on day passes. The ultimate goal was to ask President Bush a loaded question under false pretenses for the purpose of demonizing the opposition and drawing attention to "Gannon." This succeeded, but "Gannon" was also quickly exposed as a result--and in a way quite unflattering and beyond his control.

Breitbart and O'Keefe learned from the "Gannon" example. Real breitbarting, when it appeared, moved beyond "Gannon" in a couple of significant ways. First, the breitbarters made sure they were ready to expose the deception themselves, and to use that as part of the story. Second, they never use the technique within a friendly or neutral venue, making sure that, no matter the consequence, they can always fling any charges of deception back on the people or place they have deceived.

None of the breitbarters participates in journalism, but that was never really their point. To paraphrase the old ad, they only play journalists on TV... and that is the point:
They use the trappings of [journalistic] objectivity while manipulating information to produce proof of the point or belief that had brought them to the story in the first place. [...] [P]artisan activists have had no compunction about styling themselves as journalists, and the public has had little reason not to accept them as such. ("The Pride and Reward of Falsification," 30)
"Breitbarting," then, is the construction of a persona for the purpose of infiltrating a stronghold of a political enemy with the intention of capturing on video responses to loaded questions or scenarios constructed for 'damned if you do, damned if you don't' outcomes.

The technique has come to attention again as breitbarting moves beyond Breitbart's own "Big" websites and O'Keefe's antics, this time at the Netroots Nation conference in Providence, RI. Someone named Anne Sorock attended a panel with the express purpose of embarrassing Native American panelists with a question concerning Elizabeth Warren's ancestry. She was exposed in the midst of her attempt, but posted her story anyway. As usual with breitbarters, she manipulated the event she wrote about, both in her words and through selective editing. This time, however, she was called out--first at the conference, and then in the comments on the site she writes for, by someone who also attended, writing under the name "Geekmobile."

One of the brilliant parts of breitbarting is that the breitbarters have learned that being called out, even immediately, even irrefutably, doesn't matter. What matters is the attention. Sorock may or may not be on her way, now, to media stardom, but that (and confirming right-wing prejudices) is her intent--the truth or falsehood of what she has "uncovered" is irrelevant.

Tuesday, June 05, 2012

Learning, Teaching, and Talking

Just what does a Massive Open Online Course (MOOC) have to do with education? In a couple of posts last month (the first one is here), I hinted that their real relationship to education is pretty much the same as that of any tool (books, for example), but that they can rarely be the basis for education by themselves. Access to information, in other words, is not the same as access to education. That's why, as a college professor, I see no threat from the entry of such educational behemoths as Harvard and MIT into the MOOC business. If anything, they are simply providing another way for my students to get the information they need in order to start learning.

Steve Krause makes an attempt to explain this by differentiating learning, teaching and credentialing, concluding that he really doesn't see MOOCs as "the future of higher education on the internet." I agree. As Krause says, MOOCs allow little room for teaching--and teaching is an integral part of higher education.

When I was living in Chicago in the uncertain economy of the 1970s, some three or four years out of undergraduate school, working in inventory control for an import house and (later) in the parts department for a car dealer (and selling cars at night), most of my entertainment was books, mainly science fiction and mysteries. As time went on, these became less and less satisfying and I cast around for books that had a little more meat to them. I remember running across Balzac's Pere Goriot, which led me to a number of his other novels... but it was Faulkner's The Hamlet that galvanized me.

I realized, when I put the book down, that my knowledge of literature was paltry, at best. I wanted to know more, and to read more, but didn't see my nearly random browsing through used bookstores and libraries as sufficient--nor did I think any particular 'great books' list would help. I wanted to be able to follow the pathways that books opened up for me, but felt I needed discussion and guidance, discussion that I could participate in and guidance tailored to my own particular interests and inclinations. I needed teachers.

So, I decided, what I needed to do was to go to graduate school, to get a Masters degree in English.

Fortunately, though my undergraduate degree was in Philosophy, I had graduated from a college (Beloit) with a strong interest in teaching, where one of the things I had learned was how to use a teacher to further my own education. I had also come to understand that a great deal of education arises from interaction with fellow students. Through this, I was better prepared for graduate school than I believed at the time.

And, partly because of that preparation, I loved it.

At that point in my life, I had no interest in becoming an academic, but the reading was invigorating, as was the give-and-take with professors and students. Sometimes I think that, if I could have, I would have stayed at the University of Iowa forever... would that the glory of exploration could have always stayed with me. The educational environment was delicious, though by the time I had moved on and was nearing completion of my PhD, I realized that one does, at some point, have to emerge from that cocoon.

I was never a particularly good student, never a star. Instead, I gained from being around stars, both students and professors, whose presence allowed me to try things I never would have tried on my own.

For the real stars, perhaps, being in the environment of an educational institution may not be absolutely necessary. But not everyone is cut out to be an autodidact--I know I wasn't. I needed to be around people who were doing things, who were thinking and talking--who were excited about the world and about learning through exploration. And who loved sharing what they found, who loved teaching.

Schools like Harvard, MIT, and Stanford, all involved in MOOC projects, understand this full well, and know that the MOOCs are no competition for what they are doing on their campuses. So do most American high-school students when they are applying to college. If they can, they want the experience that I had, for they understand (almost innately) that the experience gained at an 'elite' college or university will help them more, in the long run, than anything a MOOC or any other online "education" can provide.

For most of us, our real educations arose from interaction, and interaction on a close, personal level. At this point, though they may help, online teaching aids cannot replace that.

Monday, June 04, 2012

Just What Are We Assessing? (Sigh)

High-stakes reading comprehension exams for high-school students--at least one that I know of--give line references pointing to the source of the correct answer to each question. It is possible to get a perfect score without ever having read the passage, just by having used the cues. What, then, is being assessed? Certainly not the ability to read a passage, digest the information, and carry it forward in another context. At best, assessment of these test results is of the ability to succeed at a sort of treasure hunt where prizes have been seeded and slightly oblique hints provided.

Is that useful?

Of course not. And it privileges those who have gone through some sort of test prep. Those who have simply learned to read well, and who follow the instructions to read the passage first, are at a disadvantage, especially since the test is timed.

But who cares? Who is assessing the assessment?

Nobody.

We've gotten so far removed from any assessment of assessment that debacles like New York State's pineapple-and-hare question only come to light when students point out their idiocy. Testing giants like Pearson are developing questions simply as questions that follow a formula, not as instruments that serve any real purpose of evaluating student progress in learning.

Garbage In, Garbage Out?

It certainly seems that way. The tests provide the garbage (the questions), the students stir it around a bit, and numbers are produced from their activity--more garbage.

The other day, Diane Ravitch posted a blog with the plaintive title "Why Do We Treat the Tests As Scientific Instruments?" Good question, and one that she has been asking for some time. As have I. As have many others.

The response?

A resounding silence.

Or simply more claims that we have to have "data." Only then can we effectively evaluate our schools, our teachers, and our students.

But what if that "data" is, in reality, garbage?

Ravitch writes:
Why do we (and state legislatures and the U.S. Department of Education and the media) treat these tests and the scores they produce as accurate measures of what students know and can do? The reader [who had asked a question sparking the post], who clearly is a teacher, reminds us that the tests can’t do what everyone assumes they can do. They are subject to statistical error, measurement error, and random error. They are a yardstick that ranges from 30″ to 42″, sometimes more, sometimes less. Yet we treat them as infallible scientific instruments. They are not.
Not only are they not "infallible scientific instruments," but their value as creators of any useful information is doubtful, at best.

Writing tests have to focus on the page and not on communication, but it is communication that is the heart and soul of writing. Why this focus? Because "communication" is almost impossible to assess numerically, while formulaic usage complying with a standardized grading rubric can be--if we ignore the fact that there is a subjective element even to assigning the numbers for parts of the rubric, something we paper over through what is called "norming," making sure every grader gives a particular test approximately the same score. Students are assessed on a kind of writing that meets established rules, but it is not a kind of writing they will engage in anywhere beyond classrooms preparing them for standardized tests.

In some respects, what students are taught to do isn't even writing, but putting together pieces of a jig-saw puzzle. Little of it has anything to do with effective communication.

It's long past time that we start assessing the assessments, but are we going to do it?

No.

There is too much invested in high-stakes testing (the entire, and hugely profitable, "reform" movement in education is based on it) for anyone but the few on the fringes to call out that this emperor has no clothes.