Thursday
Jun 25, 2015

That Damnable Flag

When the subject of the War Between the States first arose in my childhood home (how it may have arisen I have no idea) my father carefully explained that the war was fought over something called “states’ rights,” which he had difficulty explaining further. He had learned that in school in southern Missouri, just as, before his time and after, millions of other schoolboys and schoolgirls had been taught across the states of the neoconfederacy.

Putting the best possible construction on it, let us say that they were so taught because they were innocents and it was thought desirable to preserve their innocence. But of course it was balderdash. It was a rewriting of history, an ex post facto justification of the unjustifiable. As historical analysis went, it was not only not correct; it was, to borrow a famous phrase from the history of physics, not even wrong.

I am wary of memory these days, but I think I felt at the time that it seemed an inadequate reason for war. What I may (or may not) have intuited but been unable to articulate is that one does not really go to war over abstractions like “states’ rights.” For those one goes to court. War requires very great and very material stakes. Many millions of dollars’ worth of property in human beings and their labor qualifies.

The transformation of a devastating war to preserve human slavery into a noble struggle to defend an abstraction was a remarkable accomplishment, one that was possible only with the wholehearted support of an entire population. Thus developed a culture that found glory in defeat and, protected by the freedoms that America celebrates, consolation in waving the symbols and relics of their “cause” endlessly before the rest of us. “The past is never dead,” Faulkner wrote of that culture. “It’s not even past.”

It’s now 150 years since the end of that war. The last veteran died 60 years ago. It’s 50 years since the Civil Rights Movement finally erased most of the structure of legal discrimination that had been built to protect the losers from the consequences. Yet only now has a portion of the ruling class of the South begun to consider that the Confederate battle flag may not be their best face.

It would be an irony of major proportions if an end of Confederate flag-waving actually did arise out of an evil deed by a victim of that very culture. And while I do not begin to expect it, it would be an even greater irony if somehow the end of the Civil War cult were to begin in, of all places, South Carolina.

For any who might be interested, here are links to my posts on the run-up to the sesquicentennial of that horrible time:


Sen. DeMint Ignores His State’s History (Dec. 17, 2010)

Countdown (Dec. 24, 2010)

Countdown, cont’d (Jan. 19, 2011)

Countdown, Part 3 (Feb. 4, 2011)

Countdown, Part 4; or, Madmen and Poetry (Feb. 8, 2011)

Countdown, Part 5: Surrender in Texas (Feb. 16, 2011)

Countdown, Part 6 (Feb. 18, 2011)

Farce to Tragedy and Back Again (Feb. 25, 2011)

Countdown, Part 7 (March 2, 2011)

Countdown, Part 8 (March 4, 2011)

Countdown: Interlude (March 14, 2011)

Countdown: Interlude 2 (March 22, 2011)

Countdown: Interlude 3 (April 2, 2011)

Of Course You Realize This Means War! (April 12, 2011)

Sumter Surrenders (April 14, 2011)

Robert E. Lee (April 20, 2011)

Jefferson Davis Wrings a Tear (April 29, 2011)

Justifying Civil War (June 25, 2011)

Tuesday
Jun 9, 2015

The Late Three R's

The linguist John McWhorter, a certified public intellectual, suggests in the Daily Beast that our culture is rapidly becoming one of oral rather than written expression and that this is not necessarily a bad thing. Hardly anyone reads books, he notes, and yet we’re doing OK. Or anyway, OKish.

He reminds us of the beautifully expressive and deeply moving letters written by ordinary soldiers in the Civil War, such as those that Ken Burns incorporated into his television series. He neglects to allow for the filtering effect of time and taste. No doubt there were other sorts of letters. “Dere Ma, I shot me a yanky yestidy, Luv Festus” is not so likely to have survived or, if it had, to be quoted.

The two exemplars McWhorter uses to illustrate his thesis about a new orality are Kim Kardashian and Cornel West. Not surprisingly, the editors of the Beast chose to illustrate McWhorter’s essay with only a photo of Kardashian; I infer that West lacks the eye-catching cleavage. Also not surprising, though a good deal less in our collective face, is that McWhorter expresses his thesis and provides evidence and argument for it in the form of a 2,400-word written essay, which people who find the subject interesting are obliged to read.

What McWhorter seems to be endorsing is the abandonment of the kind of consecutive thought and meaningful debate that careful writing and reading permit. Certainly there are no grounds on which to suspect Kardashian of any such capability, and West has long since found a way into a comfortable cocoon where he need not bother. But there remain in the world matters meriting serious discussion. Oddly, McWhorter suggests that in this new age of orality

      ...a public intellectual’s main work could...consist of a series of 15-minute podcasts...displaying solid command of serious literature and ideas

but he fails to explain just where that literature and those ideas are to come from if not from a thinking and writing class.

And why would they bother, anyway? McWhorter wonders whether everyone needs to be taught how to write an essay. He does not consider that the answer might be, No, not everyone, just anyone who might need to be able to understand one when he or she encounters it. McWhorter suggests a new emphasis on teaching oral rather than written expression, reminding us that the Greeks were very good at instructing their young in the “oratorical skills of rhetoric and persuasion.”

“Toastmasters,” he says, “trains legions in the art of making an argument orally: upon what grounds do we reject that approach as inherently unscholarly or logically unsophisticated?” Well, for one thing there is the matter of reviewing what someone has said for accuracy or consistency and then reflecting on the merits of his case; one does not, cannot, do this on the fly, while the speaker is speaking. One awaits the printed version.

Certainly we recall scattered phrases from famous speeches: “Here I stand; I can do no other!” “I know not what course others may take; but as for me, give me liberty, or give me death!” “You shall not crucify mankind upon a cross of gold!” “I have nothing to offer but blood, toil, tears, and sweat.” “Ich bin ein Berliner!” Every one a rallying cry, a carefully crafted -- on paper, after much thought -- appeal to emotion. Premise, argument, demonstration -- not so much.

But the fact is that the 19th century that McWhorter holds up as the high point of finely crafted writing is also recalled by many as a great age of oratory. It is almost as though the two modes of language were interdependent. It is as though Abraham Lincoln had schooled himself by reading the King James Bible, Pilgrim’s Progress, Robinson Crusoe, and the Arabian Nights, then become one of the finest prose writers of his century, then composed and delivered some of the greatest orations in history. And if that is true, then perhaps it is both forms of expression that mysteriously have declined down to our sad day.

Your tweet, your instant message, your blog comment -- what do they convey? The impression of a moment, the stray thought, the unconsidered response to some random stimulus. Lincoln’s Gettysburg Address, at 272 words, nowadays could be compassed in a dozen tweets. Yet he worked over his draft for hours to attain just the effect he intended. It may be that he thought the occasion just that important, or that he respected his audience just that much.

Fifty years ago the student newspaper at the University of Michigan serialized War and Peace -- three or four lines a day, in the classified-ad section. That was meant as a joke.

Friday
May 29, 2015

Fed on Peeps, I Guess

It had to happen, I suppose. Just when I get around to turning 70, idiocy strikes my alma mater. Yes, I’m old and crotchety and a product of my times and all that. But, really....

A professor at Northwestern University (motto: “Whatsoever things are true....”) is under fire for daring to criticize the regulatory regime lately adopted to control relations between, or among, the genders. Her criticism appeared in that hotbed of revolutionary and/or reactionary (who can tell anymore?) manifestoing, the Chronicle of Higher Education.

Protest was knee-jerk swift and just as intelligent. Words like “violence” and “inflammatory” and “terrifying” were tossed about with the usual insouciance regarding their conventional meanings. Demands were made. And would any protest these days be worthy of the name if there were no mattresses? The professor is now under formal Title IX investigation.

Most damningly, “trigger” words were involved. <cue old man> Time was, a trigger was a mechanical device whose purpose was to cause the firing of a bullet. That would be “bullet” as in a small, hard projectile with the capacity to cause great bodily harm or death. Today’s triggers seem to produce a form of neurasthenia in some young persons and opportunistic rage in others.</old man>

It is hard to escape the conclusion that some of our students are not in it for an education. To them I might say, If all you are after is a certificate, that is what the University of Phoenix is for. No one there will provoke you. Others seem beyond the reach of education, as perhaps their upbringing and prior formal training have left them unable to cope with no longer being the superstar so comfortingly celebrated by the bumper sticker on Mommy’s SUV.

<cue old man again>Damned if we weren’t just a bit tougher in my day, and willing to learn something.</old man>

Tuesday
May 26, 2015

Thinking About Nothing

Twenty years ago or thereabouts I read Hubert Dreyfus’s What Computers Can’t Do (1972; actually, I read the 1992 revised edition called What Computers Still Can’t Do, relishing the nyah-nyah zing of the new title). Dreyfus is a philosopher of the phenomenological school, and in his book he effectively undercuts the project to build an artificial intelligence on the assumption that the brain and mind are essentially like computer hardware and software.

Dreyfus argues that intelligence, as we commonly understand and use the word, is inescapably an embodied faculty. That is, your brain/mind and mine are not simply OEM gear that would work equally well anywhere. They are yours, in the one case, and mine in the other, both intimately connected with, a part of, the body they seem to occupy. The body’s concerns are the mind’s as well. The mind is always the mind of this body, located here, with this particular view of the world and this history and these plans and worries and hopes.

This kind of argument struck a chord with me. The difference between the predictions and the productions of the standard AI folks gaped from the beginning and grew wider with each minor achievement. The hullaballoo that greeted IBM’s Deep Blue chess-playing computer was nothing short of absurd. The fault lay chiefly with the media, of course. Reporters had no interest in considering what Deep Blue actually was; they needed it to be brilliant, powerful, and just a little ominous. In fact, it was a machine that performed certain operations on bits of data. The “chess” happened at the point where the results of those operations were interpreted by human beings as though they were moves in a game of chess.
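
To make the point concrete, here is a toy sketch in Python -- entirely my own invention, and nothing like IBM’s actual software -- in which the “engine” does nothing but arithmetic on tuples of integers, and “chess” enters only at the end, where a human-written table of labels reads the machine’s output as a move:

    # A toy sketch, not Deep Blue's actual code. Everything the
    # "engine" below does is arithmetic on tuples of integers.
    def evaluate(position):
        """Score a position -- here, just the sum of its integers."""
        return sum(position)

    def search(position, moves, depth, maximizing=True):
        """Brute-force lookahead; returns (best move index, best score)."""
        if depth == 0:
            return None, evaluate(position)
        best_index = None
        best_score = float("-inf") if maximizing else float("inf")
        for i, move in enumerate(moves):
            _, score = search(move(position), moves, depth - 1, not maximizing)
            if (maximizing and score > best_score) or (
                    not maximizing and score < best_score):
                best_index, best_score = i, score
        return best_index, best_score

    # Only here, outside the machine, does "chess" appear: a human-made
    # table that interprets an integer as a move. The labels are invented
    # for illustration; to the program a "move" is just a function from
    # one tuple of ints to another.
    MOVE_LABELS = {0: "e4", 1: "d4", 2: "Nf3"}
    moves = [lambda p: p + (1,), lambda p: p + (2,), lambda p: p + (3,)]

    index, _ = search(position=(0,), moves=moves, depth=2)
    print("The machine computed index", index,
          "-- we read that as", MOVE_LABELS[index])

The program runs and “wins” its little search, but nothing in it knows, or could know, that what it was doing had anything to do with chess.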

Deep Blue does not play chess. It has no idea of what chess is. When it wins, it feels no gratification, for it does not know that it has won anything. Unless specifically programmed to do so, it will never propose a game.

(Imagine approaching the computer: “Yo, Blue! How they hangin’?” And Blue says “Yo, dog. Grab a chair. Black or white?” Ain’t gonna happen.)

For some reason, after reading Dreyfus I was struck by the thought, It’s death. Death is the difference. No robot is ever going to have human-like intelligence because the robot cannot -- or anyway, need not -- die. It is some mostly unconscious intuition of mortality that motivates human thinking. That there is an unspecified but inescapable outer boundary to our efforts somehow drives us, or draws us, into performance.

Years after reading Dreyfus, but still thinking about his insight and (dare I claim it?) mine, I even wrote this:

          Death is in us, somehow,
          Hid amid the DNA, perhaps, or
          Organelles in cells, passed down from
          Some bacterial mother Eve.
          It whispers unheard, a word within,
          To give us pause, or cause, or cast
          A sombre hue on all we do.
          And yet absent that hint of fate,
          That goad to go and think and make,
          Mere robots waiting for some command
          That never comes.

Comes now this article in the Chronicle of Higher Education about three psychologists who have developed what they call “terror management theory,” predicated on just this idea that the knowledge of mortality, of death, is what drives virtually all of human behavior. As a scientific hypothesis their idea is, or ought to be, subject to testing and review by properly skeptical peers. But it’s all soft science where they work. One objection noted in the article is that their hypothesis conflicts somehow with evolutionary science, which sounds rather more like the noise of a toe being stepped on than a helpful criticism.

The article refers several times to one variant or another of the phrase “terror of death.” I don’t know if that is the writer’s notion or the theorists’, but I quibble. Certainly it is a standard trope in literature, but that doesn’t warrant unexamined use in science. I suggest that the terror they correctly identify lies deeper, associated not with death but with non-being.

“Death” is an event, or as some would have it a “stage in life”; it is also a cartoon character (you know, black cloak, hood, scythe). It is something imagined, by naming it or by picturing it, and it is imagined precisely to shield ourselves from what is truly terrible because it is unimaginable, and that is the thought of not existing at all. If you, like me, have no expectation of an afterlife, then that thought of ceasing to be must give you pause, to say nothing of the heebie-jeebies and likewise the fantods.

Think of the moment you awaken from an anaesthetic. Then think of the moment just before that: nothing. No awareness, no thoughts, no dreams, no sensation. Just plain nothing. Now transpose that complete blankness to the end of your life. No consciousness, no memory, no nothing. You are not, and it is not simply that you are no longer but as though you had never been. There’s your terror. My mind simply bounces off, for that sort of utter nothing is, in every way, incomprehensible.

And so we invent Mr. Death -- no friend, but something that is in the world and so is seemingly comprehensible. He’s come for friends and family and he will one day come for us. Meanwhile we can dress up like him on Hallowe’en, make movies featuring him, in myriad ways make him something we can deal with. But mainly we can use him to screen ourselves from the terrible truth.

It’s no wonder that the idea of an afterlife is so appealing to so many.

Saturday
May 23, 2015

Q.V. second

This month's episode: Poet v. Poet; or, What's that word? Anal iteration?