
Bandwagon Arguments in Academia, Redux

(9/11/23) I received an intriguing request for a short essay I’d forgotten I wrote. After some digging in the archives, here it is again, with some light editing. 2018, maybe? In general, the thoughts here still hold true. Maybe more so now.

One of my pet peeves when reading academic arguments is the persistent and lazy use of the bandwagon fallacy – i.e., many people think X, so X is right. The academic version, though, tends to run more along the lines of “The vast majority of qualified scholars in this subfield think X, so X is right.”

Where should I begin my critique, I wonder? That popularity is no guarantee of validity? That popular ideas deserve to be interrogated just as much as unpopular ones? That the unprofessional arrogance displayed by using this fallacy is only trumped by its stupidity? That taking such a position attempts to cut off future productive scholarship at the knees? And, perhaps finally, that using it is a sure sign of the weakness of one’s position?

Yes, this is a target-rich environment, to be sure. Let’s try some examples.

Exhibit A – “Best Practices”

If I had a nickel for every time someone appealed to “best practices” in my semi-home field of rhetoric and composition and its sister field, technical communication, I would be able to take my family out to a series of nice dinners.

Behind the concept of “best practices,” it turns out, is a crude bandwagon argument. To follow “best practices” in teaching in tech comm, for example, is to use the techniques that are well attested in the scholarship, supported by “name” academics whose “names” can be dropped liberally in conversation, and that are ultimately safe and uncontroversial.

Screw that.

I don’t care if 99.9% of the members of NCTE (the National Council of Teachers of English, BTW) support a given mode of instruction. I only care about whether it works. Show me that it works – not how popular it is, or which academics happen to endorse it. Give me evidence, not sponsorship.

I have known very few real top-flight scholars in my career thus far. If they have something in common, though, it would be that none of them follow trends or take a poll before they plant a flag. The pursuit of knowledge eschews such petty and empty considerations – and so does logic. Someone dedicated to such an ideal would never use popularity as evidence of anything except popularity. Academic arguments are to be evaluated on their own merits, not on whether or not they are in season.

So, in short, while “best practices” might have once had a more innocent connotation, now it just makes me irritable. It represents the worst of academia, when it is at its pettiest – when it is political.

Exhibit B – A Historical Jesus

I’m gearing up to teach the Synoptic Problem in Studies in Religious Texts again, so this has been on my mind of late. One of the subtopics that naturally comes up with the SP is how much of the gospel material is based on any historical Jesus – which then leads to the question of whether there was a historical Jesus, and if so, what we can say about him.

“Mythicist” arguments, arguing that Jesus has no historical basis and instead is a kind of assembled myth, are as old as the hills, dating back to the first pagan critics of Christianity. I’m agnostic on the issue due to what I see as a failure of everyone writing or speaking on the matter to make a decisive case (due to the paucity of evidence in any direction) but I am frankly peeved at the standard position – that mythicism is nonsense because no mainstream biblical studies or religious studies academic thinks there wasn’t a historical Jesus.

Now, I hardly need to point out by now that such an argument is one big bandwagon fallacy (as well as an appeal to authority, but I’ll leave that one for some other day). It is telling a questioning undergraduate to sit down and shut up, pulling rank, asserting the primacy of one’s subdiscipline, and being an arrogant twerp, all at once. These are all things I despise and oppose.

So I have a certain sympathy for the mythicists as underdogs. That doesn’t mean they are right – they still have to make a case, and so far no smoking gun has appeared – but they have a decent case that is just as strong as the default one.

So why do they get such a hostile reception? Why the flippant and repeated use of the bandwagon fallacy in response (occasionally laced with a choice insult about one’s employment prospects, educational background, and sanity)?

Well, let’s return to rhetcomp for a moment. The most telling and long-lived idea in rhetcomp is process pedagogy – the belief that writing is a process rather than a product and should be taught accordingly as a series of repeating and mutually informing steps instead of emphasizing the text that results. Now, feel free to correct me if I’m wrong, but I can’t think of a single instance of a process compositionist slapping down anyone who challenged or questioned process by saying, “The vast majority of composition academics support process theory. Therefore, your argument is a fringe belief and not worthy of a full treatment.” If such a pretentious mandarin exists, please send me a citation, but I don’t think one does, or ever will.

Now, at the same time, there is that old chestnut mentioned before – “best practices” – which is used instead to enforce consistency. But as it turns out, “best practices” is mostly political cover, because it can mean whatever the instructor wants it to. Composition is a field full of rugged individualists. Some are old-school grammar mavens, some are process fanatics, some are post-process theorists, others are expressivists, and others (really most) defy easy categorization. We know how to cite selectively. Some of us resist this, of course, but not all – not even most.

Back to the historical Jesus. There is a great wiki page that has collected countless putdowns of mythicists (they are all down near the bottom). Perusing them will reveal that they are basically all variants of the same technique: bandwagon fallacy + insult to education, occupation, or sanity + optional ridiculous comparison to Holocaust denial.

Why are they all the same? Why so prevalent?

First, there is no downside. Picking on mythicists is a risk-free power projection. It’s functionally no different than a bunch of jocks stuffing a nerdy kid into a locker. I have more power than you, so in the locker you go. There is no penalty.

Second, and more fundamentally, the nerdy kid is an existential threat. He represents a counterargument to the jocks’ primacy – a sign that logic and curiosity might trump his relative powerlessness once outside the artificial world of the school. Similarly, the biblical studies folks know their authority is severely limited outside of academia, and in particular, the theological schools. Outside of it, free thought reigns. Can’t have that. The existing pecking order must be maintained, at least temporarily. In the locker you go.

In a perfect world, biblical studies academics would lay open the question of a historical Jesus. But in order to do that, they would have to open their minds. And if you think the average person has trouble with that little task… well. It’s not a question of a threat to the existence of the discipline. Opening up the question would doubtless lead to an explosion of relevant literature. It would be good for the field, showcasing at last a bit of historical respectability. And such studies do exist.

But the possibility is a clear threat to individual egos – which is why I think the jock-bully comparison is apt. There is nothing more fragile than a bully’s ego. It has to be constantly fluffed and pampered like Donald Trump’s pseudo-hair. Otherwise it falls apart. Why? Because, ultimately, there isn’t much under the combover. There is no defense of a historical Jesus that doesn’t special-plead Christian sources – which brings me to my favorite example.

Exhibit C – The Book of Mormon

The non-Mormon academic consensus is that Joseph Smith, the founder of Mormonism, was a fraud. The Book of Mormon was not written from golden plates handed over by the angel Moroni, but cobbled together from 19th-century Protestant mysticism and the KJV. The jocks are very clear about this.

However, there is another body of academics who call themselves experts on the Book of Mormon – and they are all Mormons. They have all kinds of arguments supporting the authenticity of the text, including sworn eyewitness statements – the famous Three and Eight – to the existence of the golden plates, along with literary analysis purporting to show its originality (check out Orson Scott Card’s defense sometime – it’s fascinatingly doltish).

So there is a problem here, namely that there is more historical evidence for the inspired composition of the Book of Mormon than there is for Jesus, even though the form of the offered evidence – multiple eyewitnesses – is basically the same. And yet the mainstream historians make quick sport of Smith, and defend Jesus’s historicity to the death.

How, you might wonder, can they so easily expose the recent formation of a religion as a fraud, yet secure certain historicity for someone supposedly dead for nearly two thousand years, for whom we have no reliable non-Christian attestation?

The reason the dice keep coming up seven and eleven is not the incredible luck of biblical studies. It’s because the dice are loaded. And if you point this out? Well, the majority of academics support X. Back in the locker, you.

One more thing.

Another quality I have noticed in scholars, as opposed to academics, is that they almost never defend anything. Instead, they assault. I would use another metaphor family, but the martial one is just too fitting. The target might be an unexplored area, a neglected position or subject, or a trend that has spiraled out of control – but they are always aggressive, constantly stalking and pouncing like half-starved tigers, relentlessly seeking improved understanding.

Playing defense is, after all, the slow death of anything resembling intellectualism. You trade in a life of seeking new ideas and understanding for the apologetic goal of preserving the beliefs of the past, usually in exchange for minor power of some sort – employment, tenure, social respectability, money – the usual earthly rewards. Maybe you get paid in spiritual coin instead, but either way, it sounds like a devil’s bargain to me.

But what do I know? I’m just an English professor, of questionable sanity. My arguments couldn’t possibly have any merit. I’m a member of the lunatic fringe – a crackpot, a veritable crank, a babbling child talking of adult things he couldn’t possibly comprehend.

And that is how the bandwagon fallacy is essentially the ad hominem fallacy in another guise; by elevating the group, it savages the individual. This is why it deserves the fiercest opposition we can muster.


Would a Birth Certificate Settle It? Probably Not.

Well, the Tampa Bay Times beat me to it.

So I didn’t watch the GOP mock debate, but apparently DeSantis related an unusual story about a baby aborted at 23 weeks in 1955 and left in a bedpan outside, who survived and is today an anti-abortion activist.

That didn’t seem possible given 1950s neonatal care, save a papal-level miracle, but it occurred to me that the birth might be old enough for the birth certificate to be public. I dug around a bit last night because I like archival stuff like this.

Unfortunately for the purposes of public fact-checking, Florida birth records are sealed for 100 years. Plus, the certificate wouldn’t necessarily be accurate in a sensitive case like this. The Times’s piece found the same info about the Browder family that I did last night, though they also seem to have found a newspaper clipping or two that reported the birth weight, if not a verification of the term.

A lot of family stories in this vein can be partially legendary. I alternately debunked and confirmed several things my grandparents told me while doing the family genealogy. For an example of a more innocent, sign-of-the-times kind of story: my grandmother often mentioned that she listed herself as 18 instead of 16 on her marriage certificate, a fact confirmed by her presence on an earlier census with an age discrepancy.

I don’t doubt that Browder/Hopper was born very prematurely and that the outline of the story is true, but a far more likely scenario that doesn’t contradict the broad narrative would be that her mother was farther along than 23 weeks.


Weizenbaum’s Computer Power And Human Reason

Computer Power and Human Reason by Joseph Weizenbaum was first published in 1976. My interest in ethics had not, until recently, steered me toward it, but having now obtained a copy and considered Weizenbaum’s arguments, I’m quite pleased I did.

It is not a book about computers, or programming, or “computer science,” or even an anti-AI screed. Rather, I’d call it a set of increasingly intense philosophical essays about what computers are capable of and what they should be used for, two things that are not necessarily the same, as well as what it means to be human and what it means to be ethical.

Weizenbaum holds that “computers,” speaking broadly, may eventually exhibit what we might call intelligent behavior, but that behavior, limited by the digital switching of 0s and 1s, will be alien and fundamentally different; they cannot ever have human intelligence. Why? Because humans have self-directed goals, purposes, and wills, and interact with the world in a fundamentally different way (having experiences rather than data). We can both exhibit and feel pride and cowardice, fear and joy – all qualia – and most importantly, we can judge matters, rather than simply make decisions, based on our unique, relative experiences. While computers excel at laborious bureaucratic tasks beyond any single human, they cannot ever have the human experiences that actual humans use as the foundation for their values, which in turn allow humans to make judgments.

Weizenbaum repeatedly quotes a former colleague who challenged him to come up with something a judge (presumably of the legal variety) could do that a computer could not, the colleague’s answer being a flat “nothing.” The vigorous humanism of his book-length rejoinder is something to behold. Not only does he castigate the slippery-slope, positivist argument that human-like AI is inevitable, like all technological progress, but he twists the knife further, noting that the decision to pursue AI research blindly is itself inhuman. As computers cannot judge like humans, they cannot be ethical, and Weizenbaum warns that they should never be given work that involves judgment. There are hints of the networked world-to-come in his chapters, but like anyone else in 1976, he doesn’t see just how quickly miniaturized, networked computers are coming.

What he does see clearly are the ethical concerns. He notes that any future speech recognition will ultimately only serve the cause of increased surveillance – check. Weizenbaum was the programmer behind ELIZA, the famous therapist chatbot, and was alarmed at how quickly some people connected to its lines of code as if it were a real human being.
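It is worth seeing just how little machinery that connection required. Below is a minimal sketch, in Python, of the kind of keyword-matching and pronoun-reflection trick ELIZA relied on – the patterns and canned responses are my own illustrative inventions, not Weizenbaum’s original DOCTOR script, but the technique is the same:

```python
import re

# Illustrative reflection table: swap first- and second-person words
# so that echoed fragments read naturally ("my exams" -> "your exams").
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are", "you": "I"}

# Hypothetical keyword rules in the ELIZA style: find a pattern in the
# user's input, then echo part of it back inside a canned therapist frame.
RULES = [
    (re.compile(r"i need (.*)", re.I), "Why do you need {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]

def reflect(fragment: str) -> str:
    """Apply the pronoun swaps to an echoed fragment."""
    return " ".join(REFLECTIONS.get(word.lower(), word) for word in fragment.split())

def respond(utterance: str) -> str:
    """Return the first matching canned response, or a default prompt."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please go on."  # no keyword fired; classic ELIZA-style fallback

if __name__ == "__main__":
    # "I am worried about my exams" ->
    # "How long have you been worried about your exams?"
    print(respond("I am worried about my exams"))
```

That a handful of substitution rules like these could pass, for some users, as a sympathetic listener is precisely what alarmed him.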

What would he think of modern speech recognition and generative AI? Nothing good. My earlier assessment of ChatGPT is more or less the same as his description of the limits of AI, though he pushes it much farther, noting (even then, as today) the increasing dehumanization and automation of modern society, and lamenting the passive acceptance of an overly computerized future in which humans cede more and more power to computers that can never have any real knowledge of human experience, and accept, without thinking, an overly technical approach to complicated human problems.

There are two related passages I’d like to reproduce here, as they spoke to me as a sometimes disgruntled English professor:

During the time of trouble on American university campuses, one could often hear well-meaning speakers say that the unrest, at least on their campuses, was mainly caused by inadequate communication among the university’s various constituencies, e.g. faculty, administration, students, staff. The “problem” was therefore seen as fundamentally a communication, hence a technical, problem. It was therefore solvable by technical means, such as the establishment of various “hotlines” to, say, the president’s or the provost’s office. Perhaps there were communication difficulties; there usually are on most campuses. But this view of the “problem” – a view entirely consistent with Newell and Simon’s view of “human problem solving” and with instrumental reasoning – actively hides, buries, the existence of real conflicts…

… instrumental reason converts each dilemma, however genuine, into a mere paradox that can then be unraveled by the application of logic, of calculation. All conflicting interests are replaced by the interests of technique alone.

p. 266

This man certainly worked at a university.

The last chapter, “Against the Imperialism of Instrumental Reason,” is a powerful attack on a soulless worship of reason as inhumane. The climax of the argument, for me, is this:

The lesson, therefore, is that the scientist and technologist must, by acts of will and of the imagination, actively strive to reduce such psychological distances, to counter the forces that tend to remove him from the consequences of his actions. He must – it is as simple as this – think of what he is actually doing. He must learn to listen to his own inner voice. He must learn to say “No!”

Finally, it is the act itself that matters. When instrumental reason is the sole guide to action, the acts it justifies are robbed of their inherent meanings and thus exist in an ethical vacuum. I recently heard an officer of a great university publicly defend an important policy decision he had made, one that many of the university’s students and faculty opposed on moral grounds, with the words: “We could have taken a moral stand, but what good would that have done?” But the good of a moral act inheres in the act itself. That is why any act can itself ennoble or corrupt the person who performs it. The victory of instrumental reason in our time has brought about the virtual disappearance of this insight and thus perforce the de-legitimization of the very idea of nobility.

p. 276

Bravo. The closing chapter is quite strong, but I’ll limit myself to one more paragraph:

… It is a widely held but a grievously mistaken belief that civil courage finds exercise only in the context of world-shaking events. To the contrary, its most arduous exercise is often in those small contexts in which the challenge is to overcome the fears induced by petty concerns over career, over our relationships to those who appear to have power over us, over whatever may disturb the tranquility of our mundane existence.

p. 276

When we do not think what we choose to do matters, that is a remarkably good indicator that it does.

The insidious nature of the worldview Weizenbaum critiques, then, is that it is a mental trap – one that shuts down what makes us human: our will and agency.

Computers, by the end of the book, become a metaphor or tool for understanding what makes us human – and what does not. There is a very powerful cumulative argument that the highly specialized knowledge computer science and data-driven research claim to possess is at a serious disadvantage compared to the humanities’ comfortable familiarity with ambiguity. The discussion of language models and composition in earlier chapters suggests Weizenbaum was not field-cloistered from literature and writing – this is an interdisciplinary work.

When I read such arguments, I think about the contemporary anti-intellectual politics of Florida and Texas, but I also think about the larger awareness of the “rhetoric of science” concept since the writing of this 1976 book and the mixed and increasingly sour bag of candies that the Internet turned out to be. I also think about every interaction with a corporate entity I’ve ever had, and how my own university works.

It’s hard to find a print copy of this book, but an ebook version is not difficult to find. I highly recommend it. It has aged well. As a closing thought, the epistemology of The New Rhetoric seems quite compatible with Weizenbaum’s ideas here as a reckoning with WWII, though his examples primarily concern Vietnam.