My first new post in a while. And I’m ranting — albeit somewhat quietly. This time I’m reacting to a newly published research article about assessing student writing within MOOCs. Balfour, the author, provides a review of two technologies for assessing writing when you have a huge student-to-teacher ratio: Automated Essay Scoring (AES) and Calibrated Peer Review (CPR). Follow the links if you want to know more about either of them. If I were held at gunpoint and told I had to use one, I’d definitely pick CPR. But neither of these tools will address the concerns of teachers, employers, policymakers, etc. about workplace writing.

Here’s a somewhat random collection of representative concerns:
- “The writing skills of those coming into the workforce are more deficient than ever before.” Senior vice-president of a commercial real estate brokerage quoted by the Vancouver Sun in 2012.
- “While M.B.A. students’ quantitative skills are prized by employers, their writing and presentation skills have been a perennial complaint. Employers and writing coaches say business-school graduates tend to ramble, use pretentious vocabulary or pen too-casual emails.” Diana Middleton for the Wall Street Journal in 2011.
- “Recent graduates may be trained in academic writing, but we find that kind of writing too verbose and wandering.” Member of the Business Roundtable as quoted in the National Commission on Writing’s 2004 report, Writing: A Ticket to Work . . . Or a Ticket Out.
- “It’s impossible to calculate the ultimate cost of lost productivity because people have to read things two and three times.” Governor Mike Huckabee quoted by USA Today in 2005.
- “. . . employers need to face the reality that many of their employees are poorly prepared to meet the responsibilities of everyday business writing. Remedial training can be costly, but the alternative is to risk losing critical opportunities in an already challenging economic environment.” Bill Kozel for Training magazine in 2009.
- “. . . educational reform must expand to include ideas; the ability of students to think, reason, and communicate . . .” The National Commission on Writing’s 2003 report, The Neglected “R”: The Need for a Writing Revolution.
These concerns focus on the ability (or lack thereof) to COMMUNICATE a useful message professionally to a workplace audience. Computers cannot yet accomplish anything close to human communication. The National Council of Teachers of English published a position paper on the use of AES systems a couple of months ago. One of their points:
Computers are unable to recognize or judge those elements that we most associate with good writing (logic, clarity, accuracy, ideas relevant to a specific topic, innovative style, effective appeals to audience, different forms of organization, types of persuasion, quality of evidence, humor or irony, and effective uses of repetition, to name just a few). Using computers to “read” and evaluate students’ writing (1) denies students the chance to have anything but limited features recognized in their writing; and (2) compels teachers to ignore what is most important in writing instruction in order to teach what is least important.
So computers cannot substitute for the judgments of human readers about professionalism or utility or accuracy or persuasiveness or . . . based on a written message. I say this as someone who worked closely on the development of one of the AES systems to assess business writing quality. I was initially optimistic because of my belief that linguists might now be able to help identify text elements that lead to judgments about things like professionalism. Don’t get me wrong. I didn’t believe computers could substitute for humans — just that they might approximate a meaningful sample of human judgments. But either linguists can’t do this or, more likely, the companies that own AES systems aren’t interested in putting in the effort required to develop such products.
All systems are currently designed to analyze “essays.” That’s where the money is. The market for standardized testing rules. For months, I tried to get the developers to recognize differences between essays and email updates or executive summaries. (I’ve argued many times before that essays are a very different genre from anything pros write at work.) Ultimately, the developers ignored me, despite my linguistic training and my 25 years of experience teaching professional writing. (I took a hot shower to get myself clean after collaborating with the enemy.)
I stand firmly with the National Writing Commission, which wrote in Writing and School Reform in 2006:
Training children or adults to exercise isolated skills or simple procedures can be done by the numbers. Unfortunately, educating children or adults to meet the complex demands of modern life cannot be carried out that way. Genuine teaching and learning are intensely personal, not scripted . . . They take place when minds engage around substance.
- MOOC Students Who Got Offline Help Scored Higher, Study Finds (chronicle.com)
- Amateurs Lack Genre Awareness (proswrite.com)
- From Reading to Writing: Why the Essay isn’t Working (dpod.kakelbont.ca)