A computer can give more effective and timely feedback than I can…sometimes. The potential of “robo grading” excites me. In the case of redundancies, clichés, passive voice, sentence variety, and other writing concepts, a focused report generated by an algorithm can do more than I can. I have used Writer’s Workbench in the past, my colleagues have used ETS Criterion, and I want to try White Smoke. Of course, a free option via Google Add-ons has enormous appeal.
As I mentioned in my previous post, I wanted to let my students experiment with ProWritingAid, one of the new Add-ons offered in Google Drive. This week students submitted novel analysis essays on The Catcher in the Rye, so I imagined online editing reports could get them started on revision while I mark the essays.
Unfortunately, my students and I are underwhelmed. Granted, we need to give ProWritingAid a better chance: these opinions are based on just one test of about 30 minutes, and we will try it again very soon. Based on this week's toe-in-the-water trial, though, the feedback was too clunky and abstract to be of much use.
Our feedback on ProWritingAid’s feedback:
Very easy to use: Setup took all of three minutes, and students required no instruction.
Free…sort of: They offer a free 14-day trial, but then the worthwhile features require payment. Judging by the quality of the feedback during the trial, other paid products offer better value.
Works better for an experienced writer: Other online feedback systems offer users tutorials on writing concepts. A student, then, can run a passive-voice report and, before editing her work, receive some brief instruction and practice with the concept. Without this feature in ProWritingAid, my novice writers felt lost with the feedback. For instance, I found the “sticky sentences” report very helpful, but my students just got stuck. They could not make changes without my help.
Works better on longer pieces of writing: Students’ essays were around 900 words, so the reports noted only a few errors. I keep all first drafts of my blog in one Google Doc, so I ran reports on last year’s posts. When dealing with 25,000+ words, clearer patterns emerged. In this case, it seemed the longer the piece, the more effective the feedback.
Feedback is slow: An admittedly churlish point considering the speed of my own feedback on student essays, but for a robograder, ProWritingAid has too much wait time. When running the “clichés and redundancies” check, for instance, ProWritingAid highlights all clichés and redundancies in different colors, but to make edits, users must work in a pop-up window and scroll through the errors sequentially. In a large document, this takes too long: to reach error number 21, a user must scroll through errors 1–20 first, and that feels clunky, especially with a two-to-five-second lag on each click of the “next” button. Students quickly gave up. I did, too. We could also scroll through the document and click on the highlighted text to call up the corresponding feedback, but then we had to wait 10–15 seconds for the pop-up window to catch up. Adding to our frustration, about half the time the pop-up feedback would not sync to the highlighted text no matter how long we waited.
Overall, I am thankful for Add-ons. Having these features certainly improves the overall user experience of Google Drive, but in the case of ProWritingAid, I found the lack of tutorials and the lag time between the document and the pop-up window to be too frustrating. The feedback wasn’t worth the wait.
We will try to work with this tool again, but I don’t see it becoming a regular part of our routine. I very much want to make a robograder part of my English classroom, though, so if anyone has recommendations, please share.