The Latest AI Challenge to Teaching: ChatGPT

Since its launch in late November 2022, the new AI tool ChatGPT has had college instructors in a well-deserved frenzy. The free (for now) technology introduced by OpenAI is an easy-to-use chatbot that interacts with the user to generate serviceable, if bland, prose, among other things. And its output is good enough to have college instructors altering their pedagogy and the nation’s largest public school system banning its use entirely.

While the folks at OpenAI admonish students not to use the technology for schoolwork—going so far as to call the use of any automated writing tool “cheating”—instructors know just how little such warnings mean when students choose to turn in work they have not completed themselves. Validating this perspective, a recent Study.com survey found that 89 percent of its 1,000 student respondents had already used ChatGPT to help with a “homework assignment.”

Business communication is not exempt from this threat. Recently a Business Insider reporter used ChatGPT to create fictitious résumés and cover letters. While not perfect, the AI-produced documents were chillingly adequate. Hiring managers who reviewed them said they would have given the applicant an interview even though the letters and résumés “lacked personality.”

Some commentators see good in the situation. What’s the harm if ChatGPT is used to produce a first draft that the student later revises, or vice versa—the student writes a draft that ChatGPT edits? Writers have been using tools to streamline the laborious task of writing for centuries, from the thesaurus to spellcheck to Grammarly. Who among us, proponents of AI ask, has not used a citation generator?

One of those advocates is Dr. Cynthia Alby, a professor of teacher education at Georgia College & State University, who has written that AI is only going to improve. She suggests that instead of punishing and surveilling students who turn to it, instructors view the situation as an opportunity to focus on developing students’ information literacy, research and study skills, and metacognition. Dr. Alby envisions a new paradigm in which students build foundational skills through self-paced AI modules, ultimately freeing them to move on to higher-level learning experiences such as case-based learning and team projects.

Nevertheless, professors around the globe have celebrated the Princeton senior who recently created a tool that can detect whether a piece of writing was generated by ChatGPT. His GPTZero isn’t a panacea, however. Even its creator concedes that AI writing tools are here to stay and must be used “responsibly.”

Still, from a student’s perspective, ChatGPT is not just convenient; it even adapts to the student’s level of understanding to create a more personalized experience. But the downsides are numerous: (1) The tool deepens students’ reliance on technology and could disadvantage less tech-savvy learners, and it is questionable whether students who lean on AI in this way are actually learning to write. (2) ChatGPT does not cite any sources for the text it produces. (3) AI output is sometimes simply wrong, because the underlying algorithms “vacuum” up information from the Internet without being able to discern its veracity, and the tool’s ability to provide up-to-the-minute knowledge about college-level topics is doubtful.

It’s unlikely that this type of AI will replace human instructors anytime soon. The lack of human interaction is the antithesis of what face-to-face teaching and even distance learning offer, not least the instructor’s expertise and creativity: working with students and providing individualized feedback—something machines can’t replicate. Yet.

This topic is far from settled, but writing instructors who ignore the latest challenge to classroom teaching do so at their own peril.
