ChatGPT is Not the Calculator: The Implications of Generative AI for Education and Writing

by Bonnie Eissner and Robert A. Scott

ChatGPT, the generative AI chatbot, has been likened to the calculator, the printing press, the word processor, and the internet in terms of its impact on writing and education. These technologies changed the way we teach. They eliminated some jobs. They changed our means of writing. But they did not replace the process of human thought and creativity. Chatbots like ChatGPT threaten to do this, and educators and writers should be both worried about the implications and proactive in addressing them.

“It is the biggest change to how we think about teaching writing and writing that I think we’ve ever had,” said Jane Rosenzweig, director of the Harvard College Writing Center, in a recent podcast about ChatGPT. Rosenzweig has written extensively about the implications of ChatGPT in essays in the Boston Globe and the Los Angeles Times and in her Writing Hacks newsletter. “Writing is hard because the process of getting something onto the page helps us figure out what we think — about a topic, a problem, or an idea,” she wrote in the Los Angeles Times. “If we turn to AI to do the writing, we’re not going to be doing the thinking either.”

She and other educators rightfully caution that schools should not ban chatbots. Doing so won’t stem their use. Instead, educators at all levels, and especially university professors, must grapple with the technology and its implications while preserving the rigor and rewards of teaching students to distill and articulate their ideas through writing.

Other college faculty members who have incorporated generative AI into their teaching have shared their experiences and suggestions. Jason Tougaw, director of the M.F.A. program in creative writing and literary translation at CUNY’s Queens College, taught a new course, “AI and the Near Future of Writing,” this past fall. He and the writers in the class experimented with using ChatGPT and Google Bard as collaborators, helping them think about their writing, including its themes, ideas, and styles.

In Tougaw’s estimation, the bots generated truly original, insightful ideas about 15% of the time. “And 85% of the time, we got stuff that was really formulaic and predictable,” he said in a recent interview.

He advises fellow faculty to explicitly address the many ethical issues of generative AI technology, from plagiarism to its exploitative labor practices. He also found that a sense of curiosity and play helped him and his students. That approach may work well with others who are already invested in writing and skilled at it.

For students who are less experienced in writing and less enamored of it, other tactics may preserve the experience of writing and of articulating thought. These include inviting students to grade essays written by ChatGPT. One professor who did this found the exercise sharpened her students’ critical reasoning skills. They were unimpressed by the chatbot-generated work and recognized it as sterile.

To avoid overreliance on AI in writing, teachers and professors can demand original work, the documentation of references, and an explanation of how information is assembled.

Educators and parents share a responsibility to protect students against the loss of human judgment through subservience to machines. That is the imperative of education. To ignore this responsibility is to invite unintended consequences that too few seem to care about.

The “law of unintended consequences” states that the actions of people always have effects that are unanticipated. Social scientists, especially economists, consider these effects regularly, but others, from politicians to the general public, are apt to ignore them at society’s peril.

The American sociologist Robert Merton identified five sources of unintended consequences, including “ignorance” and “error.” While complaints about these unintended consequences are often leveled at governmental programs, they also occur in higher education, private enterprise, and our personal lives. All result from a failure to think clearly.

Writing is thinking with a pen or keyboard. To teach writing is to teach thinking. Writers decide on the problem or goal and how the narrative will achieve its purpose.

They first choose the voice appropriate to the task. Is it to express joy or anger? To persuade or garner sympathy? Do they want to advocate or titillate? Explain or rebut? Each goal has a different style and tone, and each should match the writer’s personality unless they want to mimic someone else.

Writers make meaning from disparate facts, perspectives, and ideas. Even for fiction writers, this process involves sleuthing facts, assessing the validity of the information found, and weaving it together in creative ways that suit the mood and purpose.

These are the tasks of humans who must make judgments about purpose and material. These tasks require critical thinking and the skills and abilities of analysis and argument. AI and other tools can be of assistance, but they cannot be a substitute if our goals are the transformational benefits of education.
