Generative artificial intelligence (AI) has caused significant disruption in the workforce and education. Higher education, in particular, is faced with tough decisions about how to navigate this new era. Questions abound from individual faculty members, such as whether to use AI detectors and how to incorporate AI into assignments. Colleges and universities are also seeking guidance on how to shape institutional policies around this rapidly evolving technology.
To gain insight into these issues, I recently spoke with Jose Antonio Bowen and C. Edward Watson, the authors of “Teaching with AI.” While AI is here to stay, the good news is that adjusting to these changes largely comes down to curricular and pedagogical modifications, they said — areas in which faculty already excel. They reassured me that this is a challenge that faculty can handle. Let’s take a look at some of their guidance on policies, AI detectors, and specific ways faculty can adapt their teaching in this new era.
Creating Policies
“[Keeping AI policies up to date] is a really significant challenge because the tools are evolving, their capabilities are evolving, and student practices are evolving,” said Watson. “One of my core recommendations to campuses is to pull together a committee — do a shared governance approach — to develop an AI policy. However, when you complete the first draft of that policy, do not disband that committee.”
He advises institutions to implement the policy, see how it functions “in the wild,” and regroup to troubleshoot and revise the policy as needed. Continuing to do this work is critical since AI is evolving so quickly. New tools and changes to existing tools are released daily. While Bowen and Watson caution that it is virtually impossible to keep up, they say it is imperative to try.
To help institutions stay abreast of changes and new tools, they maintain a list of AI models and tools on their website.
“You just have to stay in the game and keep talking to students,” Bowen said. “Recognize that this is quickly moving. We’re still in the Ask Jeeves phase if you like. It will settle down at some point, maybe, but it hasn’t yet.”
Considering Whether To Use AI Detectors
Just as there are numerous generative AI tools, AI detectors are also now widely available.
When asked about these platforms, Watson said, “I think that they’re largely a bad idea.”
Bowen echoed those sentiments, saying, “There are a lot of things we recommend that are better than AI detectors because that’s a punitive backend.” Instead, as discussed in “Teaching with AI,” they recommend designing assignments and setting parameters on when and how AI can be used.
They also warned that AI detectors are not smoking guns the way plagiarism detectors can be. AI detectors evaluate a text’s patterns and sentence structures to estimate whether it was generated by an AI writing tool; they don’t provide definitive proof that a student used AI.
“Given some of the data out there, hundreds of thousands of students have been flagged erroneously for using AI,” said Watson. He worries about the ramifications of false accusations: a student who is already struggling in college and is then falsely accused might drop out of school or, worse, harm themselves.
“I feel like what can happen as a result of the negatives far outweighs — in my mind — the positives of being able to say, ‘I caught that one student cheating,'” he said.
Bowen added that no one should be allowed to use an AI detector without first completing training. “If you don’t understand what it does, how it works, and what harm you might do, you don’t get to use it,” he said.
Still concerned about cheating? Here’s what to consider instead.
“Students cheat because they have opportunity, they have time pressure, and they don’t have motivation — you didn’t explain to them why this was important,” Bowen explained. “People don’t cheat at the gym because you go to the gym to do the work — because fitness coaches are better than we are at explaining why you have to do the push-ups.”
To reduce the chances of students cheating, one strategy worth testing is a flexible policy on extensions and late work. If a student realizes that an assignment they haven’t started is due tomorrow and you have a strict late work policy, they’ll likely pivot to the web or AI.
“What if students were given the ability to have a two-day extension on a couple of assignments?” Watson said. “No questions asked and no grade penalty. The student would then choose to take that rather than cheat. And then if they take the extra couple of days, that means they’re doing the work, which means they’re actually learning, which has the dual benefit of improving learning outcomes within a class and also diminishing the chances that students will cheat.”
Also, make sure you’re motivating students to put in the work. “You should be explaining to students why the learning process is important and not just the product, why this is relevant, what you’re teaching them that matters, and the why of your policy,” said Bowen.
Designing Assignments in This Age
Part of overcoming fears about AI is shifting our mindset. Instead of worrying about cheating and AI replacing all human effort, we need to think about how AI can assist us and our students.
Employers may now expect students to be able to use AI to improve their work (or do a first draft of something that they can then improve). So, instead of banning AI entirely, Bowen and Watson recommend designing assignments that use AI for part of the work but not all of it.
“There are lots of assignments that you can have students do where either they’re using AI as a tool to do better work, or they’re making AI work better,” Bowen explained. “We detail a lot of these in the book and also on the website.”
Watson shared an example for writing assignments. You may want to have students use AI in the brainstorming phase for a paper, write a draft on their own, and then use AI to receive feedback. The key is to “be explicit about why you can’t use it for this part of the assignment or for the assignment at all” and “clearly communicate why” to your students, he said.
Staying Positive
The rapid expansion of AI is certainly disrupting higher education. It’s a change we can’t afford to ignore, but Bowen and Watson remind faculty to find the positive.
“This isn’t a flash in the pan, this isn’t going to go away,” Watson admitted. “This is a significant paradigm shift that’s impacting all of our disciplines and the way that we teach, but ultimately, this is a curriculum challenge and a pedagogical challenge, right? And we’re really good at handling curricular and pedagogical change in higher ed. This is our bread and butter.”
He also said AI has real benefits for faculty, such as helping to revise course assignments, draft lecture materials, or even assist with grading. He and Bowen advise faculty members to take advantage of what AI has to offer them. Find something that AI can do for you and reclaim some of your time for other tasks.
The Bottom Line
Bowen encourages faculty to be involved and “stay in the game” as AI continues to evolve. “I can’t change the technology,” he said. “I don’t have that kind of power. But if we are not going to teach ethics, who is? If we are not going to teach people how to use this tool responsibly, who is?”
AI is undoubtedly changing education, work, and the world, but exactly how is unclear. Bowen likened it to the automatic transmission.
“Most of us [today] think driving an automatic transmission is not a moral failing, right?” he said. “We allow that cognitive offloading, but we still are driving the car. And the truth is, we don’t know what kind of cognitive artifact AI is going to be. It probably will involve some cognitive offloading. Like the automatic transmission, it might be okay, but we need to be involved in that — in thinking about it and helping our students think about it — because it’s going to make a difference for the future of human creativity and human being.”