How should AI be used in college classrooms?
By Anna Kezar
Since ChatGPT launched in November 2022, higher education has faced mixed reactions to the growing integration of artificial intelligence.
Types of AI and how it is used
AI has gained wide usage in a short amount of time, according to a 2024 meta-analysis from the Journal of Educational Computing Research. Generative AI programs such as ChatGPT, Google’s Gemini and Microsoft Copilot are frequently used in classrooms.
In a college setting, students mainly use AI for idea creation, information gathering and writing revision because it provides convenience and increases efficiency, according to a 2024 study on student perspectives of AI in education from the American Psychological Association.
In the study, 54% of the 569 college students surveyed reported that they hadn’t used AI programs. One possible reason: at the end of 2023, 18% of higher education instructors had “highly restricted or banned” the use of AI, according to a study by Transylvania University, a leading institution in AI education research.
Implications for student learning and career preparation
AI can enhance learning as it promotes critical thinking, engagement, motivation and self-efficacy, according to a 2024 meta-analysis on whether AI improves academic achievement. AI can enhance creativity and problem-solving as students learn to collaborate using its tools.
“Gen-AI was effective in improving college students’ academic achievement at a medium effect size,” according to a 2024 meta-analysis.
Besides the benefits to classroom learning, teaching AI prepares the next generation for a work environment that increasingly uses AI, according to Courtney Thrasher, education instructor and AI policy committee member at Grace College in Winona Lake, Ind.
“We have the wrong fear,” Thrasher said. “Jobs are not going to be taken away by AI, people who use AI are going to take the jobs.”
Everyone will have AI in their job; the only question is to what extent. Because of this, instructors should be teaching students how to best use AI within their discipline, according to Thrasher.
Stephanie Fernhaber, professor of entrepreneurship and innovation at Butler University in Indianapolis, asked her class of first-year students whether they used AI. Three out of 25 raised their hands. When she asked her MBA class the same question, 90% said they had.
“People in the business world are using it and if we don’t prepare our business students to use it, we're not equipping them,” Fernhaber said.
Teaching students how AI functions and how to properly attribute it is how instructors and institutions can ensure ethical use, according to Fernhaber.
This includes understanding that information entered into a model may be used to train it in the future, as well as the implications of its internal algorithmic biases.
“The problem is they are using it but they don’t really understand it,” Fernhaber said.
Potential drawbacks
Overly frequent use of AI can lead to dependence on the tool, which can stunt academic and intellectual growth and erode critical thinking skills, according to a 2024 study.
“You still need a brain,” Fernhaber said. “You still need to leverage your creativeness and your thinking … but it can help you be more efficient.”
Completely AI-based learning is not effective in the long term because students become reliant on the technology, according to a 2024 study.
“Search engines, AI and databases can help you find an answer and even help you learn, but to fully grasp a new subject means that your mind is involved with the learning process,” Tonya Fawcett, director of library sciences at Grace College, said.
Objections to AI use in academics include concerns about accuracy. AI tends to “forgo accuracy for the sake of precision,” according to a 2024 study. This makes it crucial to verify AI-generated information with other sources.
AI in higher education poses threats to academic integrity. Without direct observation, students can use AI on any part of an assignment, making cheating more accessible and harder to detect, according to a 2024 study.
Methods for directing AI usage
AI has been effectively embedded into courses by encouraging students to use AI tools throughout the brainstorming, drafting, revising and editing process, according to sample lesson plans collected by AI at Transy.
AI is most effective in medium-sized classes of 20 to 40 students, where collaboration is easier and instructors can better monitor usage, according to a 2024 meta-analysis.
Strategically tailoring AI usage to each discipline increases learning, according to a 2024 meta-analysis. In the natural sciences AI use is limited because of the emphasis on hands-on experience. In the humanities, AI usage may be higher as the tools help revise essays, evaluate different perspectives and enhance research.
One of the main barriers to successful AI implementation in courses is the instructor’s lack of AI education.
“It would take an institutional shift. It would take faculty willing to get basically trained in how to use it and revising their courses and their assignments,” Thrasher said. “But that’s probably where we need to head.”
Of course, this assumes instructors have the time, energy and motivation to learn, which some do not for a variety of reasons, according to Fernhaber.
“Professors have to learn too, and change is hard when you don’t have the time or motivation to but you’re kind of forced to,” Fernhaber said.
How AI policies are established
In light of the concerns and benefits, there is a growing need for institutional AI policies.
Butler University’s Generative AI faculty framework includes a disclosure required on all syllabi. Its basic principles require students to state how they used AI, work within the permission granted by the instructor and cite AI.
California State University defines academic dishonesty with AI as “using artificial intelligence (AI)-generated material in an assignment or exam without prior instructor approval.”
In a survey of 569 college students, 54.42% agreed that “any policies governing the use of ChatGPT for students should be left up to individual courses and instructors rather than mandated by the university,” according to a 2024 study.
A general institutional policy should defer to departmental policies, which in turn give final say to each instructor, according to Thrasher.
Within any policy there should be clear parameters for what AI can and cannot contribute to the learning process. An AI scale that rates how AI can be used in each assignment within a course could be added to syllabi. In any academic setting, AI should aid the educational outcomes and individual learning of the students, according to AI at Transy.
One potential barrier to AI integration into courses is that higher education institutions often embed plagiarism-detection programs in their online course platforms to flag suspected plagiarism and AI-written material.
However, these programs, such as Turnitin, are often unreliable because they only match text rather than detect dishonest practice, according to a 2019 study. This can lead to misplaced accusations of dishonesty that harm both the student and the institution.
In place of these programs, professors should look for the signs of AI misuse in patterns of language, incorrectly cited sources, forced or unnatural tone and factual errors, according to AI at Transy.