The emergence of artificial intelligence chatbots like OpenAI’s ChatGPT in recent months has educators debating whether A.I. should be implemented in the classroom.
University of La Verne Acting Provost Roy Kwon said all industries, including higher education, are grappling with what it means to have A.I. chatbots in their lives.
“I think that although it’s going to be really hard, because rightfully so a lot of faculty members grappled with the question of things like plagiarism and original work, things of that nature,” Kwon said. “I think that higher ed. is going to have to learn to live alongside these technologies that are available to our students.”
A.I. chatbots are a form of generative A.I., systems capable of producing responses based on the input they are given. Chatbots use a transformer model to analyze large amounts of text and build statistical models from the information that is captured. Those models allow a generative A.I. system to predict the most likely next word and string words together into sentences to produce a response; generative A.I. can also create images, audio and video.
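To make the idea of “predicting the next word” concrete, here is a minimal sketch in Python. It is a toy bigram model, vastly simpler than the transformer models described above, but it shows the same basic move: count which words follow which in a body of text, then generate by repeatedly choosing the statistically most likely next word.

```python
from collections import Counter, defaultdict

# A tiny stand-in "corpus"; real chatbots are trained on billions of pages.
corpus = "the cat sat on the mat . the cat ate the fish .".split()

# Count how often each word follows each other word (a bigram model).
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def predict_next(word):
    # Return the word most frequently seen after `word` in the corpus.
    return counts[word].most_common(1)[0][0]

# Starting from "the", greedily extend the sentence word by word.
word, sentence = "the", ["the"]
for _ in range(3):
    word = predict_next(word)
    sentence.append(word)

print(" ".join(sentence))  # a statistically likely phrase from the corpus
```

A real transformer replaces these raw counts with a neural network that weighs an entire context window of text, but the output step is analogous: score the candidate next words and pick from the most probable ones.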
Yehia Mortagy, information technology and decision sciences professor and member of the Faculty Technology Committee, said A.I. research began with attempts to build a general problem solver and was followed by a belief that computers themselves were intelligent; once people realized there was no intelligence in computers, the field shifted to using computers to generate knowledge.
ChatGPT’s rise has brought attention to other available A.I. chatbots. Google is currently experimenting with its own chatbot, Bard. Unlike ChatGPT, which was trained on a dataset collected up to 2021, Bard is trained on a dataset that is constantly updated, giving it access to the latest information and research.
Microsoft’s search engine Bing, accessible through the Microsoft Edge browser, has an A.I. chatbot feature that can also answer questions in natural language, write poems and stories, and share ideas for projects.
“ChatGPT is a very powerful machine with unlimited storage for all practical purposes, and better understanding of statistical analysis, new methods and new ways of calculating things allow ChatGPT to accumulate pages and billions of pages of information, knowing the encyclopedia, reading all of these articles, all of the stuff is ready information,” Mortagy said. “Now I can ask a question and based on all of the things that they have, they can answer it.”
Concerns regarding the use of A.I. chatbots have led school districts and departments, including the New York City Department of Education, the Los Angeles Unified School District and Seattle Public Schools, to ban or limit the use of ChatGPT.
Bill Swartout, computer science research professor and co-director of the Center for Generative A.I. and Society at USC, said educators are taking two approaches to A.I. chatbots. One discourages students from using them, with students marked down on assignments if A.I. use is detected. The other looks for ways to use A.I. productively when educating students.
“I don’t think that an outright rejection of ChatGPT or A.I. is the way to go,” Kwon said. “In fact, I would say that there’s a lot of ways that we could creatively implement and incorporate A.I. into the classroom, and that’s across all disciplines, not just in computer science, but from the humanities, to the arts, to the sciences, that I think will be positive for everyone.”
Educational programs have begun to embed A.I. detectors to help educators determine what is A.I.-produced content. Turnitin.com, a plagiarism detection program, recently released an A.I. detection feature to help educators identify when A.I. chatbots have been used in their students’ submissions.
“In my profession as a professor it becomes really the question of not whether I should prevent ChatGPT from being used by my students,” Mortagy said. “But how do I change my presentation material, how I present them the modes and the modality in order to allow the student learning with the use of ChatGPT and to allow me to make the assessment with the existence of ChatGPT, not without it.”
He said he has not figured out how to implement A.I. into his classes and does not believe many people have either, but he thinks we are living in an exciting time full of changes.
Some universities have begun implementing rules, suggestions or restrictions on the use of A.I. in the classroom. The University of Central Florida has a page on its website describing three ways students and faculty members can handle A.I.: neutralize the software; teach ethics, integrity and career-related skills; and lean into the software’s abilities. Georgetown University has a similar page with suggestions on how faculty can talk to their students, design assignments and detect A.I.-produced content.
“Calculators were illegal to use when they first started,” Mortagy said. “When I went to college, you couldn’t use a calculator, that was illegal, but you can’t say that now. We have to start learning that you cannot stop progress.”
Samira Felix can be reached at firstname.lastname@example.org.
Samira Felix, a junior journalism major with a concentration in print-online journalism, is news editor for the Campus Times. She previously served as a staff writer.