Recent reports show the capability of artificial intelligence (AI) to write academic essays.
According to an article in The Atlantic, “[t]he world of generative AI is progressing furiously.” Stephen Marche, the article’s author and “a former Shakespeare professor,” provided technology from the AI company Jasper as an example.
A press release from Jasper announcing its “$1.5 billion valuation” described its “generative artificial intelligence,” or AI that creates content. Jasper’s AI enables individuals and businesses “to break through writer’s block, create original art, and repackage content for format, language, and tone,” according to its press release.
The release continues: “With the [Chrome browser] extension, content creators who find themselves stuck can call up Jasper with a single click or keystroke and get contextual recommendations for original content whenever writer’s block strikes.”
Most of Marche’s article focuses on ChatGPT, the generative AI from the company OpenAI. ChatGPT responds to prompts “to answer followup questions, admit its mistakes, challenge incorrect premises, and reject inappropriate requests,” according to a description on OpenAI’s website.
ChatGPT wrote a graduate-level academic essay in response to a prompt from Mike Sharples, “Emeritus Professor of Educational Technology at The Open University, UK.”
In a blog post, Sharples pointed out mistakes in the AI’s response to his prompt about learning styles: ChatGPT “fashioned a plausible-looking but fake reference” and “invented a fictitious research study from Dunn and Dunn to critique learning styles.”
In response to such errors, OpenAI is soliciting feedback from users to improve ChatGPT, according to its website. As the technology improves, Sharples and others have pointed out the risks that generative AI like ChatGPT poses to academic integrity.
University of Pennsylvania Wharton School of Business professor Ethan Mollick told NPR that “‘AI has basically ruined homework.’”
“Plagiarism software will not detect essays written by Transformers, because the text is generated, not copied,” Sharples wrote. A transformer is an AI that can “‘learn’ how to do some task by training on existing data,” according to an article in Quanta Magazine.
ChatGPT joins other technologies that make it easier for students to plagiarize. Though instructors can use plagiarism detection platforms such as Turnitin, which scans essays for similarity to other written content, a blog post by the professional development company TeachThought describes these platforms’ shortcomings.
“For instance, students take someone else’s paper and before handing it in as their own, they change every English ‘o.’ [sic] ‘e,’ ‘a,’ and ‘c’ with Cyrillic letters ‘о’, ‘е’, ‘а’, and ‘с’ that look the same,” the blog post reads.
In addition to letter substitution, students can submit text as an image, hide extra characters in white-colored text, and, like generative AI, “make up fake references or state real references with fake page numbers.”
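The letter-substitution trick works because a Cyrillic “о” and a Latin “o” look identical on screen but are different Unicode code points, so a detector that compares raw strings sees no match. A minimal Python sketch (a hypothetical illustration, not the code of any product mentioned here) shows why naive comparison fails and how a detector could normalize known look-alikes before comparing:

```python
# Latin "cat" vs. a visually identical string using Cyrillic letters.
latin = "cat"
cyrillic = "\u0441\u0430t"  # Cyrillic "с" (U+0441) and "а" (U+0430), then Latin "t"

# Visually identical, but raw string comparison fails.
print(latin == cyrillic)  # False

# A detector could map known Cyrillic homoglyphs back to Latin first.
HOMOGLYPHS = {"о": "o", "е": "e", "а": "a", "с": "c"}
normalized = "".join(HOMOGLYPHS.get(ch, ch) for ch in cyrillic)
print(normalized == latin)  # True after normalization
```

This table covers only the four letters named in the blog post; real systems would need a far larger confusables mapping.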
Contract cheating offers another route around detection. The platform Fiverr hosts advertisements from contractors offering academic essay writing services; one advertisement says the contractor will deliver a “plagiarism-free essay” on topics including English, history, psychology, and ethics.
In his blog post, Sharples suggested how courses might respond to plagiarism and other issues with generative AI. “Each student writes a short story with an AI. The student writes the first paragraph, AI continues with the second, and so on. It’s a good way to explore possibilities and overcome writer’s block,” one suggestion reads.
“Students will employ AI to write assignments. Teachers will use AI to assess them. Nobody learns, nobody gains,” Sharples said. “If ever there were a time to rethink assessment, it’s now. Instead of educators trying to outwit AI Transformers, let’s harness them for learning.”
For Marche, generative AI poses a threat to humanities courses. “The essay, in particular the undergraduate essay, has been the center of humanistic pedagogy for generations,” he wrote in The Atlantic. “It is the way we teach children how to research, think, and write.”
Campus Reform has reported on the way in which technology and changing standards impact academic integrity and college students’ abilities to research, think, and write.
As the COVID-19 pandemic took standardized tests and college courses online, Campus Reform reported that the University of California-Berkeley saw a “400 percent increase of alleged academic misconduct” in 2020 compared to the previous year.
At Texas A&M University, an online finance class caught students using a study aid website while taking quizzes. An anonymous student wrote a letter placing “responsibility on the professor,” and another student told Campus Reform that he “placed part of the blame for the scandal on the administration—which, he alleged, failed to help students during the challenging semester.”
An analysis by Campus Reform News Editor Jared Gould concluded that, amidst “[d]eclining knowledge and academic rigor,” students are demanding easier classes. According to one survey, “the majority [of respondents] expressed that at least one class in their schedule was too difficult and that the university should force professors to make classes easier.”
In addition to taking courses with fewer pages of required reading, students cannot pass a basic citizenship test, according to an op-ed by Campus Reform Higher Education Fellow Nicholas Giordano that Gould referenced.
Gould wrote that “the standards keep getting worse and students’ tolerance for challenges keeps getting lower.”
Courtesy of Campus Reform.
From CNSNews.