Artificial Intelligence: Inside the Classroom and Beyond

A student types into ChatGPT: “Write a 1500-word essay on major themes in Franz Kafka’s The Metamorphosis.” The student presses “enter” and ChatGPT gets to work. Its underlying large language model, a “Generative Pre-trained Transformer” (GPT) trained through self-supervised learning on vast amounts of text, delivers five double-spaced pages exploring alienation and dehumanization in The Metamorphosis.

In meeting rooms across the country, educators are attempting to identify and limit the use of Artificial Intelligence (AI). In particular, tools like ChatGPT that can write essays pose a plagiarism risk. Educators are adding punitive AI clauses to the plagiarism statements on their syllabi. One professor at Texas A&M-Commerce failed more than half of his class after ChatGPT falsely claimed it wrote their papers, and the university withheld their diplomas.1

Academic integrity and dishonesty aside, one fear sits at the center of this issue: students are forfeiting engagement with the challenging ideas that build critical thinking skills. Major goals of higher education are that students deepen their understanding of humanity, challenge their own opinions, and broaden their perspectives on the world around them.

Consequently, educators believe that AI renders thinking exercises moot. AI can draft an initial response, support claims with direct quotes, and supply the argument for a persuasive essay. Students delegate the thinking to the technology rather than doing it themselves. This predicament highlights one ethical consideration of the technology: it is relatively uncharted territory. While some welcome AI’s ambitious and growing capabilities, others issue more foreboding warnings. What began as a plagiarism issue now represents an existential question about societal collapse.

In the Statement on AI Risk, signed by major innovators in the field, scientists warn in a single sentence: “Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.”2

What does this mean for education?

In The Chronicle of Higher Education, Joseph M. Keegin, a doctoral student in philosophy at Tulane University, refers to ChatGPT as a “plagiarism machine.”3 Students at all levels are submitting work written by AI. The initial alarm, however, obscures a more complicated reality: the proliferation of ChatGPT and the post-COVID climate on university campuses are connected.

In classrooms, students are facing high levels of disconnection. Dubbed the “Covid Generation,” high school and university students are struggling on many campuses. Students are “defeated,” “exhausted,” and “overwhelmed.” One report on a university professor explains that “twenty to 30 percent of her students do not show up for a class or complete any of the assignments,” and that “if she asks questions on what she’s been talking about, they don’t have any idea. On tests, they struggle to recall basic information.”4

Educators and higher education professionals have watched students struggle as they emerge from COVID lockdowns. Many students are unaccounted for, having never returned to campus. For those who did return, the transition back is fraught with mental health challenges and executive-function difficulties. In that context, reliance on AI is compelling. Students use the tool to complete work that is difficult to start or hard to break into smaller goals. With compromised focus and looming deadlines, students turn to ChatGPT to get started and to finish assignments.

However, while some students in one survey saw their GPA increase (12 percent), the majority reported no improvement.5 This suggests that AI is not simply a helpful utility and points instead to the more complex factors driving its use and misuse among students.

Ultimately, students are struggling, and AI is a bandage on a larger wound that campuses are only beginning to address. AI should not be a replacement for thinking. Instead, educators can leverage AI as a tool to deepen learning experiences for disconnected and struggling students. AI technology can support students with learning disabilities or executive-function concerns by scaffolding the learning process. At LAS, one resource coaches use with students is Taskade, an application powered by artificial intelligence that “instantly generates task lists, mind maps, meeting agendas, and custom workflows.” For example, students can use AI to summarize long, complex texts and then engage critically with the material in meaningful ways.

Many educators lack knowledge of AI’s capabilities because of its relative novelty and ever-changing nature. A concentrated effort to understand it is essential to responding to the current situation in high school and college classrooms.

AI is undoubtedly a tool that will remain in use in some capacity, so understanding how it works and how to leverage it as a teaching tool is vital.

Works Cited:

  1. “A Texas professor failed more than half of his class after ChatGPT falsely claimed it wrote their papers,” Business Insider. https://www.businessinsider.com/professor-fails-students-after-chatgpt-falsely-said-it-wrote-papers-2023-5
  2. “Statement on AI Risk,” Center for AI Safety. https://www.safe.ai/statement-on-ai-risk
  3. “ChatGPT Is a Plagiarism Machine,” The Chronicle of Higher Education. https://www.chronicle.com/article/chatgpt-is-a-plagiarism-machine

 

Get ahead! LAS Educational Coaches™ provide structure, support and accountability.
