AI poses possible issues for learning, equity

By Rose Bialk

Along with the usual syllabus run-through and attendance policies, one of the first topics raised in my classes this semester was the question of to what extent artificial intelligence (AI), specifically ChatGPT, should be allowed in the classroom. Because the technology is so new, none of my professors had set policies on how they thought it should be used. Rather, they opened the floor for student opinions. 

I heard a range of thoughts from my peers: that ChatGPT is useful and we should use it unchecked at our own discretion; that we should not use it to write content but should be allowed to use it to accumulate sources for research; and that we should not use it in any capacity. As both a mild technophobe and an English major with concerns about jobs in the humanities being displaced by AI, I align myself closest to the last stance. 

I have never used ChatGPT and I do not intend to anytime soon. This is partially due to a desire not to have that firsthand realization of just how much it is capable of—I find reading news about AI disconcerting enough, and reminiscent of the uncanny valley.

But aside from my automatic fear response surrounding all things AI, I do also logically believe that the use of ChatGPT in the classroom is not a net positive. First, it is widely recognized that ChatGPT poses a plagiarism issue. Many AI detection programs, such as GPTZero and the AI Text Classifier, have gained traction, claiming to be able to determine whether students are using AI to write essays and responses. 

These fail-safes, however, are not as foolproof as we would like to think. There are entire websites and Reddit threads devoted to the best strategies for getting around AI detection software. 

Not only does using ChatGPT to do your work for you inhibit your own education, it also devalues the work that everyone else puts in by doing it themselves. It reduces motivation to treat the assignment as an actual opportunity to learn, and turns it into just another meaningless hoop to jump through on the way to getting a degree. 

“I think ChatGPT is a crutch that will prevent students from actually needing to learn content. Although I can see its use for small tasks, like how to email a teacher with questions, it also prevents students from developing the soft skills that completing those small tasks allows,” an anonymous K-12 teacher said in a survey from the National Education Association.

Simply banning ChatGPT is not the answer, and my professors recognize this. Cheating existed long before AI, and the people who want to cheat will always find a way. Restructuring curricula to prioritize critical thinking and place less of an emphasis on regurgitation and summary is an important adaptation. Many educators are working hard to come up with more imaginative prompts that students will have a harder time feeding to ChatGPT. 

Yet, this poses another challenge: In a system where teachers find themselves sidelined, undervalued and burned out, how do we find the resources and motivation to build increasingly inventive classroom environments? 

Some school districts are resorting to replacing human teachers with digital ones, such as in Mississippi, where the teacher shortage is one of the country’s worst. 

The implementation of digital teachers to fill gaps inhibits the social benefits of school. More reliance on AI decreases interactions between teachers and students, as well as among students. 

“Children may also begin turning to these conversational AI systems in place of their friends,” notes an article from Education Next. 

Now that we have all been through the somewhat hollow experience of attending class on Zoom, we should value more than ever the socially interconnected environment of an in-person classroom. From an academic standpoint, educators have a better sense of how to help individual students when they communicate with them. AI-dominated learning fosters not only loneliness, but also a sense of anonymity in the classroom.

The human teacher shortage is not only an issue of educational quality, but of equity. Despite the advantages of personalized learning, AI can limit the challenges that create the most lasting learning, and it can promote bias that disproportionately impacts students of color.

“Researchers have found that schools that serve high percentages of minority students and students in poverty have more difficulty finding and retaining qualified educators than (w)hiter, more affluent schools,” Moriah Balingit wrote in a Washington Post article.

Furthering this inequity is the fact that the AI systems put in place to “teach” at these schools are designed primarily by white developers—a skew that allows implicit bias to be encoded into AI. An article from Brookings titled “AI is coming to schools, and if we’re not careful, so will its biases” compares this situation to similar issues faced in courtrooms.

“We’ve already seen the risks of using biased algorithms in the courtroom: Software used to forecast the risk of reoffending incorrectly marks Black defendants as future criminals at twice the rate of white defendants,” the article states.

The authors also point to the precedent of standardized testing as a sign of how failing to actively consider the role of systemic racism can create vastly unfair ways of measuring success. AI has been used not only as instructional material, but also as a way to assess how much students know and which course levels they should be placed in, much like standardized tests.

“(T)hat sorting can be fraught with pernicious consequences if the algorithms don’t consider students’ nuanced experiences, trapping low income and minority students in low-achievement tracks, where they face worse instruction and reduced expectations,” the article states. 

Another issue associated with using AI in the classroom is the privacy risk. Generative AI technology allows the information we feed it to be stored and analyzed, creating opportunities for identity theft. Using AI to analyze grades also preserves students' academic records indefinitely, turning every mistake and opinion into a piece of data that follows them around. While this might not be any different from the data that our phones store about us, it is an expansion into a new environment: the classroom.

However, it is worth acknowledging that AI resources have also alleviated some of the draining workload that teachers struggle with. Being able to feed simpler assignments and tests to AI to grade takes a burdensome and time-consuming weight off teachers' shoulders, something that could potentially entice more people to take teaching jobs.

AI grading also allows students to receive instant feedback on assignments. However, this feedback will be less individualized and in-depth. Every issue regarding AI is multifaceted, and requires a weighing of priorities that can vary from teacher to teacher and student to student. 

Are we willing to sacrifice privacy for personalized learning tracks? Are we willing to embrace new technology at the cost of academic integrity and social inclusion? Or should we resist the direction in which the world is moving and risk being left behind in obsolescence?

I think that if the role of AI in the classroom is going to be dominant, it will need much more privacy regulation and intentional forethought to mitigate bias. I have heard a comparison that the legal gray area around AI is much like the Wild West; we are making the rules up as we go. As usual, the people making the rules are the ones who stand to profit from them. 

As for the AI that we face in our own LC classes, I think that ChatGPT should be used sparingly and only when students and professors are on the same page about how to use it. 

Utilizing it creatively can be both efficient and useful, but I am wary of how quickly it can monopolize our motivation to do anything on our own. 

This is why I appreciate the class discussions that acknowledge the unusual circumstances we are in, so that we can make case-by-case agreements as a class. 

Ignoring it and pretending that students are not going to crank out that late-night essay with a few prompts to ChatGPT is not only naive, but it also changes what is expected of students. 

I am grateful to have the privilege of getting a college education. If AI enhances this education, I will adapt to the changing times. But all too often, it feels to me like AI streamlines our education, taking away the crucial academic challenges that push us toward success. Keeping a critical eye on this rapidly changing educational landscape is more important than ever.

