UNF's #1 Student-Run News Source

UNF Spinnaker

Students using generative AI to write essays and solve problems: Should we be worried?

Generative AI has entered the mainstream in recent years, prompting reactions ranging from cautious fascination to dystopian fear. These ever-improving programs are being used for various creative and academic purposes.

While AI programs that generate digital art, like DALL-E, have raised questions of ethics and plagiarism, one of the most well-known AI models currently in use is OpenAI's ChatGPT.

According to the Pew Research Center, among the two-thirds of teenagers aware of ChatGPT, 19% use it for schoolwork and 57% think using the software to write an essay is unacceptable.

These numbers shift significantly for mathematics and research: 39% and 69% of teens, respectively, think it's acceptable to use the software for those purposes.

ChatGPT is a large language model chatbot that OpenAI claims can “help with writing, learning, brainstorming and more.” (Madelyn Schneider)

Vaughan James is a science communication specialist at the University of Florida’s Biodiversity Institute with experience teaching students. He views this technology as a double-edged sword.

“I do think [ChatGPT is] useful for rote writing tasks,” James said. “A lot of writing is formulaic. These styles lend themselves well to AI intervention, at least in terms of structure.”

“[But] content is a whole other story,” he said. 

To James, while these AI tools can aid in some tedious academic tasks, checking AI-generated content for accuracy is still incredibly important.

“While I think that it’s fine for people to use the tools at their disposal to do the work they need to do, that only really applies if the tools work correctly and consistently,” he said. “I’m not at all convinced that ChatGPT does either.”

These concerns are not unfounded, especially since ChatGPT tends to generate false information, further complicating its regulation. 

“Language models don’t provide facts; they provide language,” James said. “The consequences are people believing they’re getting facts and just trusting what they read—unproductive at best and dangerous at worst.”  

Josh Gellers, a University of North Florida professor and member of its Generative AI Working Group, believes that using AI in academia can have benefits and risks.

“It’s a massive leap forward in technology,” Gellers said.

However, he said this new software comes with significant drawbacks. ChatGPT is an example of a large language model, a program trained on vast amounts of already-published internet text, which it analyzes to predict likely responses.

This method of gathering and aggregating data has fueled the ethics and plagiarism debates dominating the current conversation around AI.

“[These AIs] take a lot of data, and they use parameters [to] try and establish connections,” Gellers said. “It predicts what it thinks you’re looking for.” 

However, because of how these AIs gather information, the text they generate cannot be considered the work of the person using the program.

“It’s not the work of the student,” Gellers said.

Creative fields like writing, art and music production may be the most controversial areas to use AI.

For 25-year-old Noah Studeman, a Jacksonville University graduate who majored in theater, AI models like ChatGPT can be helpful tools in the creative process. But they can also be a crutch.

“In terms of using [ChatGPT] for academic and creative work, it’s good to help,” Studeman said. “I think it can be helpful, but you have to make sure that you don’t just use the AI model word-for-word. Otherwise, you’re not putting in any actual effort yourself.”

Despite the controversy, Studeman thinks AI can be helpful in the creative industry if used carefully and in moderation. 

“I think AI is great for giving you ideas and maybe helping with world-building,” he said. “[But] it’s easier to not use it at all and do [the work] yourself.”

To Studeman, human nuance is far more critical to creative fields than AI programs with “no capability of understanding emotion.”

No formal rules regulate the use of generative AI at UNF, but there are resources for professors, such as online guides to “ChatGPT-proof” assignments and tools to help detect AI.

___

For more information or news tips, or if you see an error in this story or have any compliments or concerns, contact [email protected].

About the Contributors
Thomas Herrold, Reporter Intern
Madelyn Schneider, News Editor
