Artificial Intelligence sparks discussion, debate


California Lutheran University currently has no campuswide stance on the use of Artificial Intelligence by its students and faculty. However, individual departments have adopted their own stances specific to their classes.

AI is a tool that has seen massive and rapid growth over the past few years. Dean of the College of Arts and Sciences Tim Hengst said many academic institutions don’t quite know how to deal with it.

“We’ve done a lot of work on AI and generative Artificial Intelligence. As a university, we don’t have a set policy,” Hengst said.

The School of Management has created its own policy, which states, “We believe that AI tools can be used to support student learning, but that they should not replace or displace that learning.”

The policy also states that students can use AI for assignments, but are instructed to do so in specific ways as determined by the professor. Those ways vary from department to department.

“We’ve kind of relied on each department to define how they want to use it. I know the English Department has expressed the most concerns about it because they have, you know, they’re focused on writing intensive assignments,” Hengst said.

Assistant Professor of Finance John Garcia said he has been able to create and train a chatbot for his classes that students can ask questions and get immediate answers from, rather than waiting for him to respond. He also said AI has been useful for his students when generating code or organizing financial data.

“Some of my machine learning classes, they’ll use it to generate some code and then kind of build off that and another, some of my finance classes will use it to basically summarize kind of the key risk from a 10-K,” Garcia said.

Garcia also said the School of Management’s policy encourages students to use AI but does not require it, and it is up to the professor to decide the best ways to integrate AI as a tool in their classroom.

“We can’t ignore it. You wanna get students’ guidance as to what’s fair and what’s not fair, so that you can feel comfortable using it in the right context, because in the real world, you’re gonna use it,” Garcia said.

Garcia also said classes focused specifically on how to interact with particular AI tools would be interesting to develop.

“I very much like to think through what an AI major would look like, like an AI in Society [major],” Professor of Political Science Jose Marichal said.

Marichal said that in order to have a major focused specifically on AI, it would need to include both how to use it and how to critique it. One critique that Garcia, Marichal and Hengst all raised is that AI is unable to think critically.

“Of course, ChatGPT knows how to do some basic things, but that doesn’t mean you don’t have to learn,” Garcia said. “There’s some things that you want to learn so you can build the ability to critically think and connect those to do something that’s different, something that enables you to think critically.”

Garcia said the controversy surrounding AI also affects instructors because it is becoming very hard to tell whether certain work was done by the student or by an AI, but it is not completely undetectable.

“If you’re using a free version, it’s clear if you’ve used AI without even applying a tool to check, but if you’re using more of the paid tools like GPT-4, Gemini Advanced, then it’s much more difficult, but there’s still some similarities,” Garcia said.

Garcia said the smaller class sizes at Cal Lutheran give professors the opportunity to get to know their students better, which in turn helps them recognize whether using an AI platform for an assignment would be out of character for a particular student.

“That’s one of the good things about having small class sizes,” Garcia said. “You’re able to have, you know, better understanding of your students and that enables you to more effectively identify potentially if AI is being used for different things.”




