Empowering Student Learning: Navigating Artificial Intelligence in the College Classroom – Faculty Focus
Like many of our colleagues, we felt a mild panic at the emergence of ChatGPT and other generative artificial intelligence sites. Out of that panic, a field of practical, practitioner-based research has begun to emerge. Some have called for calm, patient reflection (Naidu 2023, para. 10). Others argue that AI, in various forms, has been used to elevate students’ performance in the classroom and, in some cases, to overcome barriers to learning (Shippee 2020, 20). AI can also serve as a classroom tool to reframe learning experiences and as a “partner” in learning (Benson 2023, 30). While mindful of concerns about plagiarism, equity, and access, some have argued that educators must not only accept AI in the classroom but also help their students use it effectively as part of their digital literacy (Bender 2024, 9). The Association for Supervision and Curriculum Development (ASCD) and the International Society for Technology in Education (ISTE) are expanding the idea of partnership by developing an AI coach specifically for educators, called StretchAI (https://iste.org/ai), to create more inclusive learning experiences.
As educators, we believe in the importance of fostering innovation and enhancing learning experiences. However, as the conversations in the faculty offices turned to AI and the students’ use of the tool, we were worried students in our undergraduate courses would use AI to generate all assignments and their learning would essentially cease. To begin to understand how the students perceived and worked with AI, we brought our concerns to the students in our undergraduate courses.
Using a series of in-class conversations and four anonymous Google Forms surveys, we asked students about their experiences and perceptions of using AI to prepare for class. We were surprised by their comments. Most students shared that they would rather do their own work. Students regularly shared that they were not interested in using a generative AI site to complete their work: they were not ready to trust AI with their learning and, more specifically, their course grades. Students were also concerned about punitive consequences for using AI.
They were instead interested in using generative AI sites (mostly ChatGPT) to improve their work. Students most noted the use of AI to refine their thinking and to generate ideas. They reported using the output from the AI site as a “push” for moving their task forward. The AI output was the rough draft students would tailor to meet the requirements of the specific task. Students shared they wanted to do their own work and believed AI could improve their ideas. Students also knew they could not blindly trust a generative AI output, which, like an overconfident classmate, could spread incorrect information.
Considering students’ candid sharing about how they understood and used AI tools to complete coursework, we made a choice to trust our students. We added the following policy post to the course LMS:
Students are expected to generate and submit original, personally composed tasks for each assignment. The use of Artificial Intelligence to create and submit work under the guise of original work is not acceptable. Students feeling the need to consult an Artificial Intelligence site to generate ideas or to suggest alternative wording may do so by emailing the professor before submitting the work and explaining how and why the AI will be used. The student will then insert a footer to the assignment containing a statement noting the degree to which AI was used in the completion of the task.
We want students to feel free to explore how AI can suggest multiple levels of complexity for a given topic and do so efficiently. We also acknowledge that improperly integrating AI into coursework could undermine critical thinking, creativity, and independent inquiry by providing students with ready-made answers or solutions. Working with elementary education majors, we stress that ideas and information generated by AI programs must be balanced with and supported by meaningful learning experiences created from teachers’ knowledge of the students enrolled in the class. Weighing the benefits of AI-enabled insights against the preservation of pedagogical principles that prioritize active learning and intellectual engagement is essential to creating a balanced learning experience.
Integrating AI into assignments can encourage students to develop crucial skills for the digital age, including data literacy, critical thinking, and problem-solving (Bender 2024, 10; van den Berg and du Plessis 2023, 9). By engaging with AI technologies, students can learn to evaluate information sources, discern patterns, and synthesize findings, which can prepare them for the demands of an increasingly data-driven society. Rather than viewing AI as a threat to intellectual rigor, educators could recognize its potential to enhance students’ analytical capabilities and foster a deeper understanding of complex subjects.
To support students in effectively utilizing AI, educators can adopt several strategies.
Strategy 1: Evaluation
Educators can provide guidance on selecting reliable AI tools and interpreting their outputs. AI outputs are not infallible and do produce “hallucinations”: incorrect information presented as fact. AI-generated content may also include irrelevant information because deep learning models can produce outcomes that initially appear coherent but lack depth (Cano et al. 2023). By leading in-class, whole-group reviews of outputs generated by AI, educators can help students navigate the landscape of AI resources responsibly. Educators can offer further opportunities for hands-on experimentation with AI technologies through workshops, helping students develop confidence in deciding what to use and what to discard from AI outputs.
Strategy 2: Balance
While AI can undoubtedly enhance the learning experience, it is crucial to strike a balance between leveraging AI’s capabilities and fostering students’ independent learning and critical thinking skills. One way to achieve this balance is by designing assignments that require students not only to utilize AI tools but also to engage in reflective analysis and interpretation of the results. By encouraging students to interrogate the outputs generated by AI algorithms, educators can cultivate a culture of inquiry and healthy skepticism, fostering intellectual curiosity.
Strategy 3: Collaborate
Incorporating collaborative elements into assignments can help mitigate the risk of overreliance on AI. By encouraging students to work in teams to formulate questions to input into the generative AI site and to interpret results collaboratively, educators can foster a sense of collective responsibility and peer accountability. The shared responsibility promotes a holistic approach to learning that encompasses both AI-enabled insights and human judgment while supporting students as they explore how to effectively and fairly use AI.
Employing these strategies (and others) in classrooms can be beneficial to students in fostering innovation and enhancing learning experiences.
Resources to learn more about AI in the classroom:
AI4K12.org: This website offers professional development opportunities specifically tailored to educators interested in incorporating AI into K-12 and higher education curricula. https://ai4k12.org/
Hands-On AI Projects for the Classroom provides educators with a variety of activities to teach students about AI, as well as an AI Ethics Guide, all for free. https://iste.org/ai
Melissa Parks, PhD, is an associate professor of elementary education at Stetson University in Deland, Florida. Dr. Parks is an active member of the National Science Teaching Association (NSTA) and is currently a member of the NSTA Early Childhood- Elementary Science Teaching Committee.
Mary Ellen Oslick, PhD, is an associate professor of literacy/reading education at Stetson University in DeLand, FL. She is an active member of the National Council of Teachers of English and currently serves as a member of the Children’s Literature Assembly’s award selection committee for the Notable Children’s Books in the Language Arts (NCBLA).
References
Bender, Stuart Marshall. 2024. “Awareness of Artificial Intelligence as an Essential Digital Literacy: ChatGPT and Gen-AI in the Classroom.” Changing English, 1–14. https://doi.org/10.1080/1358684X.2024.2309995.
Benson, Alayne. 2023. “The Future of AI in Education: AI Classroom Partners.” XRDS: Crossroads, the ACM Magazine for Students 29 (3): 30–35. https://doi.org/10.1145/3589646.
Cano, Yvette Mucharraz, Francesco Venuti, and Ricardo Herrera Martinez. 2023. “ChatGPT and AI Text Generators: Should Academia Adapt or Resist?” Harvard Business Publishing. https://hbsp.harvard.edu/inspiring-minds/chatgpt-and-ai-text-generators-should-academia-adapt-or-resist
Naidu, Edwin. 2023. “Leading Academics Believe Fears over ChatGPT Are Misplaced.” University World News. https://www.universityworldnews.com/post.php?story=20230222071308123
Shippee, Micah. 2020. “No Brainer: AI in the Classroom.” Teach, September, 20–21.
van den Berg, Geesje, and Elize du Plessis. 2023. “ChatGPT and Generative AI: Possibilities for Its Contribution to Lesson Planning, Critical Thinking and Openness in Teacher Education.” Education Sciences 13, no. 10: 998. https://doi.org/10.3390/educsci13100998