Retreat Recap | AI: Teaching & Tools (Feb 2, 2024)

On February 2, 2024, over 230 instructors, staff, and graduate students gathered in the DeLuca Forum and on Zoom for the Teaching Academy’s 2024 Winter Retreat – AI: Teaching & Tools. Seventeen speakers from a wide range of disciplines shared stories and experiences illustrating the diverse ways in which AI is already being used to support teaching and student learning.

Missed the event? Want to look back?

Stories of AI & Teaching

  • “Expect more from students with AI” | Wendy Fritz, School of Business, shared her positive experience integrating AI into her teaching and encouraging responsible student use. She highlights AI’s time-saving benefits in creating quizzes and rubrics while emphasizing the importance of policy adherence. Fritz suggests using Bing Chat for critical evaluations and recommends an article by Ethan Mollick. She concludes by urging educators to design assignments that challenge humans rather than tasks easily handled by AI.
  • Building and Training AI Models for the Classroom | Kaiser Pister, Computer Science, discussed his two-year experience integrating AI in teaching. He received a microgrant to explore belonging in computer science classrooms and used transcriptions of his lectures to build a Piazza Bot that answers student questions based on relevant lecture content. Pister also uses GPT to create new exam questions, lets students generate their own practice questions, and incorporates GPT-generated jokes to add humor to the class.
  • Using AI to improve students’ learning experience | Andy Kuemmel, Computer Science, discussed integrating AI in CS502 to support student learning. He employs ChatGPT for after-lecture practice and has students generate their own practice questions. Kuemmel explores using AI to interpret complex programming assignments, seeks exam preparation ideas, and aims to make lectures more engaging. He emphasizes aligning assignments with learning objectives and providing clear expectations when incorporating AI, and he encourages in-person discussions on what it means to be a good tutor in the AI era.
  • Using AI to personalize learning | Lauren Rosen, Collaborative Language Program, shared how she harnesses AI for personalized learning. She uses ChatGPT to create choice boards for diverse summative assessments and tailors content to different language proficiency levels and to students’ varied disciplines and interests. Rosen emphasizes AI’s role in providing alternative explanations, fostering student agency, and promoting reflective learning through seeking corrections and self-reflection. She advocates for personalized learning to unlock each learner’s potential and cultivate a love for inquiry and learning.
  • Bias & Stereotypes in AI Output | Emily Hall, Writing Across the Curriculum, acknowledged the creative uses of AI in education but raises a cautionary note about perpetuating biases in generative AI output. She highlights biases rooted in training data, exemplified by gender stereotypes in ChatGPT-generated recommendation letters. Hall emphasizes linguistic bias, the exclusion of nonstandard dialects, racial bias, and the potential for AI to generate biased comments when asked to assume specific personas. She urges educators to be vigilant about bias in AI output, teach students about AI’s limitations, and encourage critical thinking in AI interactions.
  • My AI Tutor | Tony Orzechowski, Engineering Professional Development, introduced “My AI Tutor,” a ChatGPT-integrated application focused on personalization and learning equity. The tool tailors responses based on student bios, incorporates a low-stakes testing feature, and offers detailed feedback. AutoGen facilitates collaborative exploration, and a companion platform simplifies content population for instructors by translating course materials into a vector database, which helps address bias concerns. Orzechowski echoes previous speakers in emphasizing relevance, domain-specific language, and pedagogical best practices. The first part of the platform has launched, with additional features forthcoming.
  • Using AI in Statistical Analysis | Syed Abdul Hadi, Wisconsin Center for Education Research, discussed AI applications in English language proficiency testing. Highlighting the importance of test forensics, he describes using pseudo-AI programs to identify anomalies in students’ test behavior, flagging potential issues like preknowledge or item compromise. Another use involves simulating student performance with AI to predict outcomes based on changes in test difficulty or content. Hadi also describes a personal project on automated essay scoring that uses fine-tuned large language models to assess essays against human-scored data, aiming to reduce bias and ensure grading consistency.
  • Helping Students Learn to use AI | Roey Kafri, Curriculum and Instruction, shared his approach to teaching students about ChatGPT, framing it as a tool like other potentially dangerous tools in maker education. He outlines three key questions: what the tool can do, where it can cause harm, and how to avoid misusing it. Kafri describes a course for pre-service teachers in which students engage with ChatGPT by creating different iterations of stories, poems, jokes, and more. The focus is on making the learning experience fun and iterative, encouraging students to experiment responsibly with the tool.
  • AI-Assisted Coding in STEM Education | Duncan Carlsmith, Physics, shared how he uses ChatGPT in teaching physics to undergraduates. He emphasizes the tool’s benefits in simplifying code generation, providing explanations, and supporting multilingual coding. Carlsmith demonstrates its efficiency in converting equations from scientific journal PDFs into executable code and invites further exploration through a shared link.
  • Student perceptions of AI in sustainability education | Andrea Hicks, Civil and Environmental Engineering, talked about using ChatGPT in teaching sustainability and engineering. She highlights the importance of questioning everything in both sustainability and AI tools. Hicks shares an assignment in which students analyze a response generated by ChatGPT and critically reflect on its strengths and weaknesses. The students’ reflections reveal diverse perspectives on AI, from appreciating it as a starting point to feeling cheated by the ease of obtaining answers. Hicks emphasizes the need for more research on the impact of AI in education.
  • Using AI in Job Searches | Nate Jung, Technical Communications Program, College of Engineering, shared two stories about his experiences using AI in the college classroom. In the first, a cautionary tale, the prompt for a rhetorical analysis assignment led to comical AI-written responses, and he acknowledges the importance of providing clearer guidance to students. In the second, Jung discusses a successful experiment in which students used ChatGPT to draft a technical report. He found that the technology was good at producing readable prose and genre templates but struggled with research and argumentation. Jung highlights the potential of AI to lower barriers and reduce anxiety during the initial drafting process.
  • Using AI to Prompt Socratic Questioning | Jan Miernowski, French & Italian, conducted an experiment in a literature class where students acted as Socratic tutors, using ChatGPT to challenge claims about Balzac’s novel “Le Colonel Chabert.” While the machine effectively prompted students to be more specific, it tended to loop and lacked the ability to generate new ideas or demonstrate critical thinking. Miernowski remains positive about the experiment, emphasizing that ChatGPT cannot fully replace the role of a Socratic questioner. Only one-third of students had ChatGPT accounts, suggesting concerns about widespread usage may be overstated.
  • Student Uses of AI | Debra Deppeler, Computer Science, shared her experience with the impact of ChatGPT on her computer science course. After observing unusual exam results and hearing students report varied use of ChatGPT, she investigated further. Over the summer, she learned more about ChatGPT and how prevalent it had become in her class. In the fall semester she implemented a policy requiring students to document its use, but it was largely unsuccessful. Now, in Spring 2024, she discourages its use, emphasizing its limitations in promoting skill development. Despite these cautionary tales, she recognizes the challenges of monitoring and regulating its use.
  • Goblin.Tools: An AI Tool to Support Neurodivergent Humans | Dan Pell, Center for Teaching, Learning and Mentoring, presented Goblin Tools, a set of single-task tools designed for neurodivergent individuals. The tools include Magic ToDo, Judge, Estimator, Compiler, Formalizer, and Chef. Magic ToDo helps break tasks down into manageable steps, and the Formalizer adjusts the tone of a text. Dan demonstrated Magic ToDo and the Formalizer, emphasizing their potential for supporting neurodivergent students, and noted that the tools work across languages.
  • Using AI to overcome challenges in job searching | Adam Gratch, Law School Office of Career and Professional Development, discussed the practical use of generative AI in law school career coaching. To help stressed students overcome writer’s block when writing cover letters, he has them list three accomplishments, uses ChatGPT to generate a basic cover letter template, and then guides students to edit and personalize it. This approach helps students efficiently overcome the stress of crafting application documents.
  • AI Classification in Diagnostic Medical Imaging | Kaitlin Sundling, SMPH Pathology, presented a workshop on cancer diagnosis using machine learning. Trainees classify cells visually, learning concepts like supervised learning and algorithm training. Sundling emphasizes patient privacy and notes the workshop’s success in hands-on AI learning for healthcare applications.

Thank you to our speakers! — John Zumbrunnen, Vice-Provost for Teaching & Learning, Wendy Fritz, School of Business, Kaiser Pister, Computer Sciences, Andy Kuemmel, Computer Science, Lauren Rosen, UW Collaborative Language Program, Anthony Orzechowski, Engineering Professional Development, Emily Hall, Writing Center, Syed Abdul Hadi, Wisconsin Center for Education Research, Roey Kafri, Curriculum & Instruction, Duncan Carlsmith, Physics, Andrea Hicks, Civil and Environmental Engineering, Lynne Cotter, Journalism & Mass Communication, Nathan Jung, Technical Communications Program, Jan Miernowski, French, Debra Deppeler, Computer Science, Dan Pell, Center for Teaching, Learning and Mentoring, Adam Gratch, Law School, Kaitlin Sundling, Pathology and Laboratory Medicine

Thank you to the Volunteers and Planning Committee! — Angela Kita (Co-Chair), Christine Rybak, Dan Pell (Co-Chair), Franklin Hobbs, John Martin, John Parrish, Karen Hershberger, Karin Spader, Kelly Copolo, Lisa Jong, Mary K. Thompson, Peter Van Kan, Tim Dalby

Questions about the Retreat? Want to help plan? Contact Angela Kita & Dan Pell at retreats-uwta@g-groups.wisc.edu