Generative AI


This page will explore the topic of generative AI, including its impact on the classroom as well as potential instructional uses for generative AI tools. 

We recognize that teaching approaches differ across disciplines and do not assume that all faculty will incorporate generative AI, or that those who do will do so in the same way. However you choose to engage (or not) with generative AI in your courses, it is important to do so thoughtfully and strategically, and we hope this page helps in that regard. The toolkit includes a wide range of approaches and strategies for you to consider incorporating into your courses. It is not meant to be a ‘how-to’ guide for integrating generative AI but a resource to help you think through and decide which options are best suited for your teaching context. We encourage you to keep your own teaching approach and context in mind as you review the strategies shared in this document.

If you would like to discuss strategies for your specific context, please reach out to your Instructional Designer or drop into the TLC Drop-In Hours. If you would like to share your experiences with AI, please complete the Experiences with Generative AI Survey.

What is generative AI?

Generative artificial intelligence (“AI”) tools use machine learning models to create new, human-realistic content in response to a prompt. The tools are readily available to students, and just as previous educational technologies have done, AI tools are changing educational, learning, and knowledge practices. Given a carefully worded prompt, these tools can compare texts, mimic critical thinking, and generate content that reads like a personal narrative (Mills, 2023). It is no wonder that academics are spending so much time thinking about and discussing the implications of generative AI for teaching and learning.

Much of the concern around ChatGPT, Bing AI, Google Bard, and other generative AI tools has centered around the challenge of students using them to write and submit their assignments. How will we be able to assess student work to support their learning if they are not submitting work that they are producing themselves? How do we teach students core skills like critical thinking and communication with generative AI tools so easily accessible? These are some of the questions that faculty in the US and beyond have been pondering. One of our responsibilities is to teach students how to engage with these tools in responsible and ethical ways while also being critical of the ways these tools influence our world.

As you consider your approach, we urge you to remember that we are preparing students for a world that already has generative AI. Students may already be using generative AI tools in a variety of ways that support their learning, such as:

  • Research support/project inspiration: Sending prompts on assigned research topics to receive various perspectives on the subject and related concepts.
  • Thought partner/learning partner: Engaging with AI to explore complex concepts or academic material, find alternative perspectives on a specific topic, and obtain simplified explanations and summaries of challenging academic concepts.
  • Writing enhancement: Submitting prompts or assignment drafts to receive feedback and suggestions for improving sentence structure, grammar, and language usage.

How are faculty approaching AI in their teaching?

The Teaching and Learning Center has identified four levels of use for your consideration. You might consider including a statement on your syllabus that communicates to students your approach to generative AI in the course. If you would like help with drafting such a statement, please reach out to your Instructional Designer or to the Teaching & Learning Center for assistance.

  • Free Use: Generative artificial intelligence (AI) tools may be used freely and without acknowledgment throughout the course. Provide an explanation to students for why you are allowing the use of generative AI in the course. Work with students to help them understand the boundaries and suitable applications of these tools to ensure their responsible and ethical use in accordance with the course objectives and guidelines.
  • Use with Acknowledgement: Generative artificial intelligence (AI) tools may be used throughout the course, but students are required to provide citations or acknowledgment when utilizing these tools in their assignments, discussions, or any other course-related activities. Proper acknowledgment ensures academic integrity and demonstrates understanding of the source and influence of generative AI tools on their work. Explain to students your rationale for this approach and how you would like them to cite or acknowledge tool use. See VIN University Libraries for examples of citations. Please note that current AI detectors are not accurate, and faculty should be careful not to accuse students of AI misconduct without sufficient evidence.
  • Use with Permission: Generative artificial intelligence (AI) tools may be used with prior permission in select scenarios or assignments, as communicated by the instructor. In these situations, students must clearly cite or acknowledge the use of these tools using the specified format, such as including the tool name and source in their work. The rationale for allowing or disallowing these tools will be explicitly explained to students, helping them understand the limits and appropriate uses of generative AI tools in the context of the course. Examples of appropriate use and citation guidelines will be provided to ensure students comply with the requirements and maintain academic integrity. Please note that current AI detectors are not accurate, and faculty should be careful not to accuse students of AI misconduct without sufficient evidence.
  • No Use: In situations where you believe it is crucial for students to create their own work independent of generative AI tools, be explicit in articulating this prohibition as well as explaining why it is in place. As you consider this approach, recognize that it is not possible to reliably detect AI-generated content. The number of generative AI tools is growing, and it is becoming easier for individuals to create their own customized tools. Detection tools are in a perpetual race to remain current and thus far have not been proven to be reliable. If you plan to use an AI detecting tool, be aware of the risk of harm a false positive can create for a student.

How can I use generative AI in my classroom?

Feedback Provider: Students load their work into the AI tool and ask for feedback. This can be a guided exercise for students or one they do independently. Instructors can also submit student work to an AI tool for initial feedback ideas as part of the assessment process. (Source: Brent Anders, 2023; DitchthatTextbook)

Enhance “Think, Pair, Share”: After asking students to pair and share their initial thoughts on a question, have them submit their idea to, or ask the question of, an AI tool. Students then review the response together to see how it compares with what they originally generated before sharing with the class. (Source: Sarah Dillard)

Grade the AI: Students take an AI created item and provide a grade with feedback on how well the AI generated product meets a set of criteria. (Source: Matt Miller, DitchthatTextbook)

Outwrite the AI / Anticipate the AI: Provide students with a prompt and ask them to write a better response than the one generated by the AI tool. Share the AI generated response and ask students to articulate how their writing was better. Alternatively, ask students to anticipate how the AI will respond to a given prompt. (Inspired by Karen Costa)

Example Generator: Use the AI to provide an unlimited number of concept or application examples. Students can then “compare across different contexts, explain the core of a concept, and point out inconsistencies and missing information in the way the AI applies concepts to new situations.” (Mollick & Mollick, 2022) 

Debate the Bot / Hold a Conversation: Utilize the AI tool as one side in a debate and have students take the other side. This can be done either individually or as a whole class. Alternatively, students can engage in a conversation with the AI tool on a specific topic. (Source: Matt Miller)

Personalize Learning Plans: Students ask the AI tool to create a learning plan specific to their needs and conditions. Alternatively, students could also request study materials from an AI tool.

Align Assessments to Course Goals: AI tools may expose assessments that are misaligned to learning goals or lack relevance to students. As you review your assessments, be sure you can articulate how the assessment provides evidence in support of the student learning goal and be able to articulate a clear rationale for why or how the assessment is going to benefit a student.

Personalize Writing Prompts: Ask students to make connections between or apply their own personal knowledge and experiences to course concepts and topics. (Source: Practical Responses to ChatGPT)

Use Genre-Based Assignments: Engage students’ motivation and curiosity with genre-based assignments that inspire creativity and reflection, for example artist statements, public service announcements, or turning-point essays. (Source: How to Create Compelling Writing Assignments in a ChatGPT Age)

Cite Specific Material or New Information: Ask students to reference or connect their work to current events or conversations in the discipline or use specific references from course materials like notes, readings, or other information sources that are not available on the free internet. Currently, ChatGPT only uses information through 2021. (Source: Artificial Intelligence Writing)    

Ask for Creative Outputs: Incorporate visual or auditory components in the assessment, for example infographics, memes, presentations, graphs, charts, diagrams, drawings, podcasts, audio responses, or videos.

Use a Flipped Class Approach: Students engage with course content outside of class and then apply that information in class. For example, have students write brief in-class responses to questions you might otherwise have assigned as homework, then engage in peer review of their writing.

Assign a Handwritten Exercise: While this has drawbacks (namely, hard-to-read handwriting and paper management), asking students to write out information by hand may help them better remember the content.

Check Your Writing Prompts: Load your assessment prompts into an artificial intelligence tool and assess the output. If the AI response would earn an adequate score, adjust your prompt.  

Try Social Annotation Options: As a replacement for short writing prompts related to readings, ask students to use social annotation tools to engage with a text alongside their peers. You can use WCU-provided VoiceThread, Google Docs, or Microsoft Word via OneDrive to conduct this type of activity. 

Fully Incorporate AI into a Writing Assignment: Allow AI writing tool assistance for an assignment. Ask students to submit prompt(s) used, original AI tool output, and a document with tracked changes showing how the student added depth, clarified misinformation, added alternative perspectives, and any other improvements. (Source: Brent Anders)

Tools and Resources

Turnitin AI Writing Detection Tool

What is Turnitin?

Turnitin is text-matching software available through WCU’s learning management system (D2L) that offers tools to support academic integrity. When an instructor enables Turnitin in a D2L course, student assignments are submitted to Turnitin, which then matches text within the assignment against existing text in a database of previously submitted papers, internet sources, journals, and other publications. Turnitin generates a ‘Similarity Report’ that provides a summary of text that has matched with another source. All submitted assignments are added to Turnitin’s global database for text-matching with future submissions. It is important to note that Turnitin only checks for text similarity, not for all forms of plagiarism. As Turnitin explains, it is more accurate to think of their tool as a ‘text-matching’ or ‘similarity-checking’ tool rather than a ‘plagiarism-checking’ tool. Text is flagged by the tool when it matches against other text in their database.
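Turnitin's actual matching algorithm is proprietary, but the general idea of text-matching can be illustrated with a toy sketch: break each document into overlapping word sequences (n-grams) and measure what fraction of a submission's n-grams also appear in a source. The function names and the five-word n-gram size below are illustrative assumptions, not Turnitin's method.

```python
def ngrams(text, n=5):
    """Split text into a set of overlapping word n-grams."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(submission, source, n=5):
    """Fraction of the submission's n-grams that also appear in the source."""
    sub = ngrams(submission, n)
    if not sub:
        return 0.0
    return len(sub & ngrams(source, n)) / len(sub)
```

A submission identical to a source scores 1.0, unrelated text scores 0.0, and partially overlapping text falls in between; real similarity checkers add normalization, indexing at scale, and quotation handling on top of this basic idea.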

Turnitin recently rolled out an ‘AI Detector’ tool to help faculty identify instances of academic dishonesty where students submit AI-generated material for course assignments. The ‘AI Detector’ tool tries to match submitted text against AI-generated text. Details on how Turnitin’s AI Writing Detection works are provided on the company's FAQs. The challenge is that it is difficult for this or any tool to reliably detect AI-generated text because AI language models are designed to mimic human-generated text.  
There has been considerable concern expressed by faculty and institutions in the US and beyond about the potential impact of the AI Detector tool on courses, student learning, and the academic environment. Some institutions have made the decision to opt out of the tool because of these concerns. Here are a few examples: 

  • Lancaster University (UK)’s VP of Education released a statement calling on the university to turn off Turnitin AI Detection, citing students who were unjustly reported for academic misconduct on the basis of Turnitin reports.
  • University of Michigan-Dearborn has opted out of the tool citing concerns about protecting students’ digital rights.
  • Colorado State University paused the rollout of Turnitin AI Writing Detection Tool because of the potential impact of the tool on teaching and learning.
  • University of Pittsburgh’s Teaching Center disabled the AI detection tool in Turnitin concluding that “use of the detection tool at this time is simply not supported by the data and does not represent a teaching practice that we can endorse or support.”
  • UCLA opted out of Turnitin's AI Writing Detection because of concerns and unanswered questions.

It is apparent from these examples that there are serious legal and academic ramifications associated with using the Turnitin AI Writing Detector. It is risky to use the AI detection report as the sole basis for determining that a student has violated academic integrity because it is not wholly reliable.

Turnitin at WCU

The AI Writing Detection tool is currently enabled at WCU. The Teaching and Learning Center (TLC) and Information Systems & Technology (IS&T) urge faculty who may choose to use this tool to do so with caution. Consider the following recommendations for using the tool ethically and responsibly:

  • Err on the side of caution, because the Turnitin report is not completely accurate.
  • Make comparisons to students’ previously submitted work.
  • Talk with the student about the work that you are questioning and give them an opportunity to share their process for completing the assignment.
  • If you suspect that the student used AI, consider giving them a chance to re-do the assignment.
  • Keep in mind that as AI continues to advance, differentiating between AI-generated and human-generated content will only become harder over time.

Conversations at WCU

Check our Events Calendar for upcoming workshops and events to get involved with conversations about generative AI already happening at WCU, or fill out the Experiences with Generative AI Survey to start another!

During the Fall 2023 semester, the Teaching and Learning Center conducted a webinar series on Teaching with AI.

Resources Curated by Prof. Andy Famiglietti

What is an LLM, and how does it work?

Collection of assignments engaging with text generation by WAC Clearinghouse

Advice document prepared by digital rhetoric professor Edward Schiappa and digital media professor Nick Montfort for their colleagues at MIT, which identifies some key issues for thinking about generative AI in the classroom

Siva Vaidhyanathan’s op ed about using ChatGPT as a “teachable moment”

John Warner’s “How About We Put Teaching at the Center”, another piece that succinctly explains a philosophy for approaching AI tools

For those interested in a more scholarly approach that delves into the larger questions of meaning, N. Katherine Hayles's “Inside the Mind of an AI: Materiality and the Crisis of Representation” is very good

Hayles's writing also complements pieces like Bender’s “Stochastic Parrot” essay quite nicely

Another resource of interest to those wrestling with the larger questions of meaning and AI text is the “Again Theory” forum Matthew Kirschenbaum organized for Critical Inquiry

For even more resources, check Prof. Famiglietti's Zotero library

Downloadable Resources