Guidelines and resources related to the use of AI tools in writing instruction/writing assignments.
Guidelines for all faculty (whether you allow use of AI tools for writing assignments or not)
1. On your syllabus, assignments, and in conversation with students, foreground the value of engaging in the act of writing–of learning to make choices about how to communicate in particular situations, with particular purposes, to particular audiences. A key part of a university education is developing understanding and control of the rhetorical practices of the disciplinary communities of one’s major and prospective career. Such understanding develops through individual engagement in the activity and practice of writing over time.
2. Even if your course is not a Vol Core Written Communication (“WC”) course, consider sharing the university’s approved statement about the value of writing with your students: “Good writing skills enable students to create and share ideas, investigate and describe values, and record and explain discoveries—all skills that are necessary for professional success and personal fulfillment. Students must also be able to write correctly and engage in a productive writing process that includes drafting, feedback, and revision. They also must be able to locate relevant information, evaluate its usefulness and quality, and incorporate it logically and ethically to support ideas and claims for different audiences and purposes.”
3. Become familiar with how LLMs work, what generative AI is, and the opportunities and risks associated with using it.
- Complete the AI Guide offered by Harvard’s AI Pedagogy Project
- See the MLA-CCCC Joint Task Force on Writing and AI Working Paper, pp. 5-10
- See Dobrin, Talking about Generative AI, pp. 4-8
- Some benefits of LLMs for writers include:
- They can help writers get started and brainstorm–help to “stimulate thought and develop drafts that are still [one’s] own work and to overcome…obstacles to tackling invention and revision” (MLA-CCCC Joint Task Force on Writing and AI Working Paper, p. 9).
- They can help with the process of research, such as with identifying useful keywords to find relevant information on a topic.
- They can assist students for whom English is an additional language in learning writing conventions like vocabulary and sentence structure.
- Some risks of LLMs include:
- AI output text is known to include made-up information that sounds very plausible.
- AI output text is known to include gender, racial, and language biases and biases against particular viewpoints that can be hard for students to detect.
- Students’ opportunity to engage in valuable writing, reading, and thinking practice that helps them develop as learners and thinkers is diminished if they simply “push a button” to generate text and submit it as their own work. (Doing that would also be against their commitment to academic integrity.) AI tools merely assemble pre-existing (and often incorrect, biased) text that does not demonstrate students’ ability to make choices about how to write for a particular purpose, to a particular audience, in a particular situation.
4. Create an AI account and practice with it. In particular, enter your assignment prompts a few times to see what the AI output is. (Even if you don’t allow your students to use AI tools, know what AI output for your assignments looks like; one scripted way to sample outputs is sketched after this list.) Here are instructions for creating some commonly used generative AI accounts:
5. Decide which suggested course syllabus statement (from the Office of the Provost) you will use.
6. Add the Honor Statement to your syllabus.
- Recommended: Define undocumented or uncited AI tool use as “inappropriate assistance.”
7. Don’t rely on AI detectors to draw conclusions about AI use or make academic integrity decisions. Sadasivan et al. (2023) conclude that “state-of-the-art detectors cannot reliably detect LLM outputs in practical scenarios” (p. 20). This position is shared by many, including Wharton Professor Ethan Mollick, a respected analyst on AI and writing.
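For instructors comfortable with a little scripting, the sketch below shows one way to carry out the practice suggested in item 4: run an assignment prompt through a model API several times and compare the outputs. This is a minimal illustration, assuming the OpenAI Python SDK (v1 or later), an OPENAI_API_KEY environment variable, and access to a chat model; the model name and prompt are placeholders, and pasting the prompt into a chat interface works just as well.

```python
# Minimal sketch: sample an assignment prompt a few times via the OpenAI API.
# Assumes the `openai` package (v1 or later) is installed and the
# OPENAI_API_KEY environment variable is set; the model name is a placeholder.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Placeholder prompt; paste in your actual assignment prompt.
assignment_prompt = (
    "Write a 600-word essay analyzing how the author's use of setting "
    "shapes the argument of the assigned reading."
)

for trial in range(3):  # a few samples show what typical output looks like
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; substitute a model you have access to
        messages=[{"role": "user", "content": assignment_prompt}],
    )
    print(f"--- Sample {trial + 1} ---")
    print(response.choices[0].message.content)
```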
Guidelines for faculty who allow use of AI tools for writing assignments
Talk with your students about AI use. At a minimum, foreground the value of writing; ask students to commit to using AI to assist their work, not to replace it; and aim for transparency, describing when and how students may use AI for writing assignments and asking them to “show their work” by documenting their AI use. Use some of your class time to explore uses of AI tools, critique their output, and reflect on how and when using them may assist some reading and writing processes. Consider sharing the information and guidelines we offer to students regarding the use of AI tools in writing assignments.
- On your syllabus and in conversation with your students, foreground the value of writing–that “writing is an important mode of learning that facilitates the analysis and synthesis of information, the retention of knowledge, cognitive development, social connection, and participation in public life” (MLA-CCCC Joint Task Force on Writing and AI Working Paper, p. 4). (Also see item 1 in the collapsible section above, “Guidelines for all faculty”.)
- If you’re using the “Moderate Use Guidelines” from the Provost’s suggested syllabus statements, decide on the following and add your decisions to your syllabus, aiming for transparent use that supports the learning goals of your class.
- How may students use AI-enabled tools?
- Emphasize the use of AI tools to assist in the research and writing process rather than replace it.
- Consider your learning goals: For example, when the goal is for students to demonstrate comprehension, original analysis, or subject mastery, using AI-generated text could undermine academic integrity. When the goal is effective communication, persuasion, or polished writing, using AI can help students efficiently create original, high-quality work*. (*Adapted from this LinkedIn post by John Nash, Associate Professor at the University of Kentucky and director of its Laboratory on Design Thinking, on the distinction between using AI-generated text and plagiarism.)
- During what parts of the writing process can your students use AI tools? For example, during the invention/brainstorming process? To find source material? To understand source material? To create an outline for a paper? To assist with revising or editing a draft? Are there AI tools you do or do not want your students to use? (Keep in mind that many free AI-enabled tools were in use before OpenAI’s generative AI tool ChatGPT appeared, such as Grammarly and Quillbot, among others.)
- How should students document/cite their use?
- MLA and APA have come out with formats for citing AI-generated text:
- If a citation format for your field does not exist, consider adapting the MLA or APA formats for your own use.
- As recommended by the Provost’s Office syllabus statement guidelines, ask students to add a description of how they used an AI tool during the writing process.
- Also recommended: Ask students to explain how they fact-checked the accuracy of the AI output.
Below are recommendations for additional approaches to consider:
- Break the overall assignment into scaffolded parts (such as reading related to the topic, creation of a draft thesis or main-point statement, creation of an outline, creation of a first draft, and revision of a draft using feedback) and add due dates for these on your syllabus. Engaging in a guided process will help students avoid the temptation to prompt an AI to write a paper at the last minute before it’s due.
- Assign some type of credit in your grading scheme to students’ completion of the scaffolded parts of the writing process.
- Giving credit communicates a message about the value of active engagement in the writing process to the development of knowledge, critical thinking, and writing skills.
- It also allows you as the instructor to more easily see the development of the student’s work–and, if possible, to provide feedback during the drafting stage–before receiving the final version.
- Credit may be completion points rather than A-F, as you think best.
- Since there isn’t any way to completely “GPT-proof” assignments, consider the extent to which your assignments provide meaningful experiences and opportunities for learning and engage students in problem-based learning–and discuss the learning goals for your assignments with your students.
- What meaningful writing means for students (Eodice et al., 2017)
- Aim to assign writing that encourages students to communicate for a particular purpose (inform, persuade, change minds, prompt action) to a particular audience, in a particular situation and context.
- See this resource from Cornell’s Center for Teaching Innovation on designing problem-based learning activities in your classes.
- After entering your existing prompt into ChatGPT (or another AI tool), analyze what it does well and what it’s lacking. (Typically, the output text will lack analysis or will not be entirely consistent with what the prompt asked for–even though it generally will follow the expectations for acceptable academic essays, e.g., coherent, organized, free of grammatical errors.)
- Consider: how might you revise your prompt to emphasize analysis and synthesis?
- Share this exercise with your students (which will show them you’re aware of what the AI output is for your prompt), and, most importantly, ask them to analyze the AI output to identify its useful features and what it lacks in terms of the analysis and/or synthesis your prompt requires–and challenge them to do better than the AI.
4. Help students know when and how to prompt.
- Studies and anecdotal reports suggest that most students will use AI to help them get started on projects, develop or expand parts of projects (e.g., find additional supporting resources), and improve an existing draft.
- It may be useful to provide–or experiment with your students on creating–prompts for these parts of the writing process for your assignments.
- For example: “I’m writing a paper on [topic] and need help getting started. What are five good research questions for someone who’s interested in [major]?” (Of course, adapt as appropriate for your assignment.)
- Recommend use of closed prompting for AI queries in assignments when appropriate. Closed prompting means asking the AI to draw only on particular sources when responding; a sketch of such a prompt follows this item.
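The following is a minimal, illustrative sketch of a closed prompt built around a course reading. The file name, wording, and structure are assumptions to adapt, not a required format; the assembled prompt can be pasted into a chat interface or sent through an API as in the earlier sketch.

```python
# Minimal sketch of a "closed" prompt: the AI is asked to draw only on the
# supplied source text, not on outside information. The file name and wording
# are illustrative assumptions; substitute your own reading and task.
with open("assigned_reading_excerpt.txt", encoding="utf-8") as f:
    source_excerpt = f.read()

closed_prompt = (
    "Using ONLY the source text below, and no outside information, identify "
    "the author's main claim and two pieces of supporting evidence. If the "
    "answer is not stated in the source text, say so.\n\n"
    "SOURCE TEXT:\n" + source_excerpt
)

print(closed_prompt)  # paste into a chat interface, or send via an API call
```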
5. Teach students how to fact-check AI output.
6. When reviewing/assessing students’ submitted work:
- Don’t rely on AI detectors to draw conclusions about AI use or make academic integrity decisions. The research about this is still emerging, but there’s good reason to be cautious. Sadasivan et al. (2023) conclude that “state-of-the-art detectors cannot reliably detect LLM outputs in practical scenarios” (p. 20); this position is shared by many, including Liang et al. (2023) and Wharton Professor Ethan Mollick, a respected analyst on AI and writing.
- If you receive papers that lack the analysis and synthesis you’re seeking–ask students to revise until their work meets your standards. In other words, don’t wave it on through if it seems like a good academic essay but, upon review, doesn’t do what you’re asking or expecting.
7. Create assignments that ask students to reflect on their use of AI tools. Reflection contributes to students’ metacognitive development of themselves as writers and helps to develop their critical AI literacy. Especially, ask students to reflect on the rhetorical choices they made as composers of your assignment–given the situation, purpose, and audience, why did they make the choices they did to create their text? (Remember to assign credit in your grading scheme for such reflection[s].)
- See these resources on metacognition from Vanderbilt’s Center for Teaching.
(In the sections that follow, most resources refer to ChatGPT, though the strategies discussed will work with most tools that produce AI-generated writing.)
Reconsider the design of your assignment prompts
Title | Summary | Author(s) and notes |
--- | --- | --- |
AI Can Save Writing by Killing “The College Essay” | One way that instructors can try to circumvent students’ use of ChatGPT is to focus more fully on teaching writing, including creating assignment prompts that require a research component, teaching other types of writing/genres, and moving away from the generic “college essay” toward more creative assignments. | Steven Krause |
Freaking Out About ChatGPT | Concern over ChatGPT revolves around its ability to respond to particular, oftentimes straightforward, prompts quickly and with passable accuracy. One solution is thus to step outside of traditional modes of assessment and conceive of writing assignments that emphasize metacognition and the writing process, not just the product. | John Warner |
What are we doing about AI essays? | ChatGPT is limited in the information that it can analyze, so two suggestions for instructors are to have students write using information that would only be accessible from their course (i.e., that ChatGPT hasn’t been programmed to respond to) and to incorporate multimodal or multimedia elements into assignments such as essay writing. | Miriam Bowers-Abbott |
Critical AI: Adapting College Writing for the Age of Large Language Models Such as ChatGPT: Some Next Steps for Educators | Ways that instructors can address AI in their classrooms and take preventive measures against plagiarism include but aren’t limited to assigning analyses of non-textual sources, assignments that draw on in-class discussion or activities, and reflective assignments. Instructors may also expand upon existing assignments, assign in-class writing, require 1:1 conferences, and analyze longer works while also highlighting the merits of students working through the writing process. | Anna Mills and Lauren Goodlad* |
“Students are using AI to write their papers, because of course they are” | Instructors can motivate students against using ChatGPT by designing assignments that emphasize the writing process over performance, are unique to the class (i.e., relate to specific class content, have personal components), rely less on high-stakes projects, and generally reconsider writing assessment strategies. | Lori Salem and Stephanie Fiore |
Embrace the Bot: Designing Writing Assignments in the Face of AI | Instructors can design assignments that help students mature as writers and that would be difficult for ChatGPT to respond to well. Students can write about topics too specific for ChatGPT’s coarse-grained responses or write about personal experiences and beliefs. In lieu of written work, students can do presentations, collaborative in-class work, or writing in visual or multimodal genres. | Eric Prochaska |
If we are setting assessments that a robot can complete, what does that say about our assessments? | This source argues that whether or not ChatGPT can generate a response to a prompt isn’t a direct reflection of the prompt’s utility. Even though ChatGPT can respond to certain prompts, those prompts help students develop skills necessary for assignments that require more critical thinking. Along those lines, students’ process to complete an assignment is arguably more important to consider than the writing product. | Daisy Christodoulou* |
ChatGPT and AI Composition Tools | Instructors are encouraged to consider alternative assignment and course designs that avoid more rote tasks, like summarization, and favor tasks that center analysis and evaluation, include non-textual modes of communication, use information from class or personal experience, and privilege the learning process over the final product, among others. | Washington University in St. Louis |
Uplevel your prompt craft in ChatGPT with the CREATE framework | Using the CREATE framework may provide instructors with scaffolding to reimagine writing assignments. The framework calls for instructors to do the following when creating prompts: be clear when articulating the assignment and its intent, provide relevant information about it, provide examples, avoid ambiguity, tinker with the prompt, and continue to evaluate students’ work and thus the extent that pedagogical goals are being achieved. | Tom Barrett |
ChatGPT: students could use AI to cheat, but it’s a chance to rethink assessment altogether | Giving students opportunities to directly apply their academic interests, class content, and/or skills acquired in their majors to solve real-world problems–“authentic assessment”–both creates assignments that would be difficult to complete using ChatGPT and adds variety to the genres/ways that students may display knowledge. | Sam Illingworth |
Teaching Actual Student Writing in an AI World | Instructors may take measures to prevent students from using ChatGPT in writing, including but not limited to assigning content inaccessible to ChatGPT (e.g., personal experiences, reflections, text behind a paywall), handwritten work, and having students track their writing progress/process. | Kevin Jacob Kelly* |
New Modes of Learning Enabled by AI Chatbots: Three Methods and Assignments | Instructors can use ChatGPT to help students acquire and exercise skills that other assignments may gloss over, three being facilitating transfer by having students evaluate an example of negative transfer, providing text for students to “peer review,” and filling knowledge gaps that students then reevaluate and change as they learn more about said topic. | Ethan Mollick and Lilach Mollick |
Emphasize the writing process to discourage AI use
Title | Summary | Author(s) and notes |
--- | --- | --- |
ChatGPT and the AI Arms Race | PowerNotes is one tool that instructors can use to discourage ChatGPT use by encouraging students to participate in the parts of the writing process leading up to writing production, which ChatGPT bypasses and can’t replicate. Tools like PowerNotes can show that students have engaged with texts by annotating, coding, and citing them, emphasizing the steps leading up to writing production and offering instructors non-invasive methods for observing students’ processes. | Daniel Bloom |
What ChatGPT Means for How We Teach Writing | To exercise critical thinking skills and attempt to circumvent AI plagiarism, instructors can privilege the writing process and deemphasize the weight that they assign to the written product. Doing so allows students to better understand how they write, their motivations and inspirations for writing about different topics, and questions they have about the writing process, and to exercise metacognition. | Anne Bruder |
So, AI Ruined Your Term Paper? | Banning AI is futile, so instructors can take steps to discourage misuse of technologies like ChatGPT, such as having students work with text generated by ChatGPT, respond to assignments using multimodal/multimedia genres, and submit projects with annotations, and, on the instructor’s end, screening for valid citations. | James D. Basham, Angelica Fulchini Scruggs, and Eleazar Vasquez |
How do we prevent learning loss due to AI text generators? | Instructors can add weight to key moments in the writing process to make it more challenging for students to use ChatGPT. For example, instructors can require students to detail their writing process, do 1:1 conferences, write self-reflections, do in-class writing, and/or use different assessment strategies, like ungrading, that emphasize process over product. | Anna Mills* |
Request to Retest/Revise | This strategy is likely to be more effective for students who are self-motivated (and may thus miss the critical audience of students more likely to use ChatGPT), but giving students the opportunity to revise reinforces the importance of revision and metacognitive reflection. Providing revision opportunities may also allow instructors to examine their composition pedagogy and learn how to scaffold elements of the writing process that can uplift students toward stronger “final first drafts.” | Motivate Lab (Taken from Cynthia Alby’s compilation “Revamping Online Courses in Response to ChatGPT”) |
Inspiring Your Students to Write, Cite, and Avoid Plagiarism | Instructors may adopt an approach to writing assessment that asks students to demonstrate their knowledge as they move through the writing process. Emphasizing steps of the writing process that may be overlooked–such as class discussions on disciplinary terminology, understanding the assignment guidelines, spending time summarizing and paraphrasing, and thinking intentionally about audience and purpose–may help students realize the importance of completing scaffolding steps. | Matthew J. Samel* |
ChatGPT and its Use in Essay Writing Instruction | Instructors can use ChatGPT as a tool to help students scaffold their writing processes. Framing ChatGPT as a tool to provide examples, demonstrate conventions in academic genres, and provide project ideas is not the same as allowing students to use ChatGPT to write essays, and instructors can also take additional measures to “prevent” plagiarism like requiring collaborative work, using detectors, and emphasizing the limitations of AI for writing. | Brent Anders |
OK Computer: to prevent students cheating with AI text-generators, we should bring them into the classroom | Instructors may consider introducing ChatGPT/AI into classrooms as a tool. Pragmatically, AI writing programs are not predicted to become less advanced, less accessible, or obsolete, making banning and detecting usage challenging. Students may benefit from a more nuanced understanding of ChatGPT as a tool to enhance their writing processes, such as by completing heavy revisions of ChatGPT-generated drafts. | Grant Jun Otsuki |
Show students how to use and analyze AI writing tools
Title | Summary | Author(s) and notes |
--- | --- | --- |
How to cheat on your final paper: Assigning AI for student writing | Fyfe presents results from an in-class experiment in which students wrote an essay featuring chunks written by ChatGPT, concluding that the exercise gave students space to consider the implications and ethics of using ChatGPT in writing, as well as the value of their own writing. This study demonstrates the utility of directly confronting ChatGPT in a classroom with a unique assignment that has a metacognitive and self-reflective component. | Paul Fyfe |
GPT This! a writing assignment in the age of GPT-3 & other Large Language Models | This source describes an assignment that asks students to run a prompt through ChatGPT, then analyze the affordances and drawbacks/limitations of the text with which the program responds. This exercise, and others like it, force instructors to confront ChatGPT in classrooms while allowing students to gain practice doing a close reading (of the output) and hands-on experience using the technology to reflect upon it, like a meta-activity. | Mark Marino* |
ChatGPT and the rise of AI writers: how should higher education respond? | Instructors can take several approaches to addressing ChatGPT in classes. They can design in-class activities that have students use ChatGPT to answer research questions related to assigned readings, then have students compare the reading and the generated text in relation to the learning objectives. | Nancy Gleason* |
Embrace the Bot: Designing Writing Assignments in the Face of AI | Instructors can design assignments and in-class activities that deconstruct how ChatGPT writes. These can be in the forms of peer review, rhetorical and content analyses, revisions of generated text, and detailed analyses of the differences between human- and AI-generated text. | Eric Prochaska |
Incorporating ChatGPT into Your Assignments | Instructors may introduce ChatGPT into the classroom as a pedagogical tool with which students can generate responses to prompts, then critique and reflect critically on ChatGPT’s capabilities. Students may also be asked to revise generated text or offer suggestions for ways that ChatGPT’s output could be improved. | Washington University in St. Louis |
How to… use AI to teach some of the hardest skills | Two suggestions for integrating ChatGPT into writing assignments are (1) for students to critique and revise ChatGPT’s output when asked to explain a topic and (2) have students offer ChatGPT suggestions for improvement, “teaching” the AI in ways that are similar to peer review. | Ethan Mollick |
With ChatGPT, We’re All Editors Now | This source offers two suggestions: (1) instructors should understand that ChatGPT and other AI writing technologies won’t become obsolete, so (2) they are encouraged to use these tools in classrooms to facilitate practice editing, annotating, and fact-checking. | Rachel Elliot Rigolino |
Transform learning with AI | Instructors can have students prod ChatGPT output to acquire skills, such as learning the genre of academic writing, “collaborating” to produce creative works, and critically reflecting upon and critiquing the limits and ethical implications of using AI writing technologies. | Mike Sharples and Rafael Pérez y Pérez |
Teaching Students to Write with AI: The SPACE Framework | Instructors may follow the SPACE method for helping students responsibly and productively engage with AI/ChatGPT in writing. Instructors should set expectations for what students can input to AI, prompt AI to do specific tasks, assess the legitimacy of AI output, curate output to use and organize, and learn to edit it to correct errors. | Glenn Kleinman |
Discuss the ethics and implications of using AI tools for writing assignments
Title | Summary | Author(s) and notes |
--- | --- | --- |
Teacher and Student Guide to Analyzing AI Writing Tools | Instructors should make time for students to ask questions about how AI writing tools like ChatGPT work, their limitations, how the tools are trained, how users affect how the technology works, which communities AI writing tools benefit and harm, and rhetorical considerations of the output, such as credibility, reliability, biases, gaps in information, and its ability to cite sources. | Bob Maloy, Torrey Trust, Allison Butler, Chenyang Xu, and others* |
Prior to (or instead of) using ChatGPT with your students | Regardless of whether instructors allow students to use ChatGPT in their writing, they are encouraged to have conversations about the large-scale and privacy-related implications of using AI. Discussions and activities include but aren’t limited to data acquisition, the training of ChatGPT with free labor, AI’s carbon impact and economic cost, annotating OpenAI’s privacy terms, and understanding how media platforms collect personal data. | Autumm Caines* |
How Automated Writing Systems Affect the Circulation of Political Information Online | This research addresses the other side of AI writing: what happens when people consume writing by bots? It indicates a burgeoning need for discussions about civic and responsible engagement online when AI writing systems can populate the internet with fake and divisive content, and it has the potential to stimulate classroom conversation about what AI writing systems mean as a whole, how they can be used and misused, and rules of engagement. | Timothy Laquintano and Annette Vee* |
Teaching AI Ethics | To make AI (writing) tools more transparent, instructors should generate discussions with their classes about the ethical ramifications of AI use, including but not limited to “datafication” and bias, environmental impacts, plagiarism, art theft, and copyright infringement. Instructors can do in-class exercises and use examples from academic subjects to make the problems feel more concrete. | Leon Furze* |
AI Will Augment, Not Replace | Just as instructors should better understand ChatGPT to try to curb its uses in classrooms, students, who will inevitably encounter it, should have a better understanding of it as well. As a form of “digital literacy,” this teaching wouldn’t encourage students to use ChatGPT, but rather would help them grasp the pitfalls/dangers of AI and/or how to use it responsibly. | Marc Watkins |
Explore the use of AI-detecting tools
August 2023 note: The resources below were collected in Spring 2023 before many AI detectors were field-tested. At this time, we do not recommend relying on AI detectors to make academic integrity decisions.
Title | Summary | Author(s) and notes |
--- | --- | --- |
A college student created an app that can tell whether AI wrote an essay | GPTZero is an app developed to determine whether a piece of writing was created by a writer or generated on ChatGPT. The app looks for complexity (i.e., too complex = more likely human) and sentence variation (e.g., more sentence variation = more likely human). It also exposes some key characteristics of AI writing that human readers may look for. | Emma Bowman |
There’s a Problem With That App That Detects GPT-Written Text: It’s Not Very Accurate | Encourages caution when using GPTZero, describing some limitations of the tool. | Victor Tangermann |
AI Writing Detection: A Losing Battle Worth Fighting | AI-detection programs, like GPTZero, have been developed to catch AI plagiarism, and while they can be an asset, they aren’t foolproof. These programs can also be used to start conversations about AI plagiarism in classrooms, encouraging students to consider the implications of using such programs and pushing instructors to think about the best assignments for students to demonstrate knowledge. (Also touches on more holistic measures, like transparency with students and rethinking assessment/learning models.) | Susan D’Agostino |
How to Detect OpenAI’s ChatGPT Output | Instructors can feed assignment responses or papers through several AI-detection programs, such as OpenAI’s classifier, GPTZero, and DetectGPT. However, these programs can produce both false positives and false negatives, so if instructors elect to use detection programs to screen for AI, it’s recommended that they use more than one. AI also changes and updates quickly, so some of these programs will only be useful for older model versions. | Sung Kim* |
Sneak preview of Turnitin’s AI writing and ChatGPT detection capability | Turnitin, a company that instructors can use to screen for plagiarism, is working to develop technology that aims to detect use of AI in essays, similarly to how it detects plagiarism of internet materials (e.g., books, blogs, periodicals). The ChatGPT-detection model will be designed specifically for academic writing and will flag text it has analyzed that was or could have been generated by AI. | Annie Chechitelli |
Can Anti-Plagiarism Tools Detect When AI Chatbots Write Student Essays | ChatGPT is a burgeoning example of the strategies that students may use to cheat on writing assessments. Like other plagiarism checkers, programs are being developed to detect AI-generated text, such as work by Turnitin, but the rise of ChatGPT may also encourage instructors to take a proactive approach to curbing plagiarism by rethinking pedagogical practices and writing assignments. | Daniel Mollenkamp |
OpenAI’s attempts to watermark AI text hit limits | OpenAI, the company that developed ChatGPT, is attempting to create a “watermark” feature that embeds a signal in ChatGPT-generated text in order to make it more challenging for AI writing to pass as human-produced. However, it may not be difficult to get around the watermark with simple edits, like changing words or phrase structures. | Kyle Wiggers |
Explore policies and create a syllabus statement that works for your course
Title | Summary | Author(s) and notes |
--- | --- | --- |
Update your course syllabus for ChatGPT | Instructors should update their syllabi not just to discuss their expectations for students, ethics, and ChatGPT, but also to explain course objectives so that students have a stronger grasp of how their writing contributes to their learning. Instructors should also consider having students do more creative assignments, such as reflective pieces and pieces that have students work with ChatGPT. | Ryan Watkins* |
Using Generative AI in Coursework | Suggested guidelines for use of generative AI, created by Boston University’s Center for Computing and Data Sciences. | Boston University |
ChatGPT | This source offers some suggestions for addressing potential use of ChatGPT in classrooms, framing it as a form of plagiarism when used to write assignments but as a tool that could be used for different parts of the writing process per each instructor’s discretion. It also recommends addressing ChatGPT explicitly in classrooms, as opposed to avoiding it. | University of California, Irvine* |
Classroom Policies for AI Generative Tools | This source offers some examples of syllabus statements outlining expectations for usage of ChatGPT. Some policies do not have any restrictions on usage, while others include but aren’t limited to the following: students must clarify what parts of an assignment are ChatGPT-generated, usage must be cited, students will be penalized for incorrect information that ChatGPT generates, AI may be used only as a brainstorming tool, etc. | Joel Gladd* |
Rules for Tools | This page is an example of explicating instructors’ expectations regarding usage or non-usage of ChatGPT/AI in classrooms. While AI may be permitted in classrooms, students, per this policy statement, are responsible for their writing and thus any writing that ChatGPT generates, correct or incorrect, and must cite usage. | Prof. Dr. Christian Spannagel* |
Syllabus Resources: The Sentient Syllabus Project | This compilation of policy statements addresses the range of ways that instructors may make their expectations regarding ChatGPT usage or non-usage transparent. There are example statements for issues related to the fact that AI is constantly evolving, assessment, performance (i.e., not limited to textual modes), plagiarism, and rubrics, depending on the instructor’s position on AI and pedagogical goals. | Boris Steipe* |
Guidelines for the Use of Artificial Intelligence in University Courses | Regardless of instructors’ policies around ChatGPT usage or non-usage, instructors may find it beneficial to explain the reasoning behind their decisions, clearly outline their pedagogical goals, and state the extent to which ChatGPT enhances or undermines them. If instructors do permit AI to be used, they may also outline the types of AI that are acceptable (e.g., translation tools, ChatGPT), when, and with what documentation. | Juan David Gutiérrez* |
A Note About AI (ChatGPT and Other Tools) | Policies around ChatGPT use or non-use should make transparent the instructor’s reasons for the policy and the extent to which ChatGPT aligns with or undermines their pedagogical goals. Rather than simply stating a policy, it may be more effective for curbing reliance if students understand how ChatGPT may impact their learning. | Whitney Gegg-Harrison* |
Classroom Policies for AI-generated tools | A collection of policy/syllabus statements about AI use in various courses at schools across the country. | Lance Eaton |
Course Policies Related to ChatGPT and Other AI Tools | Instructors who permit use of ChatGPT should make clear in their syllabi that there are limitations to ChatGPT, that usage should be governed by ethical principles and should not prevent learning, and that students must acknowledge and accept responsibility for their usage. | Joel Gladd |
Also see Teaching and Learning Innovation’s page on Artificial Intelligence in the Classroom and the Office of the Provost’s page on Generative Tools in AI Coursework, which includes Suggested Course Syllabus Statements.
In addition, see the information and guidelines we offer to students regarding the use of AI tools in writing assignments.
Sid Dobrin, Talking about Generative AI: A Guide for Educators
The resources on this page are being updated regularly. To suggest additional resources, please email writingcenter@utk.edu.
Credit to Stella Takvoryan for her contribution to the summaries in this collection. Photo credit to Andras Vas on Unsplash.