
Artificial Intelligence (AI) in Higher Education


What is Artificial Intelligence?

[Figure: A simple flow diagram showing a three-step writing process (Draft, Review, Revise) followed by a repeat symbol from musical notation. Credit: Bill Hart-Davidson]

Artificial intelligence (AI) encompasses a variety of computer-based tools that source existing data to solve a problem, such as search tools that rely on algorithms to find information or language translation tools (McCarthy, 2007). Generative artificial intelligence refers to a suite of tools that source existing data to create new artifacts in response to user prompts (Goodfellow et al., 2020). For example, ChatGPT is a large language model (LLM) and conversational generative AI that is built on many existing texts. In response to prompts from users, ChatGPT generates text that mimics the writing of humans through a process of statistical correlation. The Modern Language Association and Conference on College Composition and Communication's working paper explains how LLMs and ChatGPT work in greater detail.
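
To make "statistical correlation" a little more concrete, here is a minimal sketch of the underlying idea using a toy word-frequency (bigram) model in Python. This is an illustration of our own, not how ChatGPT or any production LLM is actually built; real models use neural networks trained on enormous corpora, but the core move of predicting a statistically likely next word from patterns in existing text is similar.

```python
import random
from collections import defaultdict, Counter

# Toy "training data": a tiny invented corpus standing in for the vast body
# of existing text a real model is trained on.
corpus = (
    "students draft an essay then review the essay then revise the essay "
    "instructors review the essay and students revise the essay again"
).split()

# Count which words tend to follow each word (a simple bigram model).
next_word_counts = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    next_word_counts[current][nxt] += 1

def generate(start: str, length: int = 8) -> str:
    """Extend a prompt by repeatedly sampling a statistically likely next word."""
    words = [start]
    for _ in range(length):
        candidates = next_word_counts.get(words[-1])
        if not candidates:
            break  # no observed continuation for this word
        choices, weights = zip(*candidates.items())
        words.append(random.choices(choices, weights=weights, k=1)[0])
    return " ".join(words)

print(generate("students"))
# Possible output: "students draft an essay then review the essay and"
```

Scaled up by many orders of magnitude, this kind of pattern-based prediction is why generated text reads fluently while remaining anchored to the data the model was trained on.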

Generative AI is an emergent and rapidly evolving space. New tools, or updates to existing tools, are released frequently, but a few examples help illustrate what these tools do:

  • ChatGPT (where GPT stands for “generative pretrained transformer”) sources a broad range of texts and other data in order to respond to inquiries with customized text. It can also generate basic code.
  • Claude is an AI assistant from Anthropic with multiple capabilities.
  • Microsoft Copilot is a free service that provides access to GPT-4 and DALL-E 3. DePaul faculty, staff, and students can access Microsoft Copilot using their DePaul credentials.
  • Microsoft’s Bing search engine has a ChatGPT-based integration.
  • Gemini is Google’s version of a generative text AI. 
  • DALL-E and Stable Diffusion pull from a broad range of image data in order to produce customized images in response to prompts from users.
  • Midjourney is a research lab that has created another image-generation AI. The platform is integrated with the Discord chat application.
  • Lensa is an AI that edits and transforms existing photos.
Many more generative AI tools exist. If you’re looking for a specific generative AI tool, the resources in Stay Current on Higher Education Generative AI Trends are a great place to find the latest recommendations, and Harvard's AI Pedagogy Project contains resources for learning more about how AI tools work. 

DePaul's Approach to AI

At DePaul, we’re encouraged to learn more about emerging AI technologies and to help students learn to use them responsibly.

As noted in DePaul's Approach to Artificial Intelligence, “Universities have a special challenge: on the one hand, we need to prepare our students for a world of work in which AI will certainly play a part, but on the other hand, we want our students to understand and practice integrity in the use of any sources, including those generated by Artificial Intelligence.” DePaul faculty and staff have been charged “to create conditions for members of the University community to learn about both the benefits and dangers of AI and act responsibly.”

As we continue learning about generative AI tools and considering our responses, the resources in this guide will help you to acknowledge the tools with students and consider how they might impact your teaching right now.

AI Teaching Recommendations

Masland (2023) describes a range of ongoing responses to emerging AI technologies in higher education. In doing so, Masland encourages instructors to reflect on the following questions as they consider the best ways to integrate AI into their teaching:

  • What do I want to model to my students about this technology?
  • What additional labor am I willing to invite into my teaching?
  • How can I ensure that my response to this threat isn't bigger than the threat itself?
  • What decisions can I make that will maximize my students’ success and my own enjoyment of this facet of my career?

Draft a syllabus statement and discuss academic integrity with students.

A syllabus statement and other classroom policies will help you communicate and enforce your expectations for AI use in your classroom.

  1. Familiarize yourself with DePaul’s Academic Integrity Policy and any department-, college-, and field-specific generative AI guidance and approaches.
  2. Detail your expectations for students with a syllabus statement.
  3. Workshop your statement with colleagues and review how others have approached syllabus statement development.
  4. Use your syllabus statement as a starting point for co-creating specific guidelines for responsibly using AI tools with students.
  5. Reflect on your plan for approaching students if you suspect they have used generative AI inappropriately and in violation of your classroom policies. Provide students with resources for navigating generative AI and academic integrity.

Syllabus Statement Drafting

An AI syllabus statement should help you and your students identify appropriate AI use for the class. The following questions are designed to help you reflect on and describe appropriate AI use in your classes, and were generated from a review of syllabus statements from many institutions curated by Lance Eaton.

  • What role does generative AI play in your course or field? What do students need to know about the possibilities of AI tools?
  • Is the use of AI tools appropriate in this class? If so, are there limitations based on tool, assignment, stage of the project, type of prompt, percentage of original to AI-generated content, etc.? Are there specific examples of what’s appropriate (e.g., brainstorming and idea generation) and what’s not (e.g., analytical or reflective work, checking grammar, or generating sentence structure options)?
  • Is the use of AI tools appropriate for some assignments and not for others? How will students know when it’s appropriate to use AI tools?
  • If AI usage is appropriate for an assignment, how should students indicate that they’ve used AI? What are the citation expectations?
  • Will you be using an AI detection tool to look for AI usage in student work? If an AI detection tool indicates that a student has used an AI tool, what are your steps for adjudication?
  • What are the possible penalties for students who violate this AI usage policy? What is the relationship between the AI policy and the DePaul Academic Integrity policy?
  • How should students consider your AI policy when navigating their other courses and coursework?
  • How will you use AI tools to support the course?
  • What sources did you reference in creating your AI policy?

Everyone is still learning about generative AI tools. Providing students with some information about the limitations of AI tools will help them to make informed decisions about how they use them.

  • AI tools can generate inaccurate information or hallucinate, making up sources and information.
  • AIs are trained on existing information and therefore are limited in supporting future-oriented thinking.
  • AIs are trained on existing information that contains bias and therefore may perpetuate those biases.
  • Using an AI gives it permission to integrate your prompts into its learning; you’re providing free labor in exchange for the use of the tool, and you can’t fully control how your work will be used in the tool’s “learning.”

Syllabus Statement Examples

These examples have been written with DePaul’s approach to AI in mind. Please note some departments and colleges may have their own AI policies that must be used verbatim or considered when drafting AI statements.

Students are allowed to use generative AI tools, such as ChatGPT or DALL-E 2, on specific assignments in this course. Each assignment sheet will indicate if AI use is permitted. When permitted, students must document and credit the AI tool. For example, paraphrased text generated using ChatGPT should include a citation according to APA, MLA, or Chicago Style generative AI citation guidelines.

Material generated using other AI tools should follow a similar citation convention.

The use of generative AI tools is permitted in this course for the following activities:

  • Brainstorming ideas
  • Fine-tuning your research questions
  • Exploring what you need to learn about your topics
  • Drafting outlines
  • Checking grammar

If you use generative AI in any of the above ways, please describe the tools you used, and how you used them, in a paragraph at the end of your discussion post or essay. Please include the prompts you provided to the generative AI tools.

The use of generative AI tools is prohibited in this course for the following assignments and activities:

  • Composing discussion board posts and responses
  • Writing reflections, including exit tickets and process logs
  • Writing drafts of a writing assignment
  • Writing paragraphs used to complete any assignments

If you are unsure about whether or not a specific tool or use of AI is permitted, please contact me. Using generative AI in a way that is not permitted is considered a violation of DePaul’s Academic Integrity Policy.

Generative AI tools are trained on existing texts, algorithms, and models to generate content like writing, images, and videos based on prompts from users. ChatGPT, Midjourney, Google Bard, and DALL-E are examples of generative AI tools. Please refrain from using generative AI in this course for any purpose. We will be developing skills that are important to practice on your own and using generative AI may inhibit development, practice, or understanding of those skills. During class, we will discuss how and why generative AI may disrupt your learning.

If you’re unsure if a specific tool makes use of AI, or if a specific tool is permitted for use on assignments in this course, please contact me. Attempting to pass off AI-generated work as your own will violate DePaul’s Academic Integrity Policy.

In this course, we will spend some time analyzing how AI impacts our work as writers. We will think critically about the best and worst ways to use AI in our University writing and the writing we do outside of the University. I will use AI to brainstorm assignment and activity ideas and to generate writing that we can analyze and provide feedback on as a class.

For your course work, we will treat AI-based assistance, such as the use of ChatGPT and Google Bard, the same way we treat collaboration with other people: you are welcome to talk about your ideas and work with other people, both inside and outside the class, as well as with AI-based tools. You can use these resources for every assignment in our course unless otherwise noted in the assignment instructions. However, all work you submit must be your own. Avoid hitting “Copy” within your conversation with an AI tool. Do not have your assignment and the AI tool open at the same time to limit the urge to copy the responses. Use your conversation with the AI as a learning experience, then close the interaction down, open your assignment, and let your assignment reflect your revised knowledge.

You should never include in your assignment anything that was not written directly by you without proper citation. In this course, we will predominantly use MLA citation style. Use the generative AI MLA citation guidelines any time you paraphrase, quote, or incorporate into your own work any content (whether text, image, data, or other) that was created by AI. Please note that the MLA guidelines also recommend adding notes to your work where you explain in greater detail how you used AI. Throughout the quarter, I will also ask you to reflect on your AI use in class discussions and process-based reflective writing, and will ask you to provide direct links to your chats with ChatGPT or Google Bard.

AI tools can generate inaccurate information, even to the point of making up sources. AI tools are trained on existing information and therefore are limited in supporting future-oriented thinking. AI tools are trained on existing information that contains bias. Therefore, they may perpetuate those biases. Using an AI tool gives it permission to integrate your prompts into its learning; essentially, you’re providing free labor in exchange for the use of the tool, and you can’t always control how your work will be used to inform future iterations of the tools.

I will read your work throughout the quarter and learn more about your writing style. During class, we will also talk about the features and limitations of AI-generated writing. Throughout the quarter, I will see how AI responds to our assignment prompts. If I suspect that you have submitted work that is not your own without citation, including work generated by AI, I will set a time to meet with you. If you blatantly violate DePaul’s Academic Integrity Policy and the citation guidelines in our syllabus, I will report an academic integrity violation.

While all DePaul courses are guided by DePaul’s Academic Integrity Policy, professors in other courses may set different guidelines for using AI assistance. Please check with your professors before using AI assistance to support your course work.

We’re currently in a time of transition and learning with generative AI, the new-ish tools that respond to prompts and create new texts, code, images, and other materials that are grounded in existing human-created work. Here’s how I’d like us to think about, and learn about, these tools together:

In this course, you’re welcome to treat generative AIs like ChatGPT, Google Bard, Microsoft Bing, Anthropic Claude, DALL-E, etc., as you would a human resource: You can bounce ideas off of the AI. You can share some of your writing with the AI and ask for grammatical or syntactical advice (as long as you’re comfortable with your work becoming part of the AI’s training, with no compensation for you!). You can have a conversation with the AI as you’re planning a draft.

But, my expectation is that the work you submit in class is primarily authored by you. A key learning path in our class involves practicing different writing skills, and neither of us will know what you’re learning if you’re relying on AI to produce the text that represents your thinking.

Human authorship is especially important in creative and reflective writing, and we’ll be doing both in this class! You’ll miss out on key learning if you’re relying on AI for the creation of these types of texts. In each assignment description, I’ve included explicit expectations on how you may or may not use AI for that assignment.

If I think an assignment might have a bit too much AI intervention, my first step will be to have a conversation with you, so that we can understand your composition process and work together to align expectations. However, if this becomes a pattern, I’ll be relying on DePaul’s Academic Integrity Policy (also detailed below) for next steps.

The screenwriting program at DePaul is committed to fostering the unique voices and perspectives of up-and-coming screenwriters, as evidenced by our motto: “learn your craft, find your voice, make your mark.” Writers should be aware that AI (generative artificial intelligence) is a tool and only a tool. It does not create; it aggregates and repurposes previously published content. As such, AI may facilitate the story development process, but cannot replace what each individual writer brings to a story: their distinctive point of view and voice. We understand that students may wish to use AI tools to aid in the development and early drafts of screenplays and screenwriting assignments. To do so with academic integrity, students must understand and adhere to the following:

  • Developing ideas, creative problem-solving, honing one’s voice, and engaging in rewriting are at the core of the writing process, and the use of AI is not a replacement for enhancing these fundamental skills.
  • Acknowledge your use of AI. For any assignment that you complete using AI, you must include a paragraph that explains 1) what AI tool you used, 2) what prompts you used, and 3) a clear explanation of how it helped you engage in the writing process.
  • AI-assisted work may be accepted as part of the development of any project (including story ideation, character development, world-building, basic outlining, etc.), but the entirety of your screenplay/writing assignment must be authored by you.


Look for opportunities to check in with students’ processes and learning strategies.

Use your assignment prompts to experiment with generative AI tools.

Discuss the possibilities and limitations of generative AI tools with students.

Use the “Curated Readings and Podcasts About AI” to guide some of your discussion. Consider how your field or discipline impacts the conversation. 

"Get Up to Speed with AI" Checklist

Joe Veverka, an instructor in DePaul’s Department of Marketing, developed a “Get Up to Speed with AI” checklist that combines recommendations for how instructors can familiarize themselves with generative AI and suggestions for intentionally integrating generative AI in activities and assignments with students. The checklist is divided into three phases: 

  • Instead of using a search engine, use an AI. Ask AI to write first drafts of emails, proposals, memos, etc. – any “for work” writing that you think might be a good candidate.
  • Ask AI to do some life tasks for you: find a recipe that works for ingredients you currently have in your home; plan a trip itinerary; create a to-do list; plan an event; etc.
  • Anytime you write anything or start a new task, ask yourself, “Could AI do this for me?” Take the time to pause and commit to shifting your mindset to this new behavior to gain efficiency over time. It’s a great new habit to get into and a good way to learn how to use the technology.

Before you use any generative AI service, please review the terms of use to ensure you understand exactly how your data will be used. Your data may be used to train or refine models or to steer responses toward human preferences. As a best practice, never provide a chat service with DePaul proprietary information or intellectual property unless the service is designed specifically for enterprise use with the necessary protections.

  • Enter your assignment prompts to see what the AI generates.
  • Using that baseline information, create an anticipated prompting path that students might use for an assignment. Plan to share your most successful prompts with your students to help get them started.
    • Remember, using generative AI is like having a spoken conversation with another person: you need to provide context in your prompt(s) to make the conversation more effective. Be willing and prepared to have a chat that is 5-10 prompts long, building in detail along the way to get to the outcome you want.
  • Evaluate the process yourself and determine if you want to integrate this intentionally in your course.
    • Your students may already be familiar with using Generative AI, so be sure to take into account varying levels of familiarity if you choose to add it to your course.

Another caution: requiring students to use AI is tricky. There are many free versions, but there are also paid versions, so there are potential equity issues to consider. Additionally, take into account that some legacy models of ChatGPT are constrained to information from before September 2021, while AI-powered search engines like Bing Chat and Google Bard have access to current, real-time information. Ensure you are guiding students to the experience that is most relevant for the coursework you are assigning.

  • Be prepared to discuss the assignment with your students, as well as any student concerns related to privacy of the information they put into the AI. Encourage your students to review the terms of use prior to using any tool.
  • Ask students to provide a citation as they normally would with referencing any material on the Internet. In some instances, links to conversation chats can be provided so you can directly access the conversation.
  • Ask students to write a summary of their prompts and how the AI outputs supported or detracted from their working process.
  • Ask students to reflect further on their experience:
    • What worked?
    • What didn’t work?
    • How did this experience impact your perception of AI?
    • Did using AI for parts of the project help you create more sophisticated or complex work?
    • Did using AI hinder you from exploring paths not presented by the AI?

Generative AI Assignment and Activity Ideas

See Stay Current on Higher Education Generative AI Trends for additional ideas.

Generative AI in Higher Education Lesson Plan

This pre-packaged 30-minute lesson plan is designed to help you begin a conversation about AI with your students. The lesson, developed by DePaul University Senior Instructional Designer Jes Klass, is divided into the following parts:

  1. Lecture notes: What is Generative AI? 
  2. Lecture notes: Why did our relationship with AI change drastically in late 2022? 
  3. Small group activity: How can we work with AI to make our lives easier? (5 minutes)
  4. Whole-class activity debrief (7-10 minutes)
  5. Small group activity: What uses of generative AI in higher education are inappropriate? (5 minutes)
  6. Whole-class activity debrief and drafting of class charter (7-10 minutes) 

The following resources map the lesson parts and are meant to be copied and adapted to suit your context.

Lesson Plan Slide Deck

Lesson Plan Instructor Guide

AI Detectors

A number of AI detection tools are available that are designed to identify which text might have been generated by AI. Educators experimenting with detection tools are discovering that they sometimes incorrectly identify student writing as AI-generated. Detection tools may also fail to detect AI-generated writing, especially in cases where a tool was built to detect content generated only by a specific AI system.

For these reasons, we recommend that you use AI detection tools with caution and consider the false positive rates. If you’re using an AI detection tool with student writing, use the results only to start a conversation with the student. Review the work further and be open to the possibility that the tool’s detection results were incorrect.
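
To see why false positive rates deserve attention, consider a brief back-of-the-envelope calculation. The numbers below are hypothetical and do not describe any particular detector; they simply illustrate how even a low error rate can add up across a course.

```python
# Hypothetical illustration only: neither number reflects a specific
# detector's published accuracy or a real course.
submissions = 300           # e.g., 100 students submitting 3 assignments each
false_positive_rate = 0.01  # assume 1% of human-written work is wrongly flagged

expected_false_flags = submissions * false_positive_rate
print(f"Honest submissions expected to be flagged as AI-generated: {expected_false_flags:.0f}")
# Honest submissions expected to be flagged as AI-generated: 3
```

Even under these assumptions, a handful of honest students could be flagged each quarter, which is why a detection result should open a conversation rather than settle the question.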

Turnitin has developed resources to help instructors design assignment prompts to help avoid AI misuse, including an AI Misuse Checklist and Rubric. See “Draft a syllabus statement and discuss academic integrity with students” for additional ideas for how to consider AI misuse in the classroom.

Turnitin’s AI Detection Tool

Turnitin's AI detection tool is integrated with Originality Check in D2L Submission folders. In September 2023, the DePaul Teaching, Learning, and Technology (TLT) Committee reviewed Turnitin’s AI detection tool and determined that it shouldn’t be used until the University had more time to evaluate the tool. Based on this recommendation, as of September 28, 2023, AI detection was turned off for all DePaul Turnitin accounts. 

Discussion of the pros and cons of this tool continued at subsequent TLT meetings.  With input from faculty representatives on TLT and information gathered from their respective units, the decision was made to turn the tool back on in January 2024, prior to the beginning of Winter Quarter. 

Faculty are rightly concerned about how to know whether or not a student has used AI for an assignment, and a tool to help work through that circumstance can be useful. While Turnitin's AI Detector will be an available tool for faculty to have in their toolbox, there is no one-size-fits-all solution to address the complexities of academic misconduct. Please read about how Turnitin's AI Detector works and keep its limitations in mind when you use it.

As the Provost noted in her message about AI at DePaul, faculty are encouraged to approach the unique challenge our university community faces by learning about the affordances and limitations of AI tools, discussing the impact of AI on your discipline and curriculum with colleagues, and evaluating how you’ll approach AI in your classes by developing syllabus statements that set expectations and reviewing your assignments for AI impact. This guide is a resource to support these efforts.

Please stay tuned to the Teaching Commons Events page for future programming offerings. If your unit is planning to discuss how AI is impacting your work, a member of our team would be happy to join you and support that conversation. Please email facultydevelopment@depaul.edu.

If there are questions or concerns about this decision, please contact TLT@depaul.edu or your faculty representative to this committee.

The following are great resources for engaging with higher education’s ongoing responses to generative AI:

Curated Readings and Podcasts About AI

The following texts represent some of the recent conversations about generative AI. DePaul Driehaus College of Business Online Learning Director James Moore also maintains a curated list of texts that address AI.

“The Brilliance and Weirdness of ChatGPT,” Kevin Roose, New York Times (12/5/22)

Roose provides an overview of how ChatGPT works as an A.I. chatbot, including some screenshots that demonstrate the interface.

“The College Essay is Dead,” Stephen Marche, The Atlantic (12/6/22)

This alarmist piece is referenced in many of the other articles. Eliciting panic can be effective! Useful to read as part of the initial discourse.

“ChatGPT is Dumber than you Think,” Ian Bogost, The Atlantic (12/7/22)

Bogost demonstrates some limitations of ChatGPT and argues “ChatGPT isn’t a step along the path to an artificial general intelligence that understands all human knowledge and texts; it’s merely an instrument for playing with all that knowledge and all those texts.”

“GPT and a New Generation of AI for Education,” Tony Wan, Reach Capital Blog (12/10/22)

Tony Wan discusses the emerging field of AI tools in use for education beyond ChatGPT, including platforms that can help with online language learning, and more collaborative approaches to using AI technology in education.

“Blueprint for an AI Bill of Rights”  The Office of Science and Technology Policy, The White House

In this white paper, the Office of Science and Technology Policy (OSTP) provides  a “set of five principles and associated practices to help guide the design, use, and deployment of automated systems to protect the rights of the American public in the age of artificial intelligence.” 

“A majority of Americans have heard of ChatGPT, but few have tried it themselves” Emily A. Vogels, Pew Research Center (6/24/2023) 

Vogels discusses the results of a March Pew Research Center survey. The survey results indicate that 58% of U.S. adults are familiar with ChatGPT, but few have tried it. 

“Beyond the ‘Matrix’ Theory of the Mind” Ezra Klein, The New York Times (5/28/2023) 

Klein examines how the internet and AI affect human cognition and productivity. Klein calls for increased attention to the way AI, and the jobs and systems that use AI, can be built to enhance human intelligence.

“How Easy Is It to Fool A.I.-Detection Tools?” Stuart A. Thompson and Tiffany Hsu, The New York Times (6/28/2023)

Thompson and Hsu explore multiple AI image generators and AI image detectors, explain how the image detectors are designed to work, and provide many examples. They begin to discuss the implications of these tools and their successes and failures.

“AI and the Future of Undergraduate Writing”, Beth McMurtrie, Chronicle of Higher Ed (12/13/22)

McMurtrie provides a near-immediate, yet thoughtful, academically framed response to the release of ChatGPT: adjusting learning processes, incorporating generative AI in curricula, and rethinking assessment.

“How About We Put Learning at the Center,” John Warner, Inside Higher Ed (1/4/23)

Warner argues ChatGPT has created an opportunity to examine how we value learning and how we create experiences that help students learn.

“Academic Experts Offer Potential Advice on ChatGPT,” Susan D’Agostino, Inside Higher Ed (1/12/23)

Eleven academics share their advice for approaching ChatGPT. Suggestions include “Think a Few Years Out,” “Invite Students Into the Conversation,” and “Experiment. Don’t Panic.”

“Why I’m Not Scared of ChatGPT,” Christopher Grobe, Chronicle of Higher Ed (2/3/23 issue)

This detailed walkthrough of how Grobe incorporated ChatGPT into his course shows the affordances and limitations of the tool.

“GPT-4 Is Here. But Most Faculty Lack AI Policies.” Susan D'Agostino, Inside Higher Ed, 3/22/2023

D'Agostino maps some of the ways GPT-4 compares to GPT-3.5. D'Agostino describes some recent thinking about generative AI tools and their place in education. 

"A Brief History of Academic Integrity Panics about Disruptive Technology", Alex Joppie, IDD Blog (3/31/23) 

Joppie, a DePaul University Senior Instructional Designer, explores how education has encountered and integrated new technologies in the past and recommends steps for considering generative AI in teaching contexts.

“Teaching: Are professors ready for AI?” Beth McMurtrie, Chronicle of Higher Ed (5/25/2023)

In this newsletter, McMurtrie discusses the challenges and considerations surrounding the use of AI in grading and the potential impact on faculty and students. McMurtrie explores the ethical, pedagogical, and legal aspects of AI tools and emphasizes the need for guidelines, transparency, and awareness of the limitations of AI in teaching.

“Professors Plan Summer AI Upskilling, With or Without Support” Susan D'Agostino, Inside Higher Ed (5/26/2023)

D'Agostino describes how faculty members are seeking to upskill in AI for teaching and learning during the summer and maps some current grassroots and institutional efforts to support faculty. D’Agostino advocates for more comprehensive and widespread support to help faculty navigate the AI landscape in education effectively.

“How AI Tools Both Help and Hinder Equity” Susan D'Agostino, Inside Higher Ed, (6/5/2023)

D'Agostino explores how AI tools in education can assist disadvantaged students but can also widen the education gap. D’Agostino maps concerns related to biases and unequal access as well as some of the efforts to address and anticipate those concerns.

"AI can do your homework. Now what?" Vox, (12/12/2023) 

Vox interviews students and instructors about their experiences with generative AI chatbots. 

“My class required AI. Here’s what I’ve learned so far” Ethan Mollick, One Useful Thing (2/17/2023)

Ethan Mollick focuses on the importance of learning to use AI correctly, by learning to collaborate with it as opposed to expecting it to do the work for you.

“ChatGPT as an Assistive Technology” Maggie Melo, Inside Higher Ed (3/1/2023) 

Melo explains how ChatGPT can be used as an assistive technology for students and instructors with ADHD.

“ChatGPT Just Got Better. What Does That Mean for Our Writing Assignments?” Anna R. Mills, The Chronicle of Higher Education (3/23/2023) 

Mills compares GPT-4 to GPT-3.5 and provides some updated recommendations for assignment design. Mills explains why AI detection tools are unreliable. 

“Teaching: What You Can Learn From Students About ChatGPT” Beth McMurtrie, The Chronicle of Higher Education (3/30/2023)

McMurtrie explores two ways professors and teaching experts have involved students in research and policymaking related to AI tools. 

“Can Turnitin Cure Higher Ed’s AI Fever?” Liam Knox, Inside Higher Ed (04/03/2023)

Knox anticipates the launch of Turnitin’s AI detection tool. Knox identifies some concerns associated with the tool, including the impact of the fast rollout and questions about the reported accuracy rates.

"Shifting Tides with AI in Higher Education" Jes Klass, IDD Blog (5/23/2023)

Klass, a DePaul University Senior Instructional Designer, encourages professors to consider the guidelines they provide for AI use in their classes, provides suggestions for guiding students through using AI in a sustainable and learning-focused way, and argues it is time for everyone to start working AI into their course materials.

“How ChatGPT Could Help or Hurt Students With Disabilities” Beth McMurtrie, The Chronicle of Higher Education (5/26/2023) 

McMurtrie examines how AI tools like ChatGPT can impact students with disabilities. McMurtrie emphasizes the need to involve students with disabilities in policy development and to educate all students about the limitations of AI.

“4 Steps to Help You Plan for ChatGPT in Your Classroom” Flower Darby, The Chronicle of Higher Education (6/27/2023)

Darby maps out four strategies to help all instructors navigate ChatGPT in their classes. Darby’s recommendations are geared toward all instructors, including those who have resisted addressing ChatGPT until now.

“Get the Best From ChatGPT With These Golden Prompts” Brian X. Chen, The New York Times (5/25/2023)

Chen provides some strategies for most effectively using generative AI, including strategies for considering safety and responsibility while using generative AI. This resource could be a good starting point for students beginning to experiment with generative AI and explore prompt engineering.

“Your Guide to Communicating with Artificial Intelligence” from Learn Prompting

A free and open source resource for learning how to engineer prompts. The resources are organized based on skill level.

“Prompt engineering for educators – making generative AI work for you” Danny Liu, Teaching@Sydney (4/27/2023)

Liu provides examples of how instructors might use generative AI and provides guidelines for effective prompt engineering.

“A new AI chatbot might do your homework for you. But it's still not an A+ student,” Emma Bowman, NPR KQED (12/19/22)

A brief look at the trustworthiness of the content produced by generative AI.

“ChatGPT Goes to Law School,” White Paper (1/23/23)

Professors at the University of Minnesota Law School used ChatGPT to generate answers on a small set of blindly graded exams. ChatGPT performed, on average, at a C+ level. The professors describe their methods and results and discuss the implications for law education.

“Would ChatGPT Get a Wharton MBA?,” Christian Terwiesch, Mack Institute for Innovation Management at the Wharton School (January 2023)

Terwiesch, an operations management professor, asked ChatGPT several final exam questions. He found that it performed well on general questions, made some glaring math errors, and struggled to produce more complex analytical responses. It earned a B/B+ on the exam.

“GPT-4 Technical Report,” OpenAI (3/27/2023) 

This technical report from OpenAI explores how GPT-4 performed on a series of standardized tests. For a summary of some of the results, see “OpenAI's GPT-4 aced all of its exams – except for these.”

“Sam Altman and the AI Revolution, Trillionaires, and the Future of Political Power,” Ezra Klein and Sam Altman, The Ezra Klein Show (6/11/2021)

Klein’s conversation with Altman from 2021 is quite broad, but it’s helpful to hear from the co-founder and CEO of OpenAI (the source of ChatGPT and DALL-E) to understand his operating principles.

“How AI Could Change Apple and Google, Writing and Music, and Everything Else,” Derek Thompson and Ben Thompson, Plain English (1/27/23)

Thompson and Thompson offer a wide-angle conversation that situates generative AI within its potential technological, economic, and societal impacts.

“Bard Fork, and How to Talk So Chatbots Will Listen,” Kevin Roose and Casey Newton, Hard Fork (3/24/23) 

Roose and Newton explore some possibilities associated with Google’s Bard, provide specific examples of how people are using ChatGPT in their everyday lives, and explore Spotify’s AI D.J.

“Rethinking Teaching in an Age of AI with James M. Lang and Michelle D. Miller” Derek Bruff, Intentional Teaching (6/23/23) 

Bruff, Lang, and Miller discuss generative AI and how instructors might respond to generative AI tools. Bruff points listeners to Lang’s Cheating Lessons and Miller’s Remembering and Forgetting in the Age of Technology as resources for considering how generative AI tools best fit in higher education. 

“Suspicion, Cheating and Bans: A.I. Hits America’s Schools” Stella Tan, The Daily (6/28/23)

Teachers and students reflect on their first semester with ChatGPT and how it is changing the classroom.

Events and Additional Resources

See Events for opportunities to learn more about AI. Email facultydevelopment@depaul.edu to suggest additions to this guide.