ChatGPT and other AI tools have generated controversy from the moment they launched. Responses have ranged from school bans, to professors incorporating AI as a tool in their assignments, to online publications using AI to create content.
This guide helps students understand what artificial intelligence (AI) text generators and art generators are, how to use them, and when not to use them.
The content of this guide is subject to change as AI-generative tools and policies about their use evolve. Before you use an AI tool, consult your course syllabi and ask your professors whether and how these tools may be used in your classes.
Because text-generating AI is in its infancy, developers are still working to resolve significant issues. For example, ChatGPT has limited knowledge of world events that occur after September 2021. AI also sometimes fabricates information and creates fictitious references to support its claims. Another major concern is bias related to gender, race, ethnicity, and disability status. Lima and DiMolfetta (2023) write, "Lawmakers said they plan to explore how tools like ChatGPT can both inadvertently produce misinformation, including a phenomenon known as hallucination where AI concocts a false fact seemingly out of thin air, as well as how they can power disinformation, such as via deepfakes" (n.p.).
Faculty and students should use ChatGPT as an idea generator rather than as a tool for finding facts from reputable sources. For example, while AI can create lesson plans, it cannot tailor lessons to specific classes or individual students as well as a human teacher can. AI can also help faculty with grading, but "the AI may get it right 9 out of 10 times, requiring the teacher to personally review each piece of feedback" (Will, 2023).