First of all...
There is no such thing as an "AI-proof" assignment,
BUT
you can design assignments that will mitigate the use of AI.
First, it is important to know the limitations of large language models like ChatGPT and understand that these limitations are constantly changing as AI gets better at what it does.
ChatGPT’s free model (3.5), for instance, will tell you itself that it has been trained on limited data (up until 2021), which limits its knowledge base. Furthermore, the large data sets it is trained on are not comprehensive and may contain biases. It has also been trained primarily on English text and from a Eurocentric point of view, so it may not be reliable or relevant in response to questions about other languages or cultures. Large language models are fallible and can produce inaccurate or fabricated information (these errors are called hallucinations). Finally, ChatGPT is impersonal; it cannot offer personal opinions or emotions.
So what does all that mean?
In terms of our classrooms, that means that ChatGPT will have difficulty analyzing case studies, a variety of sources, or a problem based on current events and recent data (at least for now). It also can’t provide reliable citations – it will even make up non-existent citations (these make for great learning moments and emphasize to students the importance of having a thorough critiquing process). So, for example, developing assessments related to readings from the current week and, when possible, leveraging very recent current events can help mitigate AI use.
Let’s dig a little more deeply into steps we can take to mitigate the use of AI in our assessments.
Assessment makeover
Derek Bruff, in his blog post Assignment Makeovers in the AI Age: Essay Edition, outlines six questions to help think through an assignment makeover:
- Why does this assignment make sense for this course?
- What are specific learning objectives for this assignment?
- How might students use AI tools on this assignment?
- How might AI undercut the goals of this assignment? How could you mitigate this?
- How might AI enhance the assignment? Where would students need help figuring that out?
- Focus on the process. How could you make the assignment more meaningful for students or support them more in the work?
Derek’s questions can help us focus on the meat of any assessment and pick out what is important. They also get us thinking about how and why students might use AI on any given assignment. These initial ponderings can really help when it comes to deciding whether an assignment needs tweaking or needs to be completely reworked.

Check the learning goals
As suggested above, a good starting point is looking at the learning goals and how the assessment aligns with these goals. Learning goals that emphasize the process of learning rather than the product of learning help us create assignments that ask students to engage in more complex and higher-order thinking. This could look like focusing on the importance of each step in the process (ideation and brainstorming, outlining, identifying sources, drafting and revising, etc.) rather than the finished product.
In Teaching in the Artificial Intelligence Age of ChatGPT, the Teaching and Learning Lab summarized a presentation by Derek Bruff in which “Bruff offered an example of using Wolfram Alpha in his linear algebra class. He persuasively argued that by allowing students to use the software to row-reduce large matrices (after ensuring that students engage in and grapple with the row-reduction process “by hand”), students can model and solve more interesting, challenging, and authentic problems. i.e., the use of the software allows students to focus on the more meaningful and relevant aspects of problems and problem solving, rather than the time-consuming and error-prone row-reduction process.”
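If “row reduction” is unfamiliar, it simply means transforming a matrix (for example, the augmented matrix of a system of linear equations) into reduced row echelon form. Here is a minimal sketch of what handing that mechanical step to software can look like, using Python’s SymPy as a stand-in for the Wolfram Alpha workflow Bruff describes and a small made-up system as the example:

```python
from sympy import Matrix

# Augmented matrix for a hypothetical system:
#   x + 2y +  z = 4
#  2x +  y -  z = 1
#   x -  y + 2z = 3
A = Matrix([
    [1,  2,  1, 4],
    [2,  1, -1, 1],
    [1, -1,  2, 3],
])

# rref() returns the reduced row echelon form plus the pivot columns,
# taking the tedious, error-prone hand arithmetic off the student's plate
# so attention can go toward setting up and interpreting the model.
rref_matrix, pivots = A.rref()
print(rref_matrix)
print(pivots)
```

The point is not the particular tool; once students have wrestled with the procedure by hand, offloading it lets them tackle larger, more authentic systems than they could reasonably solve on paper.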
Get specific
Another mitigation technique is getting into specifics that apps like ChatGPT are not good at, for example:
- Ask students to connect assignments to specific readings from class, or use a point raised in discussion to defend or critique a stance.
- Incorporate a reflective component that asks students to explain how the assignment relates to their personal beliefs.
- Explore current or local problems that AI doesn’t have access to, using examples drawn from your own school community.
Less content, more application
Inara Scott, Senior Associate Dean and Gomo Family Professor at Oregon State University College of Business and Editor in Chief of the American Business Law Journal, writes that “Research suggests students learn the content better when less of it is covered and more time is spent engaging students in active learning. If we spend our time covering content and not focusing on application, we are making our students irrelevant to the professional world they are entering” (The “Less Content, More Application” Challenge).
An example of this type of assessment might be tasks that require students to collect, analyze, or present data, evidence, or documentation that are difficult for AI to fake or generate. For example, asking students to design and conduct an experiment, survey, or interview, and report their findings and conclusions.
Or consider tasks that ask students to demonstrate evaluative judgement by comparing, critiquing, or applying different concepts, methods, or solutions. For example, asking students to evaluate the strengths and weaknesses of different approaches to a problem, or to apply their knowledge to a real-world scenario or case study.
Process
So, there has been a lot of talk about process over product. It’s important to understand that scaffolding assignments into smaller checkpoints can deter—though not prevent—unauthorized use of generative AI. The beauty of focusing on these checkpoints is that it allows you to give feedback throughout the process and see how students are developing ideas and changing their thinking over time. It also invites metacognition; a final portfolio, for example, lets students create a narrative about the work they have completed from start to finish and explain their choices.
Other ways to demonstrate process
Of course, there are many ways to demonstrate process beyond brainstorming, rough draft, and final product. Consider a few of these ideas:
- Outlines or proposals
- Elevator pitches
- Annotated bibliographies
- Drafts and feedback
- Feedback cycles that require peer, self, and teacher feedback
Go multimodal
Another fundamental shift we can make is to stop relying on written products and go multimodal. Multimodal writing refers to different modes of communication and composition – spatial, gestural, alphabetic, visual, aural, and haptic – and it offers us affordances that strictly alphabetic writing doesn’t. In fact, a focus on the alphabetic sends the message that it is better or more important than the other modalities and perpetuates a detrimental view that limits and harms our students from diverse cultures (see Zaretta Hammond’s Culturally Responsive Teaching and the Brain).

Multimodal assessments ask students to work effectively and critically through multiple modes of presenting information. Here are a few examples:
- video or graphical elements
- portfolios
- websites
- blogs/vlogs
- podcasts
- infographics
- app development
- social media
- research posters
You can learn more about multimodal assessments in this article from the University of Notre Dame, Incorporating Multimodal Assessment into Your Course.
Promote metacognition
Incorporating tasks that ask students to reflect on and analyze their own learning creates an opportunity for them to demonstrate mastery and enhances overall success, as outlined in Anton O. Tolman’s “Ways Metacognition Can Enhance Student Success.” Consider tasks that require students to demonstrate their reasoning, creativity, or critical thinking skills, rather than just producing factual or procedural answers. For example, asking students to explain how they solved a problem, why they chose a particular method, or what assumptions they made.
You might ask students to track when they used AI and why. Did it help them to clarify a point that they didn’t understand? How did using the tool further or complicate their understanding of a topic? Did using AI help them meet the goals of the assignment?
Ask students to write or record a reflection on what they have learned in the subject, how they have progressed, and what they feel confident about.
Tell the tale
Designing assessments that prevent all use of AI is a pretty tall order, so another possibility is embracing the use of AI. I say this with a caveat: there do need to be parameters.
It is unhelpful to direct students’ use of AI on any given assessment with vague comments like
- “Don’t use it too much”
- “Add a human touch” or
- “Make sure to use your authentic voice.”
Those aren’t really practical and are open to very broad interpretation. Let’s face it: how much is too much?
Specific guidelines are necessary.
James Presbitero Jr. uses the three layers of AI augmentation as his framework for the appropriate use of AI in writing.

His framework of layers also made me think about what we ask our students to do. Are our assignments what he would consider surface creations? If so, then AI can easily be used. Alternatively, if our assessments focus on deep creations, then AI use is naturally mitigated.
Consider some of the ways AI can be used productively to enhance learning:
- Researching your topic
- Simplifying audience research
- Looking for underserved titles and topics
- Formulating and creating hooks, titles, and quotes
- Enhancing readability by suggesting synonyms, idioms, etc.
- AI writes the essay and students assess it and give feedback
- Students write a descriptive paragraph that is then put through generative image AI and assess the quality of the image
- Students design prompts to have AI draft an authentic artefact (e.g. policy briefing, draft advice, pitch deck, marketing copy, impact analysis, etc.) and improve upon it. They document the process and reasoning: initial prompt, improvements, sources, critiques.
If AI is used for “surface-level” tasks, students could then be free to focus on learning at a deeper level.
Stephanie Holt, in her article Moving from Learning to Use AI to Using AI for Learning: Bloom’s Taxonomy for Utilizing Generative AI Tools, outlines how Bloom’s Taxonomy provides an ideal lens through which educators can integrate Generative AI tools into their teaching methodologies.
- Knowledge: Gathering and Recall – streamline the process of gathering and recalling factual information
- Comprehension: Understanding and Interpretation – assist in summarizing texts, translating languages, or even explaining complex ideas in simpler terms
- Application: Practical Use of Knowledge – applying learned concepts in practical scenarios, i.e. AI simulations for scientific experiments
- Analysis: Dissecting and Questioning – help students learn to analyze data, recognize patterns, and critically assess information
- Synthesis: Integration and Creation – help students in synthesizing information and presenting that information in creative ways
- Evaluation: Critical Thinking and Judgement – teach students about bias in data, the importance of ethical considerations, and the limitations of AI itself; encourage critical thinking and foster a deeper understanding of the technology’s impact on society; use AI tools for feedback and improvement
Considering AI’s capabilities through the lens of Bloom’s Taxonomy opens the door to creating engaging, interactive, and effective learning experiences where AI is not the enemy but an integral part of relevant, rich learning.

Here are some other ideas for how AI can be used in learning:
- Visual Storytelling
- Art & Design Projects
- Visual Communication (e.g., infographics)
- Research/Discovery Quest
- Writing Coach
- Think/Pair/Share partner
- Advisor/Tutor
- Grade the AI
- Science Lab
- Debate the AI
- Learn About Voice From Imitations of Famous Authors
- Chat With Historical Figures
- Speak With AI in a Foreign Language
- Critically examining AI in its many aspects (e.g., Bias, Erasing Identity, Hallucinations, Human Labour, Environmental Impact, and Spreading Misinformation)
As Holt says, “This shift not only prepares students for a future where AI is ubiquitous but also equips them with the skills to critically engage with and shape this technology for the betterment of society.”

