Learning Path 3: AI, Assessment, and Academic Integrity
Explore how AI impacts assessment and academic integrity, and learn strategies to redesign assessments that uphold student mana and promote authentic learning.
⚖️ Section 2: Risks + Opportunities
We’ll begin by naming both sides of the coin — the tensions and possibilities that AI introduces into assessment.
Risks to Watch For
- Masking learning: AI-written work may hide a student’s actual skill level
- Equity gaps: Some learners have better access to tools, devices, and support
- Overuse: Students may rely on AI instead of developing independent thinking
- False positives: Detection tools can wrongly flag innocent students — especially Māori, Pasifika, ESOL, or neurodivergent learners
- Hidden workload: Tutors may face added complexity when evaluating AI-supported work
Opportunities to Embrace
- Shift the focus to process, thinking, and reflection
- Assess the human layer — decision-making, critique, and insight
- Support digital literacy while upholding academic ethics
- Reduce busywork to surface deeper learning
- Strengthen feedback loops by asking students to explain their thinking
💬 “When we asked students to explain how AI supported their draft — they were more thoughtful, not less.”
🪶 Kaupapa Māori Lens — Whakaaro | Perspective
From a kaupapa Māori perspective, the key question isn’t:
“Can students cheat?”
but rather:
“Does this assessment protect and strengthen the mana of the student?”
Risk through a Tikanga Lens
AI misuse can:
- Erode self-efficacy
- Undermine authentic voice
- Create whakamā (shame or embarrassment) and self-doubt
Opportunity through a Tikanga Lens
Thoughtful redesign can:
- Honour ako (reciprocal learning)
- Build whanaungatanga (relationships and connection) through shared kōrero
- Strengthen kaitiakitanga (guardianship) by teaching responsible AI use
- Support tino rangatiratanga (self-determination) by positioning learners as decision-makers
Whaiwhakaaro | Reflection
How might your current assessments either strengthen or diminish student mana?
Tip: 💡 Explore this further in Section 6: Mana and Assessment Deep Dive.