Pedagogical Adaptation in the AI Era: Fostering Critical Engagement through Structured Adoption and Exploration
Abstract
The widespread adoption of generative AI by students has fundamentally challenged the validity of traditional assessment approaches focused on information retrieval and basic synthesis. In response, this article proposes moving beyond reliance on detection tools and toward a proactive redesign of assessment that prioritises higher-order thinking and critical engagement with AI. Grounded in the AAAE (Against, Avoid, Adopt, Explore) framework (Khlaif et al., 2025), it presents a reflective case study from a UK Taxation module to illustrate distinct and actionable strategies for the Adopt and Explore stances. The Adopt phase involves the deliberate incorporation of AI tools, such as a customised AI Tutor, into the learning workflow for specific generative tasks. The Explore phase strategically shifts the assessment focus to the critical process of evaluating, discussing, and refining AI-generated outputs within interactive small-group tutorials. This two-phase approach not only deepens discipline-specific understanding but also systematically builds essential competencies in critical AI literacy, preparing students for responsible and evaluative engagement with AI in professional contexts.
Keywords: AI in Education; AAAE Framework; Critical Thinking; Assessment Redesign
 
 
Introduction
The rapid integration of generative Artificial Intelligence (AI) into student learning represents a game-changing moment for higher education pedagogy. Traditional assessment models, which often rely on information retrieval and basic synthesis, are increasingly compromised in this new reality. A response based solely on prohibition or unreliable AI detection is insufficient. Following Khlaif et al. (2025), a sustainable path forward requires a strategic shift toward assessment redesign that prioritises higher-order thinking and builds pedagogical resilience. This involves moving from defensive postures (Against, Avoid) to proactive ones that strategically Adopt and Explore AI's potential.
This article presents a practical implementation of this shift through a case study in a UK Taxation module. It focuses explicitly on operationalising the Adopt and Explore dimensions of the AAAE framework as distinct yet interconnected phases in a learning cycle. The case demonstrates how AI can be purposefully embedded into small-group teaching not as a definitive answer engine, but as a structured object of critical analysis. By separating and elaborating on the actions taken under each stance, we provide a clear model for educators seeking to transform the AI challenge into an opportunity for fostering deep, active, and critical learning.
 
 
Theoretical Framework: Operationalising Adopt and Explore
The AAAE framework (Khlaif et al., 2025) is a strategic model for educators to categorise and adapt assessments in response to generative AI. The framework outlines four stances: Against (prohibit AI), Avoid (design AI-resistant tasks), Adopt (integrate AI with required critique), and Explore (focus on collaborative process and innovation) (Khlaif et al., 2025). In this article we focus on the two proactive stances:
 
Adopt: This stance involves the permitted and structured integration of AI tools into the learning process for specific tasks. The goal is to leverage AI’s generative capabilities while framing its output as a starting point for further work, not a final product. Actions under Adopt include explicitly permitting AI use for brainstorming, drafting, or generating initial responses, and designing tools or activities that channel this use productively.
However, this accessibility comes with a well-documented risk: the uncritical acceptance of AI-generated content. AI outputs can contain inaccuracies, biases, and fabricated information, and may fail to account for complex, context-specific domains (Raji, 2025). Therefore, the central educational challenge in the AI age is no longer merely providing information but equipping students with the capability to evaluate, critique, and responsibly use the information these AI tools generate.
 
Explore: This represents a more advanced pedagogical stance where the core learning activity is the critical process of engaging with AI. The focus shifts from product generation to the evaluation, discussion, and refinement of AI outputs. Assessment under Explore prioritises metacognition, collaborative critique, and the demonstrable reasoning process used to interrogate AI-generated content, often within interactive settings like small-group tutorials.
The following sections detail how these two stances were separately enacted and linked within the design of a UK Taxation module.
 
Phase 1: Adopt – Structured Integration of AI Tools
The Adopt phase was implemented through clear policy, tool provision, and task design to normalise and guide responsible AI use.
 
1. Policy and Expectation Setting: Students were explicitly informed that using AI tools for brainstorming and knowledge acquisition was permitted and encouraged. This transparent policy mitigated ambiguity and academic integrity concerns by aligning practice with the Adopt category.
 
2. Provision of a Customised AI Tutor: A central action under Adopt was the development and deployment of a customised AI Tutor for UK tax. This tool, trained on core tax legislation, textbooks, and case studies, served as a 24/7 digital assistant. Students were encouraged to use it to query complex topics, generate explanations of principles, or produce initial outlines for problem-solving. The tool was a strategic enactment of the Adopt stance, acknowledging AI's utility for support while constraining its knowledge base to the domain.
 
3. Task Design for Generative Use: In tutorials, the Adopt phase was activated through specific instructions. For a complex tax scenario, students were first tasked with using the customised AI Tutor or other permitted tools (e.g., DeepSeek) to generate a proposed answer. This directive legitimised AI use for a concrete, generative task, embodying the Adopt principle. The instruction framed the AI's output explicitly as "raw material" for the subsequent, more important phase of critical analysis.
 
Phase 2: Explore – Cultivating Critical Evaluation through Collaboration
The Explore phase began where Adopt ended, transforming the AI-generated output into the subject of rigorous, collaborative critique.
 
1. Scaffolded Critical Analysis in Small Groups: Students, working in small groups, were required to critically dissect the AI's proposal. Guided by prompts, their discussion focused on:

 · Accuracy & Currency: Does this reflect the latest Finance Act provisions?
 · Contextual Application: Are the assumptions valid for the specific client details?
 · Procedural Correctness: Has the AI applied reliefs and computations correctly?
 · Professional Standards: Does the output meet the formatting and rigour required for tax practice?

This structured discussion forced students to engage deeply with authoritative sources (legislation, manuals) to verify claims, moving beyond passive consumption.
 
2. Leveraging AI’s Limitations as Learning Opportunities: The activity was deliberately designed to turn common AI weaknesses (potential outdatedness, generic responses, and a lack of nuanced, context-sensitive understanding) into the central learning material. By contrasting flawed AI outputs (see Figure 1) with correct solutions (see Figure 2), students solidified their grasp of precise procedural knowledge and the importance of contextual judgement.
 
 
Case Study Illustration: The Integrated Adopt-Explore Cycle in Practice
The subject of UK Taxation provides a particularly fertile ground for this integrated approach. It is a discipline characterised by dense legislation, complex computational applications, and rules that are subject to annual governmental updates. Consequently, while an AI can be trained on vast datasets, its ability to provide consistently accurate, current, and contextually appropriate advice is inherently limited. A static model may not reflect the latest Finance Act provisions or understand the specific situation of a complex client scenario.
Recognising this, a pedagogical intervention was designed for the small-group tutorials. Instead of setting traditional problem-solving exercises, in a tutorial on corporation tax, students were given a complex multi-layered scenario. Following the Adopt phase, students, working in small groups, were first tasked with using the AI Tutor to generate a proposed solution or answer.
 
Figure 1 Example AI-Generated Answer (Deepseek) for a Corporation Tax Question
 
The Explore phase then commenced. In their small groups, students collaboratively analysed Figure 1, with each group required to critically discuss the AI’s output. They identified specific discrepancies, such as a miscalculation in capital allowances and misapplied reliefs. Through debate and joint reference to the current Finance Act and official tax manuals, they constructed a corrected and professionally formatted answer (Figure 2).
 
Figure 2 Corrected Answer Developed through Group Critique 
 
This cycle, Adopt (generate with AI) followed by Explore (critique and correct collaboratively), transformed the tutorial into a workshop for critical AI literacy and deep disciplinary learning. Through collaborative discussion, students successfully identified miscalculations, misinterpretations, and generic text that lacked the precise formatting required for official tax computations. By contrasting the AI’s flawed or incomplete proposal with the correct solution, students improved their understanding of the underlying principles, correct procedural steps, and necessary professional standards. This process of comparison, peer discussion, and corrective reasoning moved them beyond passive receipt of information; it solidified conceptual understanding, reinforced accurate application, and embedded procedural knowledge, thereby fostering the deep learning that arises from active analysis and the social construction of meaning.
 
 
Reflection and Future Development
The successful application of the Adopt-Explore model embodies the shift towards proactive pedagogical redesign, moving beyond a primary reliance on prohibition and detection, as advocated in frameworks for the generative AI era (Khlaif et al., 2025). It offers a practical model for upholding integrity through verification and authentic assessment rather than unreliable detection software, aligning with approaches that emphasise demonstrable student ownership of work in the GenAI era (Raji, 2025).
 
To successfully implement such a model, explicit guidance and transparent policies are crucial. Teachers must lead students through a deliberate process of understanding AI as a tool, not an authority. This involves:
 
Firstly, setting clear expectations. The module syllabus and assignment briefs must explicitly state that the use of AI for brainstorming and initial drafting is permitted or even encouraged, but that its output is a starting point for critique, not an end product. A clear policy defining permissible use, aligned with the Adopt and Explore categories, mitigates ambiguity.
 
Secondly, scaffolding the skill of critical evaluation. Students cannot be expected to critique AI effectively without guidance. Initial sessions can model the process: the tutor can present an AI-generated answer to a simpler problem and lead a whole-group critique, highlighting how to check for factual currency, logical consistency, and contextual fit. This scaffolds the independent critical skill required for later group work.
 
Thirdly, emphasising process and metacognition. The assessment criteria for such tutorial activities should reward the quality of the group’s critical discussion, the evidence used to challenge the AI, and the depth of the reflective justification for their final answer. This focus aligns with the “Explore” principle of the AAAE framework, which advocates for assessing the learning journey and critical engagement over the final product alone (Khlaif et al., 2025). Incorporating metacognitive tasks, such as a post-activity reflection where students articulate what they learned about both tax law and the limitations of AI, effectively operationalises this principle.
 
Finally, fostering a culture of academic transparency. Students should be required to document their use of AI, perhaps through a brief declaration stating which tool was used, for what purpose, and how its output was critically assessed and modified. This mirrors professional practice and reinforces responsible use.
 
 
Conclusion
In summary, this case study presents a practical approach to integrating generative AI within small-group teaching, as demonstrated in a UK Taxation module. By intentionally designing for both the Adopt and Explore phases of the AAAE framework (Khlaif et al., 2025), we reframe the widespread use of AI not as a disruption to be policed, but as an opportunity to deepen learning. This pedagogical shift moves the focus from content delivery to the processes of critical examination, collaborative discussion, and evidence-based reasoning. When students are guided to generate, interrogate, and refine AI-produced outputs together, they strengthen both their subject-matter expertise and their capacity to engage with digital tools reflectively. Ultimately, this approach helps cultivate not just skilled users of technology, but discerning and ethically aware thinkers who can exercise independent judgement in an increasingly AI-mediated world.
 
 
 
 
References
Khlaif, Z. N., Alkouk, W. A., Salama, N., & Abu Eideh, B. (2025). Redesigning Assessments for AI-Enhanced Learning: A Framework for Educators in the Generative AI Era. Education Sciences, 15(2), 174.
Raji, M. (2025). Smart Solutions: Authentic Assessment for GenAI-Era Academic Integrity. Retrieved 14 December 2025, from https://tlconestoga.ca/smart-solutions-authentic-assessment-for-genai

AUTHOR
Liyan Hou
Senior Teaching Fellow
Department of Accounting, International Business School Suzhou, XJTLU

DATE
21 January 2026
