Researchers are increasingly investigating the integration of artificial intelligence into educational settings, but a clear understanding of how students perceive AI’s cognitive role remains elusive. C. K. Y. Chan, the sole author of this work, proposes a novel framework defining AI as a dynamic cognitive partner that adapts its function to suit diverse learning needs. Analysing written responses from 133 secondary students in Hong Kong after an AI literacy course, the study identifies nine key dimensions of this partnership, ranging from conceptual scaffolding and feedback to metacognitive support and cognitive load management. This research is significant because it moves beyond simply assessing AI’s impact on learning outcomes, instead illuminating how students actively conceptualise and utilise AI in relation to their own thinking, distinguishing between beneficial cognitive extension and detrimental cognitive substitution. Grounded in established learning theories, the framework offers crucial insight into fostering effective and responsible AI integration within education.
Scientists are beginning to understand how young people truly see artificial intelligence: not as a tool, but as a thinking companion. This emerging awareness is vital as AI becomes ever more present in the classroom and beyond. Although researchers are increasingly focused on how students integrate artificial intelligence (AI) into their learning processes, a clear understanding of how learners conceptualise AI’s role in their own thinking remains elusive.
This study proposes a new framework that views AI not simply as a tool, but as a dynamic cognitive partner that adapts its function to suit different learning scenarios. Analysing written responses from 133 secondary school students in Hong Kong after they completed an AI literacy course, the work identifies nine distinct ways learners describe AI collaborating with their cognitive processes.
These range from providing conceptual scaffolding for challenging ideas to regulating task demands and offering feedback on errors. Notably, students consistently distinguished between AI support that enhances understanding and reliance that diminishes their own cognitive effort, demonstrating an awareness of appropriate AI usage. Grounded in established learning theories, including sociocultural perspectives, distributed cognition, and self-regulated learning, this framework clarifies how AI becomes interwoven with a learner’s cognitive activity.
It also highlights the critical boundary between extending cognitive abilities and replacing them altogether. Once AI is positioned as a partner, rather than a simple answer-generating system, it can support reasoning and sensemaking in ways previously unexplored. AI systems for education have proliferated in recent years, encompassing tutoring, feedback, adaptive learning, and content organisation.
Yet, concerns regarding overreliance and diminished learner agency are growing, particularly with the rise of generative AI. Unlike previous research focused on AI’s instructional effectiveness or system design, this work centres on the learner’s perspective: how students actively position AI within their own thinking. The study’s analysis of student writing reveals a nuanced understanding of AI’s potential, capturing experiences with explanation, feedback, organisation, and adaptation.
From these insights, researchers developed the concept of AI as a “Dynamic Cognitive Partner”, a term reflecting AI’s interactive, personalised, and adaptive nature. This framework moves beyond viewing AI as a mere “tool” and instead considers it an integral part of the learner’s cognitive system. By linking this learner-informed perspective to established theories of mediated cognition and self-regulated learning, the study offers a process-oriented model for understanding AI’s role in human learning.
The research clarifies how AI functions within learners’ cognitive systems and provides a conceptual basis for examining when AI supports learning and when it risks reducing cognitive engagement. At its core, the work builds on Vygotsky’s sociocultural theory, which posits that learning occurs through interaction with cultural tools and “more knowledgeable others”. In this context, AI increasingly functions as that “other”, providing contingent support within a student’s Zone of Proximal Development, the gap between what they can achieve independently and what they can achieve with guidance.
Student perceptions of artificial intelligence supporting secondary learning
A qualitative content analysis underpinned this work, focusing on how secondary students perceive artificial intelligence’s role in their learning processes. Participants comprised 133 students aged 15 to 17 from Hong Kong, all of whom had completed a 30-hour online AI literacy course prior to data collection. This preparatory course ensured students possessed both foundational knowledge of AI and practical experience applying AI tools to learning tasks, including explanation, feedback, organisation, and content generation.
Students also explored the ethical implications and potential risks of overreliance on these systems, preparing them to reflect critically on their own engagement. The primary data source consisted of non-graded written responses, each approximately 400-800 words in length, completed by students as a final assignment for the AI literacy course. Students were prompted to describe how AI could support their learning, specifically addressing its potential uses in education, helpful tools, changes to teacher and student roles, associated challenges, and their personal experiences with AI.
Rather than detailing specific tools, the instructions encouraged students to articulate their understandings of AI’s role in thinking, studying, and problem solving, allowing for a nuanced exploration of cognitive processes. Two researchers independently coded all texts, initially employing open coding to identify segments describing what cognitive functions students attributed to AI, when they sought AI assistance, and how they positioned their own role in relation to the technology.
These initial codes, such as explanation, checking, organisation, and idea generation, were then clustered into function-oriented categories, including conceptual explanation, feedback, organisation, adaptation, monitoring, and persistence support. Simultaneously, segments detailing potential risks like overreliance or diminished critical thinking were also coded, capturing awareness of the boundary between supportive and substitutive AI engagement.
Subsequent refinement and consolidation of these categories, through constant comparison, yielded the final framework of nine interrelated dimensions. Disagreements during coding were resolved through discussion, ensuring a robust and reliable analysis. Ethical approval was secured, student submissions were anonymised, and data were stored securely throughout the research process.
Students’ perceptions of AI’s multifaceted cognitive support in learning
Learner descriptions revealed nine interrelated dimensions of AI’s cognitive partnership. These dimensions detail how students perceive AI supporting their thinking, encompassing conceptual scaffolding for challenging ideas, feedback and error detection, and idea stimulation. Beyond simply identifying these functions, the research highlights the nuanced ways students integrate AI into their learning processes.
Students consistently distinguished between productive support, where AI extends understanding, and unproductive reliance, indicating awareness of appropriate AI usage. Detailed accounts showed AI regulating task and cognitive load, offering learning continuity beyond classroom walls, and reframing explanations through representational flexibility when students felt stuck or overwhelmed.
This level of detail provides a granular understanding of how students actively perceive AI’s role in their cognitive processes. At times, students portrayed AI as a facilitator of learning continuity, extending support beyond the confines of scheduled lessons. This suggests a shift in how students view learning, moving away from episodic classroom experiences towards a more continuous, self-directed process.
However, the study also revealed that students were adept at recognising when AI’s assistance became detrimental, replacing their own cognitive effort rather than enhancing it. Considering the breadth of these reported interactions, the framework positions AI not as a static tool, but as a dynamic cognitive partner. This dynamic role shifts according to learner needs, task demands, and stages of understanding, reflecting AI’s interactive, personalised, and adaptive nature.
The research underscores that AI’s involvement occurs at the level of thinking processes, supporting reasoning and sensemaking, rather than simply delivering content. Furthermore, the study’s findings align with established theories of mediated cognition and self-regulated learning. By grounding the framework in learner perspectives, the research offers a process-oriented model for understanding AI’s role in human learning activity. This model clarifies how AI functions within learners’ cognitive systems and provides a conceptual basis for examining when AI supports learning and when it risks reducing cognitive engagement.
Young learners’ nuanced perceptions of artificial intelligence in cognitive processes
Researchers have begun to map how young minds integrate artificial intelligence into the very process of learning. This isn’t simply about students using AI tools; it’s about a fundamental shift in how they approach thinking itself, viewing AI not as a replacement for effort but as a malleable extension of their own cognitive abilities. For years, educational technology has promised personalised learning, yet often delivered little more than digitised worksheets.
This work, however, suggests something deeper is occurring: a genuine reshaping of the student-knowledge relationship. Yet understanding this integration requires moving beyond simple measures of performance. The study identifies nine distinct ways students perceive AI’s role, ranging from scaffolding complex concepts to regulating cognitive load, revealing a surprisingly sophisticated awareness of both the benefits and drawbacks of AI assistance.
But this awareness isn’t uniform; students clearly differentiate between helpful support and detrimental reliance, a critical distinction often overlooked in discussions about AI’s impact. Once considered a distant threat to critical thinking, AI is now being actively negotiated, its potential benefits weighed against the risk of diminished cognitive effort.
Still, the context of this study, Hong Kong secondary students completing an AI literacy course, limits broad generalisation. While these students demonstrated a level of metacognitive awareness, it remains unclear whether this is typical or a product of their specific training. Beyond this, the reliance on written responses offers a snapshot of expressed understanding, not necessarily a complete picture of actual cognitive processes.
At the same time, this research opens avenues for exploring how AI can be deliberately designed to promote self-regulation, rather than undermine it. Future work should focus on longitudinal studies tracking cognitive development alongside AI use, and on investigating how these perceptions vary across different cultures and educational systems. Now, the challenge lies in translating these insights into practical teaching strategies that empower students to become discerning partners with AI, not passive recipients of its outputs.
👉 More information
🗞 Who Is Doing the Thinking? AI as a Dynamic Cognitive Partner: A Learner-Informed Framework
🧠 ArXiv: https://arxiv.org/abs/2602.15638
