In a significant development for the future of education, new research from Pearson indicates that generative Artificial Intelligence (AI), when thoughtfully integrated into learning tools, can serve as a catalyst for developing critical thinking skills rather than diminishing them. This finding challenges a prevailing apprehension among educators and policymakers that AI encourages shortcuts and allows students to offload cognitive effort. The study, led by Principal Research Scientist Muireann Hendriksen and her colleague Dr. Emily Lai, analyzed tens of thousands of student interactions with an AI-powered study tool, offering an optimistic outlook on AI’s role as a partner in curiosity and deeper learning.
The Evolving Landscape of AI in Education
The rapid ascent of generative AI tools, such as ChatGPT and similar large language models, has undeniably introduced a transformative, albeit complex, dimension to the educational landscape. Since their public introduction in late 2022, these technologies have sparked widespread debate, with concerns frequently voiced about their potential to undermine foundational learning processes. Educators globally have grappled with the implications, fearing that students might bypass the "productive struggle" essential for genuine cognitive development, opting instead for AI-generated answers that circumvent critical thinking [1]. Indeed, early studies have pointed to potential risks of diminished critical thinking when students over-rely on mainstream AI chatbots, prompting a cautious approach to their integration in academic settings.
However, the discourse surrounding AI’s role in education is far more nuanced than a simple dichotomy of peril or promise. Proponents argue that AI holds immense potential to personalize learning, automate administrative tasks, and provide immediate feedback, freeing educators to focus on higher-order teaching. The challenge lies in harnessing this power responsibly and designing AI tools that align with pedagogical best practices. Pearson, a global leader in educational content and technology, has been at the forefront of exploring these possibilities through rigorous research and development. Muireann Hendriksen, a Principal Research Scientist on Pearson’s R&D and Thought Leadership team, has a background spanning academia and public health and specializes in impact evaluation and behavior change. Her work, alongside Dr. Emily Lai, aims to move beyond theoretical discussion to empirical evidence, focusing on how AI can be leveraged to improve learner outcomes.
Unveiling the "Asking to Learn" Study: A Deep Dive into Student Inquiry
The pivotal research, detailed in Pearson’s report "Asking to Learn," provides compelling evidence that AI can indeed foster, rather than hinder, critical thinking. The study represents one of the largest-scale analyses of student interactions with an AI-powered learning environment to date. Researchers examined nearly 130,000 anonymized queries from over 8,600 students utilizing an AI study tool embedded within a digital biology textbook—a resource commonly employed in introductory biology courses. The focus was specifically on the tool’s "Explain" feature, which empowers students to articulate their questions in their own words, thereby offering an unparalleled window into their genuine thought processes and authentic curiosities.
The methodology employed for analyzing these queries was grounded in the revised Bloom’s Taxonomy [2], a widely recognized framework for categorizing educational learning objectives into levels of cognitive complexity. This enabled the researchers to move beyond merely identifying what students were asking to discern how they were thinking. Bloom’s Taxonomy classifies cognitive processes into six hierarchical levels: Remember, Understand, Apply, Analyze, Evaluate, and Create. By categorizing each student query according to its cognitive process (e.g., Remember, Understand, Analyze) and knowledge dimension (e.g., Factual, Conceptual), the study meticulously mapped the depth of student engagement.
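To make the categorization concrete, the sketch below shows one simple way a query could be mapped to a Bloom's-taxonomy cognitive level. This is an illustrative keyword heuristic only: the study analyzed nearly 130,000 queries with a more sophisticated coding approach, and the cue words here are assumptions, not Pearson's actual criteria.

```python
# Illustrative sketch: map a student query to a level of the revised
# Bloom's taxonomy using keyword cues. The cue lists are hypothetical;
# the study's actual coding method was more rigorous than this.
BLOOM_LEVELS = ["Remember", "Understand", "Apply", "Analyze", "Evaluate", "Create"]

# Hypothetical cue phrases for each cognitive level (assumed, not from the report).
CUES = {
    "Create": ("design", "build", "construct", "propose"),
    "Evaluate": ("judge", "justify", "which is better", "critique"),
    "Analyze": ("differentiate", "compare", "what might happen if", "why does"),
    "Apply": ("how would i use", "solve", "calculate"),
    "Understand": ("explain", "describe", "summarize", "what does it mean"),
    "Remember": ("what is", "define", "list", "name"),
}

def classify_query(query: str) -> str:
    """Return the highest Bloom level whose cue phrases appear in the query."""
    q = query.lower()
    for level in reversed(BLOOM_LEVELS):  # check Create first, Remember last
        if any(cue in q for cue in CUES[level]):
            return level
    return "Remember"  # default to the lowest level when no cue matches
```

For example, the student query quoted later in this article, "What might happen if the lysosome wasn't in a separate compartment?", would land at the "Analyze" level under this heuristic, while "Can you explain cellular respiration?" would land at "Understand".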
Beyond Basic Recall: Evidence of Higher-Order Thinking
The initial findings confirmed an expected pattern for an introductory course: approximately 80% of student queries focused on foundational knowledge. These questions typically involved defining terms ("what are the different types of light microscopy?") or seeking explanations for core concepts ("can you explain cellular respiration to me like I’m a dummy"). This is entirely appropriate, as building a solid base of factual and conceptual knowledge is an essential prerequisite for deeper learning [3]. It demonstrates that students are effectively using the AI tool to reinforce their understanding of basic principles and ideas.
However, the truly remarkable and encouraging discovery emerged from the analysis of questions that delved deeper. The research revealed that nearly one-third of all student inputs reflected more advanced levels of cognitive complexity. Even more striking, 20% of queries were classified at the "Analyze" level or higher—these are precisely the cognitive levels widely associated with the development and exercise of critical thinking skills. This proportion significantly challenges the notion that students exclusively use AI for superficial information retrieval.
These higher-order queries were far from simple requests for information. Students were engaging in sophisticated cognitive tasks, such as:
- Hypothetical Reasoning: "What might happen if the lysosome wasn’t in a separate compartment, or if it didn’t work?" This type of question requires students to predict outcomes based on their understanding of biological functions, moving beyond mere recall to application and analysis.
- Problem-Solving and Design Thinking: "How would I ‘build’ an organism to maximize its surface area to volume ratio?" This query demonstrates an understanding of complex biological principles and the ability to apply them in a design context, suggesting elements of application and synthesis.
- Comparative Analysis and Methodological Evaluation: "If you had access to a microscope, how would you differentiate endomycorrhizae and ectomycorrhizae?" This question probes practical application, requiring students to compare and contrast, and to understand experimental or observational methodologies.
These examples underscore that students were not passively consuming information. Instead, they were actively grappling with the course material, using AI as a sounding board to explore concepts, test hypotheses, and deepen their understanding in a meaningful, interactive way. Their active framing of inquiries demonstrated a profound level of cognitive engagement [4].
The "Go Deeper" Initiative: Nurturing Curiosity with AI
Inspired directly by these compelling findings, the Pearson team has proactively developed and implemented a new AI feature called "Go Deeper." This innovative addition is designed to intentionally scaffold students towards higher levels of cognitive complexity. When a student poses an initial question, the AI tool now not only provides a comprehensive answer but also follows up with a carefully crafted prompt. This follow-up question is specifically engineered to elevate the student’s cognitive engagement by one to two levels on Bloom’s Taxonomy.
For instance, if a student initially asks for a definition (a "Remember" level query), the "Go Deeper" feature might prompt them to describe the concept in a novel context (moving to "Understand") or to apply it to solve a practical problem (advancing to "Apply"). This transforms a singular, isolated query into a multi-step learning journey, gently guiding the student towards more critical thinking without overwhelming them or pushing them too far beyond their current understanding, which could lead to confusion. This approach exemplifies how AI can act as a formative support, meeting students at their current cognitive level and incrementally challenging them to achieve greater depth of thought.
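The scaffolding idea behind "Go Deeper" can be sketched as a small lookup over Bloom's levels: classify the student's query, then pick a follow-up prompt one or two levels higher. The prompt templates and the level-bump logic below are illustrative assumptions, not Pearson's actual implementation.

```python
# Hedged sketch of the "Go Deeper" scaffolding idea: after answering a query,
# nudge the student one or two levels up the revised Bloom's taxonomy.
# Templates and logic are illustrative assumptions, not the real feature.
BLOOM_LEVELS = ["Remember", "Understand", "Apply", "Analyze", "Evaluate", "Create"]

# Hypothetical follow-up templates, one per target level.
FOLLOW_UPS = {
    "Understand": "Can you describe {topic} in your own words?",
    "Apply": "How would you use {topic} to solve a practical problem?",
    "Analyze": "How does {topic} compare with a related concept you've studied?",
    "Evaluate": "What evidence would convince you that {topic} works as described?",
    "Create": "Could you design a simple experiment involving {topic}?",
}

def go_deeper(current_level: str, topic: str, bump: int = 1) -> str:
    """Return a follow-up prompt one or two Bloom levels above the query."""
    idx = BLOOM_LEVELS.index(current_level)
    target = BLOOM_LEVELS[min(idx + bump, len(BLOOM_LEVELS) - 1)]  # cap at Create
    return FOLLOW_UPS[target].format(topic=topic)
```

So a "Remember"-level question about cellular respiration would draw the follow-up "Can you describe cellular respiration in your own words?", moving the student to "Understand" without jumping straight to evaluation or design.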
Broader Implications and the Future of Learning
The "Asking to Learn" study marks a pivotal moment in the ongoing conversation about AI in education. It provides empirical evidence that, contrary to some widespread fears, generative AI can be a powerful ally in developing essential 21st-century skills.
- For Educators: These findings suggest a shift in pedagogical approaches. Instead of viewing AI as a tool to be policed or prohibited, educators can embrace it as a personalized tutor, an inquiry partner, and a means to foster deeper engagement. The emphasis can move from rote memorization and basic recall—areas where AI excels—to guiding students in asking better questions, evaluating AI responses critically, and synthesizing information creatively. This empowers teachers to focus on complex problem-solving, ethical reasoning, and collaborative projects, leveraging AI to handle more routine informational tasks.
- For Students: The research validates the potential for AI to democratize access to personalized learning experiences. Students can receive immediate, tailored support that adapts to their individual learning pace and cognitive needs. This fosters greater autonomy and encourages a proactive, inquiry-based approach to learning, cultivating an intrinsic motivation to "ask to learn." The ability to test understanding with a non-judgmental AI partner can reduce anxiety and build confidence in exploring challenging concepts.
- For EdTech Developers: The study offers a clear blueprint for designing responsible and effective AI-powered learning tools. The focus should be on creating features that actively scaffold higher-order thinking, rather than merely providing answers. Ethical considerations, such as data privacy, algorithmic bias, and equitable access, remain paramount. The "Go Deeper" feature serves as an excellent model for integrating pedagogical principles directly into AI design.
- Policy and Ethics: The results underscore the urgent need for educational institutions and policymakers to develop comprehensive guidelines for AI integration. These policies should encourage innovative uses of AI while safeguarding against potential misuse and ensuring that all students benefit equitably from these advancements. Training for educators on how to effectively integrate AI into their curriculum and how to teach students to use AI responsibly will also be crucial.
Looking Ahead: The Collaborative Learning Ecosystem
While opinions on AI in education continue to span a spectrum from cautious skepticism to fervent optimism, Pearson’s research firmly anchors the discussion in empirical data. It highlights the profound value of thoughtfully designed, AI-powered experiences that function as formative supports within a larger learning ecosystem. By understanding the nuances of how students interact with AI and, crucially, how they frame their questions, educational innovators can continue to build tools that not only meet learners where they are but also effectively guide them toward a richer, more active, and profoundly curious engagement with the vast world of knowledge. The future of learning, it seems, will be increasingly collaborative, with AI acting not as a replacement for human intellect, but as an intelligent partner in the lifelong journey of inquiry and discovery.
References:
[1] Kumar, H., Rothschild, D. M., Goldstein, D. G., & Hofman, J. M. (2023). Math education with large language models: Peril or promise? SSRN Working Paper 4641653. https://ssrn.com/abstract=4641653
[2] Anderson, L. W., & Krathwohl, D. R. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives (Complete ed.). Addison Wesley Longman.
[3] Momsen, J. L., Long, T. M., Wyse, S. A., & Ebert-May, D. (2010). Just the facts? Introductory undergraduate biology courses focus on low-level cognitive skills. CBE—Life Sciences Education, 9(4), 435–440. https://doi.org/10.1187/cbe.10-01-0001
[4] Maiti, P., & Goel, A. (2025, March). Can an AI partner empower learners to ask critical questions? In Proceedings of the 30th International Conference on Intelligent User Interfaces (pp. 314–324).
[5] Chin, C., & Osborne, J. (2008). Students' questions: A potential resource for teaching and learning science. Studies in Science Education, 44(1), 1–39. https://doi.org/10.1080/03057260701828101
[6] Hendriksen, M., & Lai, E. (2025). Asking to Learn: What student queries to Generative AI reveal about cognitive engagement. Pearson. https://plc.pearson.com/sites/pearson-corp/files/asking-to-learn.pdf