How to Grade the Top AI Tools for Students
Teacher's Take on the AI Study Tool Surge: Do These Tools Support Deep Learning or Just Provide Answers?
Support journalism that digs deeper. This detailed investigation of AI learning tools took over 25 hours of research, testing, and analysis to produce. To continue delivering thorough, independent reviews that help educators and parents make informed decisions, we need your support. Consider upgrading to a paid subscription to ensure this vital work continues. Every subscription directly enables more in-depth coverage of the rapidly evolving AI education landscape.
The Challenge
The AI study tool gold rush is on. Every week brings new apps promising to transform how students learn. Take a photo, and Photomath solves any math problem. Record a lecture, and Coconote turns it into perfectly organized notes and study guides. Feed your textbook to StudyRot, and it spits out TikTok-style videos complete with meme sound effects. Need help with any subject? ChatGPT serves as an always-available tutor.
Figure 1. StudyRot transforms lecture notes into a two-person, NotebookLM-style podcast discussion that plays over footage of popular video games like Subway Surfers. Who could ask for more?
For parents and educators, this flood of AI learning tools is both exciting and unsettling. On one hand, they offer unprecedented support and accessibility - a democratization of tutoring and study help. On the other hand, many of these tools seem to automate the very mental work that leads to deep understanding. When an app can instantly solve any math problem or transform dense reading into bite-sized videos, are students still developing crucial thinking skills?
Some education skeptics dismiss these AI tools entirely, viewing them all as sophisticated cheating devices that undermine real learning. But this blanket rejection ignores important nuances in how different tools support - or hinder - the learning process. It also fails to acknowledge that these tools are already deeply embedded in how students study and learn.
Our response can't be to ban these tools or pretend they don't exist. Instead, we need a thoughtful way to evaluate them and teach students to use them wisely. We need to distinguish between tools that promote genuine learning and those that simply provide answers.
Disclosure: This review is independent and unsponsored. I have not received any compensation or incentives from the AI companies mentioned in this article. NP
Generative Thinking: A Framework for Understanding Learning with AI
When we think about learning in the age of AI, we need to move beyond simple binaries of "good tools" versus "cheating tools." Instead, we can draw on decades of learning science research that tells us what effective learning actually looks like. The concept of generative thinking, pioneered by education researcher Merlin Wittrock, offers particularly valuable insights.
According to Wittrock, genuine learning isn't passive consumption of information - it's an active process where learners "generate perceptions and meanings consistent with their prior knowledge." The mind, he explains, "is not a passive consumer of information" but rather "actively constructs its own interpretations of information and draws inferences from them."
This distinction becomes clear when we compare different AI study tools. Consider two approaches to math learning:
Photomath simply solves problems and shows steps. Students passively receive answers.
Grademaxx helps students visualize mathematical relationships through interactive mind maps and adaptive quizzes. Students actively build understanding.
At its core, generative thinking involves five key elements:
Actively constructing meaning by reorganizing new information and integrating it with existing knowledge
Going beyond finding "right" answers to exploring multiple possibilities and connections
Using tools as thinking partners rather than answer providers
Engaging in deep reflection about one's own understanding
Creating original insights rather than just consuming information
These elements give us a lens for evaluating AI study tools: Do they support these processes, or do they bypass them in favor of quick answers?
Figure 2: Generative Thinking Visualized (Created with Claude 3.5)
How to Measure Generative Learning Support
Converting learning science theory into practical tool evaluation requires clear metrics. We developed a straightforward scoring system that rates four critical dimensions of generative thinking, each on a simple 0-2 scale, for a maximum total of 8 points.
Knowledge Integration
How well does the tool help students connect ideas?
0: Limited - Tool presents isolated facts or solutions
1: Basic - Tool suggests some connections between concepts
2: Strong - Tool actively helps students build meaningful relationships between ideas
Active Processing
How much does the tool engage student thinking?
0: Passive - Students mainly receive pre-made content
1: Semi-active - Students make some choices and interact with content
2: Highly active - Students must think deeply and make meaningful decisions
Metacognitive Support
Does the tool help students reflect on their learning?
0: Minimal - Focus on answers, not understanding
1: Moderate - Some support for tracking learning progress
2: Comprehensive - Rich feedback about thinking and learning process
Generative Features
Can students create and explore?
0: Limited - Students mainly consume preset content
1: Moderate - Some opportunities for customization
2: Extensive - Students can create, experiment, and discover
Highly Generative Tools (6-8 points): Champions of Active Learning
These tools excel in fostering deep engagement and creativity. They encourage students to actively construct knowledge, reflect on their learning, and create original insights. Examples include MOXIE and GRADEMAXX, which integrate knowledge, promote extensive student interaction, and support comprehensive metacognitive reflection.
Figure 3. GRADEMAXX's innovative multi-window interface enables personalized learning pathways through interactive content curation and knowledge assessment. The mind-mapping functionality facilitates sophisticated concept visualization and interconnected understanding.
Moderately Generative Tools (3-5 points): Balanced Learning Support
These tools provide a mix of active and passive learning. While they support some knowledge integration and student engagement, they may lean on pre-set content or automation, limiting deeper cognitive processes. Tools like STUDY FETCH, TUTORAI, and CHATGPT fall here, helping students connect ideas but offering less room for creation and reflection.
Figure 4. STUDY FETCH demonstrates versatile information delivery mechanisms. While its quiz features encourage analytical thinking, the platform's primary focus remains on structured content presentation rather than open-ended knowledge creation.
Minimally Generative Tools (0-2 points): Convenient but Limited
Focused on delivering answers or transforming content, these tools often prioritize automation over active thinking. They typically provide minimal opportunities for customization, reflection, or creative exploration. Tools like STUDYROT, PHOTOMATH, and CHATPDF primarily aid consumption of information rather than fostering generative thinking.
Figure 5. PHOTOMATH represents an early automated learning tool that exemplifies minimal generativity. Its core functionality—converting photographed mathematical problems into step-by-step solutions—demonstrates efficient answer delivery but lacks features that would promote independent analytical thinking or conceptual understanding.
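For readers who want to apply the rubric themselves, the scoring system above reduces to a simple sum and banding. Here is a minimal sketch in Python; the `ToolScore` class, its field names, and the example per-dimension scores are illustrative conveniences, not part of the article's published methodology (only the 0-2 scale, the four dimensions, and the tier cutoffs come from the rubric):

```python
from dataclasses import dataclass

# The four rubric dimensions, each scored 0-2.
DIMENSIONS = ("knowledge_integration", "active_processing",
              "metacognitive_support", "generative_features")

@dataclass
class ToolScore:
    name: str
    knowledge_integration: int
    active_processing: int
    metacognitive_support: int
    generative_features: int

    def total(self) -> int:
        """Sum the four dimension scores (max 8), validating the 0-2 range."""
        for dim in DIMENSIONS:
            score = getattr(self, dim)
            if not 0 <= score <= 2:
                raise ValueError(f"{dim} must be 0-2, got {score}")
        return sum(getattr(self, d) for d in DIMENSIONS)

    def tier(self) -> str:
        """Band the total into the article's three categories."""
        t = self.total()
        if t >= 6:
            return "Highly Generative"
        if t >= 3:
            return "Moderately Generative"
        return "Minimally Generative"

# Hypothetical per-dimension scores chosen to match the article's
# published totals for two tools.
photomath = ToolScore("Photomath", 0, 0, 0, 0)
grademaxx = ToolScore("Grademaxx", 2, 2, 2, 2)
print(photomath.total(), photomath.tier())  # 0 Minimally Generative
print(grademaxx.total(), grademaxx.tier())  # 8 Highly Generative
```

The banding thresholds mirror the headings above: 6-8 highly generative, 3-5 moderately generative, 0-2 minimally generative.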
For a comprehensive analysis of apps by category, please refer to the detailed document.
Complete Ranking of AI Study Tools by Generative Learning Support
HIGHLY GENERATIVE (6-8 points)
MOXIE (Academic research and writing AI assistant) - 8/8
GRADEMAXX (Course materials to interactive study formats) - 8/8
CHATGPT (General purpose AI chat assistant) - 6.5/8
SCITE (Smart citation analyzer for research papers) - 6/8
QUIZLET (Student-created flashcards and study sets) - 6/8
MODERATELY GENERATIVE (3-5 points)
TUTORAI (Personalized AI learning and tutoring platform) - 5/8
STUDY FETCH (Course content to flashcards converter) - 3/8
CONSENSUS (AI-powered scientific literature search engine) - 3/8
SEMANTIC SCHOLAR (AI-powered academic paper search database) - 3/8
COCONOTE (Lecture recording to study materials converter) - 3/8
GAUTH (Math solver with live tutor support) - 3/8
STUDYX (Multi-subject homework help with AI tutor) - 3/8
SOCRATIC (Google's homework help photo solver) - 3/8
QUILLBOT (AI writing enhancement and paraphrasing) - 3/8
MINIMALLY GENERATIVE (0-2 points)
CHATPDF (PDF document question answering tool) - 2/8
STUDYROT (PDF to TikTok-style video converter) - 0/8
PHOTOMATH (Math problem photo solver with explanations) - 0/8
_______________
Key Insights from the Analysis
Tool Design Patterns
The most generative tools (8/8) like Moxie and Grademaxx share key characteristics:
Strong student agency in content creation
Built-in reflection and metacognitive support
Balance between AI assistance and student effort
Multiple ways to engage with material
The least generative tools (0-2/8) like Photomath and StudyRot:
Prioritize convenience over learning
Automate processes that should involve student thinking
Lack features for student creation or exploration
Focus on content delivery rather than understanding
Figure 6: Tool Design Patterns (Created with Claude 3.5)
The Automation Paradox
Many tools that promise to "make learning easier" actually reduce learning effectiveness by automating the cognitive work that leads to understanding. The highest-scoring tools don't remove mental effort - they scaffold it.
The Importance of Purpose
Tools must be evaluated based on their intended use. Some lower-scoring tools might be appropriate for specific purposes (like quick reference) but shouldn't be relied on as primary learning tools.
Recommendations for Students and Educators
Prioritize tools that require active engagement
Look for features that support reflection and understanding
Use automation selectively - not as a replacement for thinking
Combine tools strategically (e.g., using Grademaxx for deep learning and Photomath for checking work)
Future Directions
The gap in the market is clear. We need more tools that:
Balance AI capabilities with student agency
Support metacognitive development
Encourage creative exploration and connection-making
Make learning more efficient without bypassing understanding
The goal isn't to avoid AI study tools, but to use them thoughtfully in ways that enhance rather than replace the cognitive work of learning.
Nick Potkalitsky, Ph.D.
Check out some of our favorite Substacks:
Mike Kentz’s AI EduPathways: Insights from one of our most insightful, creative, and eloquent AI educators in the business!
Terry Underwood’s Learning to Read, Reading to Learn: The most penetrating investigation of the intersections between compositional theory, literacy studies, and AI on the internet!
Suzi’s When Life Gives You AI: A cutting-edge exploration of the intersection of computer science, neuroscience, and philosophy
Alejandro Piad Morffis’s Mostly Harmless Ideas: Unmatched investigations into coding, machine learning, computational theory, and practical AI applications
Michael Woudenberg’s Polymathic Being: Polymathic wisdom brought to you every Sunday morning with your first cup of coffee
Rob Nelson’s AI Log: Incredibly deep and insightful essays about AI’s impact on higher ed, society, and culture.
Michael Spencer’s AI Supremacy: The most comprehensive and current analysis of AI news and trends, featuring numerous intriguing guest posts
Daniel Bashir’s The Gradient Podcast: The top interviews with leading AI experts, researchers, developers, and linguists.
Daniel Nest’s Why Try AI?: The most amazing updates on AI tools and techniques
Riccardo Vocca’s The Intelligent Friend: An intriguing examination of the diverse ways AI is transforming our lives and the world around us.
Jason Gulya’s The AI Edventure: An important exploration of cutting-edge innovations in AI-responsive curriculum and pedagogy.