Beyond the Hype: Addressing Real AI Challenges in K-16 with Data Scientist Nikolas McGehee
Let's address the big questions about AI in schools: infrastructure, accessibility, and security!
Welcome to Educating AI!
Thank you for joining our growing community exploring the intersection of AI and education. With 200-300 new subscribers weekly, you're part of an important conversation about how artificial intelligence is transforming learning environments from kindergarten through college.
Educating AI brings together my perspective and insights from education experts to tackle the real challenges and opportunities facing students, teachers, and administrators in this rapidly evolving landscape.
All content remains freely accessible, though paid subscriptions help sustain our innovative work.
Please share with colleagues who might benefit from these discussions.
Looking forward to navigating this educational frontier together,
Nick Potkalitsky, Ph.D.
Introduction
Artificial Intelligence in education: it's a topic sparking passionate debate and, frankly, considerable controversy. From questioning the fundamental purpose of AI in schools to anxieties about its impact on student work ethic and critical thinking skills, and even the thorny issue of who ultimately decides if and how AI enters our classrooms – educators, parents, and communities are grappling with profound questions. In these uncertain times, navigating the complexities of AI in K-12 requires informed guidance and real-world expertise.
That's precisely why we sought out Nikolas McGehee, Ph.D., Senior Data Scientist and Professor at Trevecca Nazarene University. Nikolas brings a valuable perspective to this conversation, shaped by his work at Michigan Virtual, an organization known for its commitment to student-centered innovation and impactful programs. At Michigan Virtual, they understand that "Education is changing faster than ever" and are dedicated to empowering schools with research-backed solutions to "move learning forward." Nikolas's experience within this forward-thinking environment, conducting research, piloting AI tools, and shaping AI policy, provides a crucial lens for understanding the current AI landscape. Let's explore our interview and hear his experienced voice.
Interview Sections:
Setting the Stage: Nikolas's Role and Experience
We began by asking Nikolas to briefly describe his work in the AI in K-12 space. His response immediately grounded the conversation in real-world experience:
Could you briefly describe your role and experience with AI in K-12 education?
I work as a data scientist and researcher for an online learning company. I have conducted research regarding student and teacher AI usage, pilot tested many different AI tools, and worked on policy development for AI use in K-12 education. I also teach doctoral level research courses, and incorporate a lot of AI in my practices.
This breadth of experience—from research to policy to practical application—sets the stage for a truly informed discussion.
The Allure and the Abyss: Benefits and Challenges
Next, we explored the core of the AI promise and the hurdles we need to overcome. The benefits, as Nikolas points out, are compelling:
What do you see as the primary benefits and challenges of integrating AI into K-12 education?
Benefits - I think that these are very apparent: with a small but significant investment up front in learning AI tools, teachers and educators can utilize AI for a myriad of purposes including content creation, assessment, differentiation, lesson planning, compliance reporting, program planning and evaluation… the list goes on and on. It can free up time for teachers to get more face time with students, it can reduce feedback loop time when students can interact with AI, and it can help personalize each student’s learning journey when supplied with the necessary information to do so.
The potential for efficiency and personalization is undeniable. However, Nikolas doesn't shy away from the significant challenges:
Challenges - Privacy, Cost and Time, Pedagogical Shift, Student Brain-Drain - There are always going to be data privacy and sharing concerns with tools, AI or not, and this is no exception. Additionally, teachers require time and training, REAL TIME AND TRAINING, not mandatory meetings where someone talks about ChatGPT and throws up an example…
They need time to experiment and have literal hands-on experiences with the tools they are expected to use IN THE CONTEXTS in which they are expected to use them – JUST LIKE WE WANT OUR KIDS TO HAVE AUTHENTIC LEARNING EXPERIENCES, THIS IS TRUE FOR THE ADULTS TOO!!!!! Teachers must also learn how to change their teaching practices to integrate AI, which takes time, and assistance. And ALL of this takes money, of course.
Lastly, there is the concern of students over-relying on AI and just using it to cheat and easily scrape by, or excel without mastering any content… And I say that honestly, that goes back to Pedagogical Shift – teachers MUST adapt and learn how to address higher order thinking skills rather than simple recall tasks (LIKE WE HAVE SAID FOR YEARS), because AI can complete them easily. Changing learning to take place beside AI is key, because it’s not going away.
Nikolas’s passionate plea for “REAL TIME AND TRAINING” for teachers resonates deeply. He underscores that effective AI integration isn't just about adopting new tools, but about a fundamental pedagogical shift, a point educators have been advocating for long before AI entered the scene.
Leveling the Field: Equity Issues
Moving to the critical issue of equity, we asked about access and fairness. Nikolas shared a nuanced perspective:
From your perspective, what are the most pressing issues related to equitable access to AI tools in schools (e.g., financial, infrastructural, or policy-related)?
I don’t think a lot of it is financially related on the student side – any student that has internet access can use it. Teachers too. But APPROVED AI use or integration is different – that may be a financial issue.
While student access to basic AI tools isn't necessarily a financial barrier, the official and supported integration within schools presents a different picture, potentially widening existing disparities.
Guarding the Gate: Safety and Privacy
Data privacy and student safety are paramount concerns in education, and AI adds layers of complexity. Nikolas highlighted the gravity of these issues:
In your experience, how do data privacy concerns, particularly FERPA compliance, influence the decision-making process when adopting AI tools?
I think it’s a huge deal. Even private companies with fewer data regulations are hesitant to use AI for fear of disclosing company data and it somehow coming back to bite them. Cyber-security concerns have been at the forefront for education entities (public, private, and nonprofit) for years, and this is yet another tool that would likely interface with student data. If there aren’t protections, safeguards, and/or plans in place to deal with data breaches or other similar issues, schools aren’t going to invest. They already hemorrhage money from dealing with lawsuits, at-risk student programs, and other issues… so even with the benefits of AI, they likely see it as just another liability if it is integrated as an “official” tool.
The fear of liability, data breaches, and the potential financial fallout looms large, creating significant hesitation for official AI adoption despite the recognized benefits.
Regarding classroom safety, Nikolas points out the differing approaches for adult learners versus K-12 students:
What safety concerns have emerged when using AI tools in the classroom, and how have you or your institution addressed them?
Personally, when teaching adults, they have all clicked the “I agree” boxes when using AI tools, and so they are aware of many of the issues that they could face. And because the tools aren’t adopted by the university in an “official” capacity, it’s a “user accepts all risk” sort of thing. I personally tell and show my adult students the power of AI and its use cases, as well as ways it can be erroneous or how it captures your data, etc. They are aware. They accept risk.
With K-12 students, it's different: you have to be 13 minimum to use ChatGPT “officially,” and even then you have to have “parent permission” until you are 18. Kids can easily just click the “I accept”... which goes back to what I said before. Schools won’t adopt AI tools “officially” because of liability… but they will be happy to “unofficially” adopt tools or utilize them in the classroom.
This highlights the inherent tension: schools are wary of official adoption due to liability, yet the informal use continues, raising questions about consistent safety measures and responsible use.
Laying the Groundwork: Infrastructure and Affordability
For AI to truly flourish in K-12, a robust infrastructure is essential. Nikolas breaks down the key components:
What technological infrastructure do you believe is essential for successful AI integration in K-12 schools?
To successfully integrate AI in K-12 schools, you need a solid tech foundation, but it’s more than just having the right gadgets—it’s about building an ecosystem that works for students, teachers, and families.
First off, reliable internet is non-negotiable. AI tools, especially ones that rely on real-time responses or cloud-based systems, need strong, consistent connectivity. Without it, you're just setting up for frustration in the classroom. Then there’s the hardware. Students and teachers need devices that can actually handle AI applications—tablets, laptops, or whatever fits the district's budget and goals.
Plus, classrooms should have tools like interactive whiteboards or even AR/VR setups to make learning more hands-on and engaging. And let’s not forget the behind-the-scenes tech: servers or edge devices to keep things running smoothly if the internet goes down.
Now, data management is huge. AI needs data to work, but that also means schools need to have centralized, secure systems in place to handle it responsibly. Think student information systems (SIS) and learning management systems (LMS) that store and organize things like grades, attendance, and more, while staying compliant with privacy laws like FERPA and COPPA.
Speaking of privacy, parental permission is key—especially for platforms like ChatGPT that require explicit consent for students under 18. Schools need to be upfront with parents about what’s being used, why, and how their child’s data is being protected. Teachers play a big role too. If they’re not comfortable using AI tools, it’s not going to work. Professional development is a must, and it needs to go beyond “here’s how to click around in this app.”
Teachers should understand how AI fits into their classrooms and how to use it to save time or personalize learning for their students. And of course, we have to talk security. Schools are dealing with sensitive student data, so cybersecurity needs to be a top priority. Encryption, firewalls, regular audits—these aren’t just extras; they’re the basics. Ethical use is another biggie—AI isn’t perfect, and schools need policies in place to make sure it’s being used responsibly and fairly.
Lastly, the system has to be flexible and scalable. Schools grow, tech changes, and AI tools will keep evolving. The infrastructure needs to adapt without breaking the bank. If we build it right, AI can do amazing things—helping teachers be more effective, supporting students with different needs, and giving families confidence in the system.
Beyond the technology itself, Nikolas emphasizes trust-building and ensuring the infrastructure is sustainable and equitable. Affordability, of course, remains a major roadblock:
How have affordability challenges affected the ability to distribute AI tools equitably? Are there any funding solutions or strategies you’ve seen that help mitigate these challenges?
Affordability is one of the biggest hurdles when it comes to distributing AI tools equitably in schools. Let’s be real—AI tools can be pricey, and schools in underfunded districts are already stretched thin trying to cover basic needs like books and teacher salaries. When you throw in the costs of devices, software licenses, internet infrastructure, and ongoing maintenance, it’s easy to see how some schools fall behind. And it’s not just about the upfront cost; it’s the long-term investment. These tools require updates, training for teachers, and sometimes additional IT staff to keep things running smoothly.
What’s really frustrating is that the students who could benefit most from AI—those in underserved communities—are often the ones who don’t have access. Without equitable access, AI can unintentionally widen achievement gaps instead of closing them.
But there are strategies that can help. One big solution I’ve seen is leveraging grants. Federal programs like ESSER funds (Elementary and Secondary School Emergency Relief) or state-level initiatives can cover tech investments, including AI tools. Partnering with private organizations is another approach. Some tech companies are willing to provide tools at reduced costs—or even free—for schools that meet certain criteria. Nonprofits often step in too, helping schools identify and apply for funding opportunities.
Another tactic is prioritizing shared resources. For instance, some districts set up AI labs or resource centers instead of trying to put tools in every classroom right away. This gives students access without overloading budgets. Similarly, focusing on tools that are scalable, like cloud-based systems, can lower costs over time compared to investing in hardware-heavy solutions.
I’ve also seen districts get creative by partnering with local businesses or higher education institutions. These partnerships can provide access to technology, training, or even interns who help implement and maintain AI tools. And finally, professional development funding is crucial. It doesn’t matter how great the tech is if teachers don’t feel confident using it effectively.
To me, the key is planning for sustainability. It’s not just about getting the shiny new tool in the door—it’s about ensuring it can be maintained, used effectively, and benefit all students equally over time. That takes a mix of strategic funding, partnerships, and a real focus on equity from day one.
Nikolas outlines a range of pragmatic solutions, from leveraging grants to creative partnerships and shared resource models, all emphasizing the need for long-term sustainability and a commitment to equity.
Looking Ahead: Future Outlook
Finally, we asked Nikolas about the most immediate steps schools can take:
What's the most achievable next step for ensuring equitable and safe AI access in K-12 education?
Personally, I think the easiest lift to getting there is for schools to start drafting AI usage and data policy agreements just like all the software agreements we scroll to the bottom and click “agree” on. Sure, students/parents can opt out if they want, and you might have some, but overall I think if you can release schools from the liability of any of the aforementioned “issues” that could arise with using AI, then it eliminates or at least addresses a lot of the fear that many districts have regarding parental retribution.
It might sound sort of shady, but isn’t that exactly what we all do when we create accounts in ChatGPT or Claude? Click agree, and go on. And then we can’t sue them if Claude goes off the rails and hallucinates something wild, or if Gemini starts insulting us.
Getting parents to accept liability and responsibility for their child’s AI usage, and then utilizing the free tools: that’s the easiest lift in my opinion.
His somewhat provocative suggestion of AI usage agreements, mirroring the “click-agree” culture of online platforms, is a thought-provoking, if perhaps controversial, starting point for addressing liability concerns and encouraging broader adoption.
Conclusion
Nikolas McGehee’s insights offer a clear path forward for AI in K-12 education. He reminds us that while the potential is significant, realizing it depends on facing the challenges head-on: prioritizing equity, ensuring safety, and building solid infrastructure. The key takeaway isn't simply adopting the newest AI tools, but focusing on thoughtful implementation, robust teacher support, and open conversations with everyone involved – teachers, families, students, and communities. To truly empower all learners with AI, we must commit to these principles. Thank you, Nikolas, for sharing your valuable experience and guiding perspective on this crucial topic. This interview served as source material for my presentation at the Ohio Educational Technology Conference (OETC) on “Bridging the AI Divide: A Practical Guide to Equitable, Safe, and Accessible AI in K-12 Education,” highlighting the real-world need for these very insights.
Check out some of our favorite Substacks:
Mike Kentz’s AI EduPathways: Insights from one of our most insightful, creative, and eloquent AI educators in the business!!!
Terry Underwood’s Learning to Read, Reading to Learn: The most penetrating investigation of the intersections between compositional theory, literacy studies, and AI on the internet!!!
Suzi’s When Life Gives You AI: A cutting-edge exploration of the intersection of computer science, neuroscience, and philosophy
Alejandro Piad Morffis’s Mostly Harmless Ideas: Unmatched investigations into coding, machine learning, computational theory, and practical AI applications
Michael Woudenberg’s Polymathic Being: Polymathic wisdom brought to you every Sunday morning with your first cup of coffee
Rob Nelson’s AI Log: Incredibly deep and insightful essays on AI’s impact on higher ed, society, and culture.
Michael Spencer’s AI Supremacy: The most comprehensive and current analysis of AI news and trends, featuring numerous intriguing guest posts
Daniel Bashir’s The Gradient Podcast: The top interviews with leading AI experts, researchers, developers, and linguists.
Daniel Nest’s Why Try AI?: The most amazing updates on AI tools and techniques
Riccardo Vocca’s The Intelligent Friend: An intriguing examination of the diverse ways AI is transforming our lives and the world around us.
Jason Gulya’s The AI Edventure: An important exploration of cutting edge innovations in AI-responsive curriculum and pedagogy.