OpenAI’s Misguided Initiative to Put General AI Tools in Every Student’s Hands
Universal Access to GPTs; Expanded GPT-4o Access for Universities
Greetings, Esteemed Readers,
Before diving in, I wish to extend my heartfelt thanks to those of you who have chosen to support my Substack through paid subscriptions. Your vote of confidence means the world to me. Thanks to your generous contributions, I can devote more time to research and writing, as well as expanding Educating AI's network of contributors, resources, and materials.
On May 30th, OpenAI announced two significant initiatives aimed at expanding its presence in the education sector: free access to GPTs and a streamlined offering for universities. While these moves promise greater accessibility and ease of use, they raise important questions about the underlying purpose and design of OpenAI’s educational tools and products.
Free Access to GPTs: Market Share Over Safety?
In a notice sent to customers the day before the launch, OpenAI revealed that existing GPTs might not continue to work as expected under the new system. This suggests that OpenAI itself cannot fully anticipate how universal GPT access will affect the broader system. Is it safe? Instead of resolving these concerns before launch, OpenAI appears to prioritize market share over safety, proceeding with the rollout and adopting a wait-and-see approach.
Introducing ChatGPT Edu: Streamlined University Access
The centerpiece of OpenAI's new educational offerings is ChatGPT Edu, a version of ChatGPT specifically designed for university environments. ChatGPT Edu aims to provide affordable, enterprise-level AI tools to students, faculty, researchers, and campus operations. It boasts advanced capabilities such as data analysis, web browsing, and document summarization, along with robust security and administrative controls.
While these features enhance accessibility and ease of use, they prompt critical questions about the true educational value of these tools. OpenAI’s promotional material highlights applications such as personalized tutoring, grant application assistance, and language learning tools. However, these examples do not fully address the need for tools that are designed in accordance with explicitly educational principles and goals. Effective educational tools should foster critical thinking, provoke extended inquiry, and encourage deep research. For instance, while ChatGPT can streamline tasks like grant writing or language practice, it does not inherently cultivate the complex skills required for high-level academic research or critical analysis. Tools designed without these educational principles risk becoming mere conveniences rather than transformative resources that challenge and engage students.
The Pitfalls of Overgeneralized Tools
History has shown that technology designed without specific educational intent can sometimes do more harm than good. A relevant example is the introduction of interactive whiteboards in classrooms. While initially hailed as a revolutionary educational tool, their implementation often lacked clear pedagogical purpose and training for teachers. As a result, many interactive whiteboards ended up being used as little more than expensive projectors, failing to enhance student engagement or learning outcomes. This overgeneralization of technology led to significant financial investments with little to no improvement in educational quality, highlighting the importance of designing tools with clear, educationally sound goals.
Similarly, when powerful AI tools are made freely accessible without sufficient safety measures, they can be misused for a variety of purposes, including creating deepfakes, generating misleading information, or even conducting cyber-attacks. For instance, when OpenAI initially released GPT-3, there were concerns about its potential misuse, which led to debates about the need for stricter access controls and more robust ethical guidelines. By prioritizing market expansion over safety considerations, OpenAI risks repeating these issues on a larger scale with its current push for universal GPT access.
The Need for Purposeful Design in AI Education Tools
To genuinely serve the educational community, OpenAI must move beyond simply introducing tools into educational spaces and hoping they become educational by default. Designing AI tools that are truly educational requires a deliberate focus on fostering critical thinking, deep research, and genuine intellectual engagement. An educationally oriented AI product, for example, should not only assist with mundane tasks but also challenge students to engage in independent research, develop critical analysis skills, and participate in meaningful intellectual discourse. Such tools should be built around explicitly educational principles and goals, offering productive resistance and provoking extended inquiry. Without this focus, they will remain conveniences rather than transformative educational resources.
Conclusion: A Call to Action for OpenAI
While OpenAI's new initiatives offer benefits in terms of accessibility and ease of use, they fall woefully short of addressing the deeper educational needs of universities. OpenAI must abandon its oh-so-predictable pursuit of market share and take the bold step of designing AI products that are purposefully built to support and enhance educational outcomes.
It's time for OpenAI to stop playing it safe and start revolutionizing education with tools that not only assist but also inspire, challenge, and cultivate the next generation of thinkers and innovators. Only then can they truly transform the integration of AI into educational frameworks and ensure these technologies enhance the learning experience in profound and meaningful ways.
Nick Potkalitsky, Ph.D.
Check out some of my favorite Substacks:
Terry Underwood’s Learning to Read, Reading to Learn: The most penetrating investigation of the intersections between compositional theory, literacy studies, and AI on the internet!!!
Suzi’s When Life Gives You AI: A cutting-edge exploration of the intersection of computer science, neuroscience, and philosophy
Alejandro Piad Morffis’s Mostly Harmless Ideas: Unmatched investigations into coding, machine learning, computational theory, and practical AI applications
Michael Woudenberg’s Polymathic Being: Polymathic wisdom brought to you every Sunday morning with your first cup of coffee
Rob Nelson’s AI Log: Incredibly deep and insightful essays about AI’s impact on higher ed, society, and culture.
Michael Spencer’s AI Supremacy: The most comprehensive and current analysis of AI news and trends, featuring numerous intriguing guest posts
Daniel Bashir’s The Gradient Podcast: The top interviews with leading AI experts, researchers, developers, and linguists.
Daniel Nest’s Why Try AI?: The most amazing updates on AI tools and techniques
Riccardo Vocca’s The Intelligent Friend: An intriguing examination of the diverse ways AI is transforming our lives and the world around us.
The marketplace in education has always been bloody. ACT is partnering with universities to create superscores for admission, incentivizing multiple retakes and prep.
I quibble with you a bit. Bots may help students better understand analysis at the linguistic level (I’ve not tried any statistical analysis), but I don’t see them as capable of either doing or teaching critical thinking skills. CT in the wild is not a collection of skills; it’s a heightened form of metacognition, which bots can’t do. You are offloading some of the fundamental properties of human teaching to the bot.
This development caught me off guard yesterday. If OpenAI drops its pricing to a level most universities can afford, that would be good in many ways. I have not seen anything firm on that yet, and it may still be on the high side. That could create problems, including equity issues between better funded and less well funded institutions. That is nothing new, but this could exacerbate it. One option I wonder about is treating ChatGPT more like a textbook or class materials and offering it through Inclusive Access when it is needed. Optimally, if a student needed it for more than one class, they would only be charged once. Not sure that is a good or realistic solution, just one of my first reactions.
I do think OpenAI is a very reckless company now and not easily trusted. I am not sure I trust any of the AI companies, though. Some teachers and courses can use these tools effectively as they are, but they are not a panacea and do not fit all use cases. Their real uses may actually be niche, though it is unclear how big those niches are. I am less sanguine than you are about specialized educational AI tools. I've been working in instructional technology in some role since 1998. We will probably see some fairly good tools emerge eventually, but we will see a lot of bad ones and a great many mediocre ones. I have not seen a really good one so far, but I'm literally from Missouri, so show me. Talking to people from the big three textbook/courseware publishers, I have the impression that they are conflicted (after all, they do not want their content consumed) and proceeding cautiously. We should start seeing more AI resources in their products this Fall, either tools from third parties bolted onto their courseware or tools developed in-house for selected titles. Blackboard, of course, is well ahead on AI in its tools. Canvas is moving more slowly. I have not been following Brightspace or Moodle closely. Another complicating factor for all of the players is going to be keeping everything modular, so permissions can be turned on and off as needed by institutions, academic units, or individual professors. As long as AI policy remains decentralized, as it is in many places, conflicts between application feature sets and policies are going to be a problem. We are going to see a lot of trial and error before the major and minor players figure out what is useful and what is not.
The other announcement yesterday that caught my attention is Perplexity's Pages announcement. This effectively makes it an AI blog platform with posts readers can interact with. (https://www.perplexity.ai/hub/blog/perplexity-pages) This is partly aimed at educators. It will not have the impact that ChatGPT Edu will have, but it is a move I had not expected from that company. It makes me wonder about their long-term plans.