27 Comments

The marketplace in education has always been bloody. ACT is partnering with universities to create superscores for admission, incentivizing multiple retakes and test prep.

I quibble with you a bit. Bots may help students better understand analysis at the linguistic level (I've not tried any statistical analysis), but I don't see them as capable of either doing or teaching critical thinking skills. CT in the wild is not a collection of skills. It's a heightened form of metacognition, which bots can't do. You are offloading some of the fundamental properties of human teaching to the bot.

author

Bloody is right. Yes, I can give the bot some credit with linguistics, but even there it is all about skillful prompting. "Out-of-the-box," GPT gives the illusion of analysis all the time, but as you probe deeper, you realize its linguistic "center" is very different from your own. I think GPT performs so well on standardized tests because the analytical experiences are so methodically staged through passage selection, question construction, etc. But in truly "out-of-the-box" contexts, it falters, and students will need some serious skills to get real utility out of these general bots.

Leading me to my point: we need better bots. I don't know about you, but every chatbot I have built has felt like 95% regular model with a little focused functionality sprinkled in. We are using a fire hose to rinse our sinuses!!! Sorry for the ridiculous metaphor. Strangely enough, I have found the fire hose to be quite useful for specific personal and even educational uses. But imagine a classroom where twenty-five 7th graders are armed to the hilt with fire hoses. That is the situation OpenAI has created.

Granted, we will figure out a solution. We teachers always do. But in the meantime, for goodness' sake, slow down the tool rollouts and make something different. OpenAI really hasn't innovated since GPT-4. Everything since has been derivative. It just feels like we are wasting precious time.


Neti pot instead of fire hose?


You miss my point. Here is what you wrote in your post: “it does not inherently cultivate the complex skills required for high-level academic research or critical analysis. Tools designed without these educational principles risk.” You’re right. It doesn’t. It can’t. That’s like asking a hoe to weed your garden. Teachers have a higher-level task, and your language describes part of it. You also discuss “principles of education” bots must be designed to respect. Bots don’t operate on principles. Humans do. Bots operate on algorithms. You’re sounding a bit hyperbolic.

author

Yes, I did miss your point. I am in firehose mode myself. Haha!!!

May 31 · Liked by Nick Potkalitsky

This development caught me off guard yesterday. If OpenAI drops its pricing to a level most universities can afford, that would be good in many ways. I have not seen anything firm on that yet, and it may still be on the high side. That could create problems, including equity issues between better- and less-well-funded institutions. That is nothing new, but this could exacerbate it. One option I wonder about is treating ChatGPT more like a textbook or class materials and offering it through Inclusive Access when it is needed. Optimally, if a student needed it for more than one class, they would only be charged once. Not sure that is a good or realistic solution, just one of my first reactions.

I do think OpenAI is a very reckless company now and not easily trusted. I am not sure I trust any of the AI companies, though. Some teachers and courses can use these tools effectively as they are, but they are not a panacea and do not fit all use cases. Their real uses may actually be niche, though it is unclear how big those niches are. I am less sanguine than you are about specialized educational AI tools. I've been working in instructional technology in some role since 1998. We will probably see some fairly good tools emerge eventually, but we will also see a lot of bad ones and a great many mediocre ones. I have not seen a really good one so far, but I'm literally from Missouri, so show me.

Talking to people from the big three textbook/courseware publishers, I have the impression that they are conflicted (after all, they do not want their content consumed) and proceeding cautiously. We should start seeing more AI resources in their products this fall, either tools from third parties bolted onto their courseware or tools developed in-house for selected titles. Blackboard, of course, is well ahead on AI in its tools. Canvas is moving more slowly. I have not been following Brightspace or Moodle closely.

Another complicating factor for all of the players is going to be keeping everything modular, so permissions can be turned on and off as needed by institutions, academic units, or individual professors. As long as AI policy remains decentralized, as it is in many places, conflicts between application feature sets and policies are going to be a problem. We are going to see a lot of trial and error before the major and minor players figure out what is useful and what is not.

The other announcement yesterday that caught my attention is Perplexity's Pages announcement. This effectively makes it an AI blog platform with posts readers can interact with. (https://www.perplexity.ai/hub/blog/perplexity-pages) This is partly aimed at educators. It will not have the impact that ChatGPT Edu will have, but it is a move I had not expected from that company. It makes me wonder about their long-term plans.

author

Thanks for keeping me in the loop. Yes, I am trying to use the blog for a two-pronged message: 1. We need better tech. 2. Realizing that our education spaces will be flooded with mediocre and bad tech, here is what we do. As always, I appreciate your perspective. The textbook company angle is interesting. Everyone is activating an AI modality, and still there are very few resources for teaching students how to use these tools. Not much money in that business, I guess. I will check out the Perplexity move. The fundamental issue is that none of these models, in and of themselves, is good for research. You have to develop specific skill sets in order to guide the tools toward meaningful research exchanges. That is my big issue with a general tool these days---and particularly with advertising it as if it is ready "out-of-the-box" for specific applications, when really there is a lot of human work between A and Z that isn't being acknowledged, instructed, evaluated, etc.


One thing I realized from conversations and presentations this spring: teaching students to use the tools and teaching the content does not mean that college students, even seniors, can integrate AI skills well with other knowledge. This is at least partially a critical thinking question. It seems that the disconnect may be in figuring out and framing the kinds of questions students need to ask the AI about their project topics to get useful results.


Very much like the analogy with interactive whiteboards. With whiteboards, at least there were some specific use cases...at least they were decent projectors. With generative AI it is more like: here is a mystery box with a weird tool that might be really useful if we can figure out what it will do.

May 31 · Liked by Nick Potkalitsky

I'd like to know which model this is running on. And at the university level? Good luck.

For example, GPT-4 cannot do "data analysis". It can retrieve results based on natural language user queries and it can create data visualizations based on those queries. Analysis? Forget it - it thinks it can, but it really can't. I've tested this many times.

Also, what happens when students research topics with little to no training data? GPT-4 can't say "I don't know". It will spew out something incorrect and irrelevant - I also see this frequently.

With current models, I think this initiative "in action" will be a shit-show.

author

I would guess GPT-4o, as that is the best free model. Yes, analysis with GPT requires human competencies. Human competencies require in-class instruction in AI usage. None of that is included or acknowledged in the plan. This is just a "ChatGPT" plan. The "Education" is happening elsewhere.

May 31 · Liked by Nick Potkalitsky

I guess a more positive spin to put on this is to see the launch for its signaling value: It's OpenAI saying "We want to develop helpful tools for educators, and here's version 0.5"

I agree that in its current state, there doesn't seem to be anything fundamentally radical in the rollout, but it has the chance to build a foundation for future education-oriented tools. They could even piggyback on some of Ethan Mollick's extensive educational GPTs.

author

Yes, I appreciate the optimism. I have had several readers point out to me how small the education AI market is. What lies beneath these comments is the feeling that these companies don't really need to do anything in the AI x Education space, so whatever they are doing is an expression of a genuine desire to be helpful. I think this is a very interesting point. I also think it underestimates the complexity of the AI x Education space, which has at least 2 --- probably more like 3 --- distinctive domains: K-12, college, professional.

I think the college and professional spaces are responding to the recent round of AI x Education announcements very differently than folks in the K-12 space. It is no secret that K-12 is more concerned about how general tools will impact the acquisition and development of basic skills and competencies. The sooner Big Tech can figure this out the better. But here again, money talks. There is more money in the college and professional spaces. Thus, we see OpenAI focusing its initial efforts on universities--expanding university access. They haven't released a K-12-specific notice since last summer. Meanwhile, their policies continue to have a disproportionate impact on the way we do school in K-12. No wonder no one in this space wants to even talk about AI integration. Lots of anger and resentment.


That's an interesting bit of insight, thanks for sharing Nick.

I recognize that as an outsider, I probably do treat "AI in Education" as more of a homogenous umbrella, even though it should be clear that different age groups have very different educational and developmental needs. And my take is undoubtedly colored by people like Ethan Mollick who focuses primarily on the college/university demographic.

I'd actually love to see a deep dive by you about this resentment in K-12 about feeling neglected.


K12 has ALWAYS been neglected. It is a factory, an assembly line. Teachers get pissed when something interferes with the machine. I know. I taught there for ten years. I taught in teacher prep for 17 years, including supervision and evaluation of student teachers.

The big difference between K12 and higher education is who owns the curriculum. Teachers in K12 are out of that fundamental loop. That's why they are angry. Professors own it lock, stock, and barrel, except in Texas and Florida.

May 31 · Liked by Nick Potkalitsky

That's very curious. Are there some fundamental reasons there's such a discrepancy? Is it historical factors, the way things are governed, the organizational charts, the law, etc.?


K12 is governed completely by the State Departments of Education. It is inextricably linked to policies in Education Codes that are huge. Higher education campuses have Presidents, not Superintendents, because faculty self-regulation has centuries of tradition. Public schools were conceived as a public charge (paid for by taxing everyone) as part of the fallout of the Civil War. The Federal Department of Education was opened in 1867.

May 31 · Liked by Nick Potkalitsky

Thanks for sharing. As someone based in Europe, I'm quite removed from this background and the intricacies of the US education system.


I think they are putting the cart before the horse, which is sad. They see they're losing credibility with their antics and drama, so they rush these half-baked ideas out. Unlucky for them, others in the space are moving and adapting quickly.

author

Thanks for your support, Alicia. "Unlucky for them."

May 31 · Liked by Nick Potkalitsky

Uhmmm.... while I understand your opinion, as a university professor who uses and encourages the use of ChatGPT and others in my courses, I disagree. I understand the risks as well as anyone, but I do believe there's no alternative in a world where AI is about to be ubiquitous.

author
May 31 · edited May 31

Hi, Jordi,

I am always excited to hear from successful early adopters.

There are very different perceptions about these rollouts across the educational spectrum.

I try to remind myself that although ubiquity feels like a historical necessity in hindsight, it is actually a historical contingency, and we do have some power over what that ubiquity will look like. My approach in this blog is twofold: 1. Interact with existing tools in order to open pathways for better tools. 2. Admitting the existence of said tools, create the best instructional pathways for my students. So yes, now the work is: since all my 9th grade students have access to these tools, how do I structure a course that helps them continue to build the foundational writing, reading, and thinking skills they will need to succeed in school and beyond? I hope you continue to follow along with this journey. I appreciate your willingness to share your disagreement. Hopefully, we will find more to agree on in the next post.

Jun 8 · Liked by Nick Potkalitsky

The difference that the use of these tools makes at different moments of the educational process is evident. From what I understand, OpenAI Edu is focused on universities, and in that environment there is no other option; it's mandatory. In your case, and that's why I am a subscriber and read you, I share the same concerns. I also have quite a few doubts about when, at what age, and in what way these tools should be introduced into the day-to-day educational process.

I have the feeling that current studies do not have enough scope, meaning that testing the repercussions of this technology and its effects requires some time for evaluation. With the current pace of change, we may end up with conclusions for tools that have already fallen behind in their capabilities.

Difficult ;)

May 31 · Liked by Nick Potkalitsky

Spot on, Nick. Unfortunately, I don't think Sam is listening!

author

Very true. But I am glad you are. And a few others. Small conversations can lead to better implementations. That is what keeps me going these days!!!


There's a reason why "hack" means both a shortcut and a con artist or poser. I've met a ton of credentialed people who don't understand the material they were credentialed under.

This push for AI risks failing the mission of education: turning grads into hacks through hacks.
