Discussion about this post

Michael Woudenberg:

Good insights here. More than the tools, teachers need workflows for those tools.

Guy Wilson:

The issues in higher ed are a little different, often only as a matter of degree. One of the biggest differences is for institutions that host hospitals and clinics. The privacy of medical records (HIPAA data if you are in the US) requires a higher level of security, and because some of our software (e.g., O365) crosses academic and administrative boundaries, a tremendous amount of due diligence is required to ensure that data does not get sucked into Copilot-type applications.

Another thing that makes for a degree of difference is the range of requirements across the variety of subjects that are taught. While we are beginning to see some basic principles of teaching and learning with AI in both the core curriculum and individual subjects, this is still in its early stages. An AI orientation course may be important to transition students to using it in college, and AI certificates (to allow them to explore it more deeply) are also emerging. The ways that AI may be used in different fields (say Computer Science, Nursing, Journalism, Art, Physics, and English) are as different as those areas of study.

The needs for accuracy, and the consequences of mistaken information, data breaches, and ethical lapses, are all much higher than is usual in K12. If, for instance, a nursing student learns the wrong thing from an AI, that is more likely to have serious consequences for others than if a middle schooler does. Of course, if a middle school's AI manages to systematically indoctrinate students into a particular way of thinking, that could also have broader implications for the students and their community.

One thing I have not seen addressed much about AI tutors and teachers is that they can be hacked in ways that humans cannot. Also, unless we develop AIs with theories of mind and a good understanding of the physical and social world, they will not be able to relate to students in the same way that a human teacher would.

In terms of dealing with companies, I agree with Nick that educational technology companies are more likely to provide safe and reliable AI for students than the AI giants. One thing to consider is the extent to which they are developing applications at the same time as institutional policies that may limit the use of some functions are being developed. I have already run into this a few times. The developers need to build in fine-grained settings to enable or disable functions at an institutional or course level - preferably both.

As for the AI giants, almost all of them have exhibited unethical or questionable behavior. This raises concerns about whether use of their tools can ever be ethical. Their often lax approach to privacy and security, let alone accuracy, means a level of vigilance is required of institutions that is greater than before. Exercising that vigilance is taking a great deal of time and effort. In the short term, it is largely being done by individuals and departments taking on extra work or pushing back other projects. In the long run, it is going to require additional human, fiscal, and technical resources.

The behavior and statements of the AI giants so far are not actually promising for education. Some of them, or at least their cheerleaders, seem to have teachers and professors in their sights - just more jobs to be deskilled or eliminated. Some see schools, colleges, and universities withering on the vine as everyone gets a personalized AI tutor. The broader societal implications of that would be large. On the other hand, for a company to capture most of those institutions would be a huge windfall in revenue, influence, and long-term power. Some of the statements from these companies or their leaders raise questions about the nature of this game. We do need to treat them as companies - though we need to think about the big players as what they are, cloud capitalists, which operate very differently from other corporations. We also need to realize that some of these companies' stated aims indicate they are playing a very high-stakes game for control of the future.

What Nick and others are starting to do is introduce educators and those who care about education to a different way of understanding education. They are leading us to consider things we might not consider otherwise, or that only a handful of staff in a school district, community college, or university have had to deal with in the past. They are also pointing to the ways that AI is changing the larger contexts in which education and educational institutions operate.
