Discussion about this post

Guy Wilson

Nick, I can't speak to the K-12 environment; I just don't know enough about what's going on there now or what has gone on there over the last 35 years. I've been thinking a lot this week about the beginnings of the use of the web in higher education. That was one of the last times there was a technology that anyone could use for free that also had such transformative power. Committees and task forces were formed, workshops were held, money and release time were made available for professional development. Eventually, departments like the one in which I work were formed to support its use.

The environment is very different this time around. We have had three decades of web use on campuses. (Netscape Navigator, the first browser for most of us and so perhaps a parallel to ChatGPT, arrived 30 years ago this October.) Most of the use was on desktops. Laptops were not that common yet, certainly not in classrooms, and all connections were wired until 1997. Cellphones were more likely to be seen on TV than on a midwestern campus. Educational technology was largely coming from universities and a few companies. Higher education was becoming more expensive but was still seen by most as a positive good. Privacy and security were minimal concerns. Intellectual property and acceptable use were the main worries. There was very little talk of ethics.

The environment today is far different. Higher education is expensive and often beleaguered. Most educational technology comes from for-profit corporations and startups. Over the last fifteen years or so, venture capital has discovered education. The infusion of money and business practices has created a highly competitive landscape, so that when something trendy and innovative like GenAI comes along, it is immediately incorporated by companies looking for an advantage. (An interesting side note is that the big textbook publishers, most of whom have had other forms of AI in their courseware for years, have been more cautious and slower off the mark with GenAI.)

Laws and the interpretation of laws have changed. So have accreditation standards and the campus policy environment. Privacy, security, intellectual property, and acceptable use have become much more important over the last thirty years. Courses, especially those offered only online, have to follow design guidelines, and faculty may have to undergo online teaching certification. There are larger numbers of instructional designers to provide advice and instructional technologists to provide support. In addition to training from the university on how to use the technologies, there are trainers from all of the educational technology companies, all offering advice and guidance, which may be oriented towards a particular set of design principles or the way a particular product can be used. In the best cases, the faculty voice is heard. In the worst, and there are a few of these coming from ed tech companies, the technology forces the instructor to teach in a certain way.

In the wake of the Pandemic, many instructors are more used to asking for and accepting advice on how to teach with technology than before. I think that is significant too. There are still plenty of experimenters, don't get me wrong; faculty have all kinds of skill levels and degrees of comfort with using technology in their teaching. But the Pandemic does feel like a turning point in what they are willing to accept.

Something else that is different today is fear. There was some fear in the early days of the web, but mostly wonder. The first wave of reaction to ChatGPT by professors was different. My involvement began with cheating and plagiarism concerns. I am the lead support for Turnitin for our university system, and at that time I was also the lead on automated proctoring software. Both had already exposed me to a lot of ethical concerns about the products we were using, and the proctoring programs introduced me to algorithmic bias and the coded gaze in 2020. I got involved in AI because of the fears and ethical concerns that were already manifest in the December and January following ChatGPT's release.

This is new in my experience. I think it is partly shaped by what's going on with Generative AI. The range and extent of the fears have changed in the last 18 months. While there is still a lot of concern about plagiarism and cheating, broader ethical questions have come to the fore. Concerns about bias are much more prevalent. So are issues of equity and threats to creativity, intellectual property, privacy, and security, as well as to society, the economy, democracy, the environment, and the climate.

All of this is much different from the early days of the web on campus. It means more policies and guidelines. For some it means more acceptance of the ways an application allows one to teach. For others it means more experimentation. Overall, it means even more attention paid to technology in teaching and learning than before.

I think we have been fortunate on our campuses to see a lot of room made for individuals and departments to work through policies and uses. We have also had fruitful collaboration between professors and instructional designers in making the technology beneficial to classes.

Now that more policies and guidelines are being introduced, there is an attempt being made to balance the freedom to experiment (or even reject AI) with the need to maintain quality and uphold ethical and legal standards. There is a lot going on. It is exciting, worrying, and often frustrating. I can see that in such an environment there is a temptation to dictate and hope that it can be held at bay.

Tom Daccord

Very thoughtful piece, Nick. My experience leading workshops tells me that if you provide opportunities early for teachers to express their concerns, misgivings, hesitations, or (in this case) successes, you create an environment of openness and sharing that lays a strong foundation for teacher engagement. And the more teachers actively engage with a technology, the more likely they are to develop the confidence, comfort, and understanding needed to integrate it.

