I haven't thought about this in terms of the efficiency-accountability spectrum, but it jibes well with something I was thinking of lately. I find that, with certain AI tools, I don't end up naturally incorporating them into my routine precisely because doing so would require "letting go" of some of my processes or control over the outcome.
So what happens as we get more comfortable with a given tool is that we necessarily let go of some control as well. In turn, this then leads to a loss of some accountability as we let the tool increasingly take over.
You got it, Daniel!!! We are grappling with human nature here. No easy solutions on the horizon. I like how this research suggests that even AI literacy initiatives---ostensibly, efforts to route us toward more reflective, System 2 thinking---can in turn bolster the authority of the model and thus lead to greater reliance. If anyone offers you an easy fix to the tradeoff, they are probably selling AI Snake Oil.
Absolutely! Just reposted to our Educational Strategy and Design Group LinkedIn audience. This conversation needs to be happening not only in higher education institutions but also with accreditors.
There is, I think, a touchpoint here with attention issues and distraction. For example, Nir Eyal has written about how effective technology gets us hooked, and he has also written about self-practices to help us become indistractable.
Nir Eyal called his latest book "Indistractable," and it outlines key self-practices to enhance focus and minimize distractions:
1. Understand Internal Triggers: Recognize that distractions often stem from emotional discomfort, such as boredom or anxiety, rather than external factors like technology.
2. Pain Management: Eyal emphasizes that effective time management involves managing pain associated with these triggers.
3. Four-Step Model: He proposes a structured approach to regain control over attention, which includes identifying triggers, making time for traction, removing distractions, and reflecting on progress.
4. Self-Compassion: Practicing self-compassion can significantly improve productivity and focus.
Perhaps not all relevant, but some interesting parallels.
This is amazing, Nigel. I really like the point about self-compassion. We are all caught up in extremely complicated use cycles --- the world just keeps accelerating. These four points will be very helpful as I start to build up a response that focuses more on student mental health and well-being. Thank you!!!
You're welcome, Nick. I love what you're doing and the messages you send out. (Really looking forward to reading and reviewing your book!)
I think there are many approaches in different fields that resonate with Eyal's and yours. I think healthy concerns over AI will serve as a nexus for these mindful self-practices. There is the real potential for the deterioration of human thinking and communication skills, but there is also the real potential for a renaissance of what it means to be human -- partly from resisting AI but also from embracing it critically, with discipline as a curiosity driver. :)
That's curious research!