Thanks for the article, David. I suppose we need to create the conditions for meaningful cognitive engagement where the struggle is worth it.
I worked in a school that destroyed the groundwork for meaningful struggle when administration became obsessed with data and metrics.
It pushed us into these manic modes of better instructional practice to the detriment of meaningful thinking practices. The idea that it has "meaning if it can be measured" will plague our educational systems until disruption arrives. That in itself is a "competence without comprehension" and it happens even without AI's shadow.
Being obsessed with data and metrics is not a great approach. I wrote a bit about this here: https://daviddidau.substack.com/p/the-cost-of-borrowed-thought?r=18455
I think you meant to give another link?
Apologies: https://daviddidau.substack.com/p/mock-exams-regression-to-the-mean?r=18455
If you are interested, I wrote about this a couple of weeks ago.
https://open.substack.com/pub/betterthinkers/p/break-out-of-the-operational-vortex?utm_source=share&utm_medium=android&r=5jekme
Thank you. I might add that we don't only make wrong inferences about student learning from performances, but also about teachers from their teaching, which of course can be accurate but also misleading. Thanks again.
Thanks for this comment. I generally don't look upon the past that fondly, but as far as schooling goes, it does seem that we lost something along the way. Only recently, almost middle-aged, I came across "first principles thinking" for the first time. I'm pretty sure I wasn't taught that way.
This was a great read. I also wrote on this study late last week...https://open.substack.com/pub/jdweber/p/when-thinking-gets-outsourced?r=4hwsmu&utm_campaign=post&utm_medium=web
Great piece, David. That study lines up with something I argued recently in "AI Didn’t Invent Cheating. It Just Made It Irresistible": LLMs don’t short-circuit learning if students have already done the hard cognitive work of grappling with the problem. That’s why we need both: good AI tools *and* teachers who know how to scaffold curiosity and judgment.
Hi Giorgio - Yes, this isn't really about AI. It's important to acknowledge that avoiding effort seems an essential human quality. So, to that end, I think what we need is teachers who are committed to expending effort on thinking about their curriculum and the lessons they teach. If teachers aren't prepared to think, it's hard to see why their students will.
I’m unsure schools ever did much to encourage true learning. Who wouldn’t blush to discover all those hundreds of exams that one could no longer pass?
A few months back, out of the blue, I remembered how I never really understood the concept of a mole from chemistry. I asked AI and received the standard definition. But I pushed back in a way that I never could in class. Why carbon-12? Why that number? Why not more? Why not less? Why? Why? Why? I refused to accept anything, a bit like a stubborn kid. And when I understood the concept, I asked "so what?". But when my questions finally ran dry, I really understood what a mole was and why we needed it. That kind of knowledge just wasn't the goal of my chemistry class. And somewhere in that little story, I think, is the grain of some way forward. I really wish I were a student today.
Hi Kit - the way you describe how you use AI is similar to the way I use it. There's no doubt that used in this way it can enhance our ability to think. The trouble is, most of us, most of the time, avoid having to think.
What if the real risk isn’t losing thinking, but losing the capacity to tell whether a thought is truly yours?
Hasn't that always been an issue? I constantly integrate thoughts from books or from conversations into my world view. I've little doubt I've forgotten the origin of many of the things I think. I'm not sure this is a risk.
Hi David, yes it has, but with AI it's now much more relevant. Integrating thoughts is exactly the way to prevent this. The problem is that many people use AI to solve the problem for them, rather than to think about it and help them solve it. That's a huge difference. You wrote "thoughts", which means you thought 😊