For those of you who prefer to read off paper rather than the screen, we have converted the post into an easily printable pdf file. If you appreciate this, please consider supporting our work by upgrading to a paid subscription. Thank you:)
The motion picture is destined to revolutionize our educational system and…in a few years it will supplant largely, if not entirely, the use of textbooks.
—Thomas Edison, 1922
A bat and a ball cost $1.10 in total. The bat costs $1.00 more than the ball. How much does the ball cost?
For many people, the instinctive answer is “10 cents”. It sounds logical, even obvious. But it’s wrong. If the ball costs 10 cents, then the bat (which costs a dollar more) would be $1.10, and together they would add up to $1.20.
The correct answer is 5 cents. Mathematically, it’s not complicated; you only need to slow down and really think about what you’re being asked. But that’s the difficulty. We don’t like to slow down. Thinking is effortful, unpleasant, and we often prefer to go with our immediate feeling.
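For anyone who wants the tortoise-brain version spelled out, here is the algebra behind the answer, with b standing for the price of the ball (our shorthand, not part of the original puzzle):

$$
\begin{aligned}
b + (b + 1.00) &= 1.10 \\
2b &= 0.10 \\
b &= 0.05
\end{aligned}
$$

So the ball costs 5 cents and the bat costs $1.05, which together make $1.10.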
Try these next questions. As you read them, notice what your initial or “fast” answer is, and then slow down and see if you can recognize how this initial fast answer might be misleading:
A father and his son are in a car accident. The father dies. The boy is rushed to surgery. The surgeon says, "I can’t operate on this boy—he’s my son!" How is this possible?
You are running a marathon, and manage to catch up and pass the person in second place. What place are you in?
Emily’s mother has four children. Their names are “April”, “May”, “June”, and—? What is the name of the fourth child?
If you still aren’t sure about the answers, you can check them in this footnote.1
Recently, we attended a live talk by Derek Muller from the Veritasium channel. The topic was AI and education, and knowing almost nothing about Muller—except that he had a YouTube channel with over 17 million subscribers—we expected to hear a lot of gung-ho cheerleading about how AI was going to revolutionize learning in the classroom.
But as we listened, we became intrigued, and then we were stunned.
For much of his talk, Muller didn’t even focus on AI, but on how people think. Drawing on the psychological work of Daniel Kahneman2, he pointed out that our thought processes can be viewed in terms of two separate systems. They’re often labeled as System 1, or the fast system, and System 2, or the slow system.
Or, we can think of them as the hare-brain and the tortoise-brain.
The revolution that keeps failing
When we learn to drive for the first time, we rely on our tortoise brain. We think slowly and deliberately about each step, like starting the engine, putting the vehicle into gear, accelerating properly, and braking at the right moment. The process is effortful, controlled, and requires a high level of conscious attention.
But once we learn to drive, we barely notice any of these steps. We’re not the tortoise anymore; now we’re the hare, signaling and accelerating and braking so effortlessly that we barely notice we’re doing it. Apart from emergencies, or when traffic conditions are complicated or change unexpectedly, driving doesn’t require intense attention.
We might call driving a “hare-brained” activity, not in the sense of being crazy or foolish, but in the sense of being quick, well-learned, and automatic. The same is true of reading, recognizing faces, and recalling memorized facts like multiplication tables or how to make a good ratatouille.
So what does this have to do with AI and education?
It’s often assumed that AI will “revolutionize” education, yet Muller reminds us that this kind of hype isn’t new. In the 1920s, Thomas Edison thought motion pictures would transform education. He even believed that education from a textbook was only 2% efficient, whereas education from a motion picture would be 100% efficient—meaning Hollywood should have become Harvard.
Which definitely didn’t happen. Then, in the 1930s, many believed that rather than having a teacher in every classroom, lessons could be taught through radio, allowing a single teacher to reach thousands of students at the same time. In the 1950s, TV was going to revolutionize education. In the 1980s we thought it would be computers. It was video disks in the 1990s, and then MOOCs (Massive Open Online Courses) in the 2010s.
But there was still no revolution. Why not?
Some might believe it’s because of the inertia of educational institutions—a refusal to embrace change. Others might believe that past technologies were simply over-hyped, and that AI might actually live up to the hype.
Or maybe, Muller suggests, there’s another reason the ed-tech revolution keeps failing?
This is where we come back to the two systems of thought. There’s something essential about System 2—that slow, tortoise-brained part of our mind. Effortful thinking is difficult, and it works against the grain of our own laziness, but there’s no way to convert what we learn into fast System 1 thinking without first going through System 2.
Cognitively, if we want to be hares, then we must begin as tortoises.
Or as Muller put it, “you should ‘suck’ at the beginning” when learning a new skill.
The problem with using technology to revolutionize education is that the idea is always premised on making learning so much easier. At last, no more slogging through System 2! No more being a tortoise, no more huffing and puffing through endless Latin conjugations, Shakespearean soliloquies, and calculus operations! No, our technology is going to hop, skip and jump us through all that sweat and tedium—yet in the end, somehow, we’ll still be experts, right?
Except we won’t be. We can’t develop System 1 if we bypass System 2.
Knowing and effort
We once knew a mathematician who taught at a university. He regularly complained that he would ask his students to learn certain equations, but to his dismay, they would come back to class having done something slightly different. Rather than learning an equation, they would learn about the equation. “No,” he would tell them in great frustration, “I don’t want you to know about the equation, I want you to know the equation.”
We can ask ChatGPT anything, or just look it up on Wikipedia. When did the Norman conquest of England happen? When did Christopher Columbus arrive in the Americas? But knowing where we can find information, or recognizing it when we see it, isn’t the same as knowing the information.
And knowing matters, because knowledge builds on knowledge.
Try to remember these 16 digits. Read them just once, then see how many you can recall:
6 6 0 1 2 9 4 1 5 4 9 1 1 0 0 2
If you struggled to recall all the numbers, don’t worry. A typical memory span for this kind of task is around 5 to 9 digits.
If you used your tortoise brain and were allowed to repeat the digits several times, you might recall them. But would you still recall them tomorrow, or next week, or ten years from now?
Actually, you might, if you developed a strategy. For example, we could associate the digits with meanings.
Here are the same numbers again, with a minor change: they have been grouped into four sets of four, with the digits in each set reversed and a hint after each group in parentheses. Scan through and see if you can remember them now:
1066 (Norman Conquest of England)
1492 (Columbus arrives in the Americas)
1945 (end of World War II)
2001 (year of 9/11)
Of course, the strategy for grouping the numbers according to historical meaning might not have worked if we didn’t know when the Norman Conquest happened, when World War II ended, or the other dates. And yet, a lot of our knowledge does work this way. We learn something, and then we build on top of what we learned, and build on top again, on and on, endlessly converting slow System 2 thinking into fast System 1 knowledge.
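For the programmers among our readers, the regrouping trick is mechanical enough to sketch in a few lines of Python. This is only a toy illustration of the chunking idea; the digit string and date cues are simply the ones used above.

```python
# Toy illustration of "chunking": turn 16 hard-to-remember digits into
# four meaningful four-digit years by grouping and reversing each group.
digits = "6601294154911002"

# Cues for each year, taken from the list above.
cues = {
    "1066": "Norman Conquest of England",
    "1492": "Columbus arrives in the Americas",
    "1945": "end of World War II",
    "2001": "year of 9/11",
}

# Split into groups of four digits and reverse each group.
years = [digits[i:i + 4][::-1] for i in range(0, len(digits), 4)]

for year in years:
    print(year, "-", cues.get(year, "no cue found"))
```

Sixteen arbitrary digits collapse into four familiar dates, which is exactly why the second version is so much easier to hold in memory.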
Repeat effortful practice until mastery, in Muller’s words.
Which means there’s a danger in using AI or any technology in education if it causes us to skip the effort.
The decline of depth
According to the National Center for Education Statistics, reading ability among US 13-year-olds reached its peak around 2012, then began a gradual decline that was hastened by the Covid-19 pandemic. As of 2023, 13-year-olds scored only slightly higher than their same-age peers back in 1971. What is going on?
Research suggests that the brain encodes meaning more deeply when we read print rather than digital text. Of course, 2012—the year the reading decline measurably began—wasn’t the year when computer screens were invented, but the early 2010s are considered a critical period of tech history, when smartphones and social media platforms really started to take off. As commentator John MacArthur points out,
For more than a decade, social scientists, including the Norwegian scholar Anne Mangen, have been reporting on the superiority of reading comprehension and retention on paper…But the work of Mangen and others hasn’t influenced local school boards, such as Houston’s, which keep throwing out printed books and closing libraries in favor of digital teaching programs and Google Chromebooks. Drunk on the magical realism and exaggerated promises of the “digital revolution”, school districts around the country are eagerly converting to computerized test-taking and screen-reading programs at the precise moment when rigorous scientific research is showing that the old-fashioned paper method is better for teaching children how to read.
When we read a sentence, like this one, we use System 1 to read at the word-by-word level. Yet, if you’ve been reading this essay from the very start, without scrolling up and down but trying to follow the underlying ideas in greater depth, then you’re using System 2—the tortoise-mind that is so vulnerable to disruption by digital technology.
Too often, our digital devices hijack our attention through constant multitasking while infiltrating our dopamine system with intermittent bursts of stimulation. No wonder it’s hard to read books deeply. Why bother when, compared to the slot-machine joys of a smartphone, it takes so much focus and effort?
And the impact of our inability to read deeply isn’t just theoretical, isn’t just educational. It has diminished the collective mind of a civilization. As observed by Adam Garfinkle:
A greater percentage of Americans may be deep literate in 2019 than in 1819 or 1919, but probably not than in 1949, before television, the internet, and the iPhone. We have reached a stage at which many professors dare not assign entire books or large parts of moderately challenging ones to undergraduates because they know they won’t read them.3 And while more Americans are graduating from four-year colleges than ever before, the educational standards of many of those institutions, and the distribution of study away from the humanities and social sciences, suggest that a concomitant rise in deep literacy has gone unrealized as the degree factories churn.
Deep reading is just one example of a System 2 mental skill that has been impacted negatively by our devices. Other learning skills have also declined.
While AI is a new technology, many public schools are already integrating AI-powered programs such as IXL or Khan Academy’s Khanmigo, and an online charter school in Arizona even has plans to “prioritize AI in its content delivery model”, using teachers merely as “guides” to oversee progress.
UNESCO reports that “there is little robust evidence on digital technology’s added benefit to education”, and there is indeed overwhelming evidence that EdTech has been a failure4, and even a tragedy. One writer cited a paper by Microsoft researchers who found that the use of generative AI “can result in the deterioration of cognitive faculties that ought to be preserved”:

[A] key irony of automation is that by mechanising routine tasks and leaving exception-handling to the human user, you deprive the user of the routine opportunities to practice their judgement and strengthen their cognitive musculature, leaving them atrophied and unprepared when the exceptions do arise.
Based on these findings, should we expect that putting EdTech on AI-steroids will result in better learning outcomes?
We can’t know all the ways that AI might be deployed in education settings, but one thing is fairly certain: the more AI interferes with student effort and focused attention, the more likely it is to diminish their learning.
The learning equation
“Better than a thousand days of diligent study is one day with a great teacher.”
– Japanese proverb
Our technologies are often said to “disrupt” society, in the positive sense of innovating. But as Derek Muller points out, while we can be optimistic about how technology might disrupt an area like healthcare, it’s harder to be optimistic about how it might disrupt education.
Should education even be disrupted? What if there is nothing wrong with the fact that we need human teachers, physical books, and classrooms of students who sometimes struggle with learning, but mostly, eventually, figure things out, develop their tortoise legs, and start to run like hares? Maybe we’ve hit the ceiling with how easy education can be, or ought to be?
It’s not that we can’t make any changes to teaching and learning, but what if the real innovation isn’t at the level of technology, but of people?
This gets closer to the heart of the problem. Learning is about more than just getting information and skills into somebody’s head. It happens in the context of relationships.
And relationships cannot be mechanized.5
We have become so ensnared in the machine paradigm of easier, faster, automated, that we fail to realize that, while this may work for car parts, it does not transfer to teaching students.
One educator who often engages with the challenge of using AI in education observes:

…only a human can see students and engage with them in the context of a relationship; to encourage them to keep going even when they encounter struggle and want to give up. The world's most advanced personalized generative AI chatbot is fundamentally unable to replace the personal connection offered in the relationship between the teacher and student. ChatGPT or Khanmigo might be able to give me platitudes or encouraging messages to urge me to keep going even when I get a wrong answer, but it’s not genuine.6
And yet, some people still wonder how we might make the relational context of learning faster or more efficient. During Muller’s talk, an audience member asked how teachers can “scale” a personal connection when teaching large classes. Muller’s reply was one of the highlights of the evening:
“How do you scale a personal trainer? How do you scale a plumber? How do you scale an electrician?
You don’t.
You just have lots of them. And I think that’s the answer; I think that’s the solution.
The goal should always be to get more of them [teachers] and make them better.”
Mic drop. We were stunned and exhilarated to hear such a sensible, hopeful response. So were the people around us.
We ended up speaking with several teachers who for years have been assured that the EdTech Goliath would render them obsolete7, and they were relieved to hear somebody saying what they’ve known all along: you cannot downgrade the human and relational context of learning without downgrading the quality of learning.
This was also a hopeful message for young and aspiring teachers, including some of our own children. The educational revolution is not coming. It already happened—a long time ago. In Muller’s words:
“We may have already found the best thing: Being in a room with other people, other learners, a teacher, and some time to talk.”
Expressed more succinctly, and taking into account everything above, we can think of learning in terms of the following equation8:
To learn something, we need to use cognitive effort, and we need to do it through human relationships; and since we can only absorb so much information at once, we need to present just enough information to challenge students but not to overwhelm them.
Then repeat until mastery, and keep doing it.
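As a rough shorthand of the sentence above (our own compressed restatement, not an exact formula from the post or the talk), it might be written as:

$$
\text{learning} \approx \text{cognitive effort} \times \text{human relationship} \times \text{right-sized challenge}, \ \text{repeated until mastery}
$$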
Or, to express it more beautifully, it seems that the educational model depicted in Raphael’s School of Athens got things right over two thousand years ago.
What are your thoughts on using AI in classrooms?
What was your best learning experience in an educational setting?
Could a humanoid AI robot replace a teacher?
Please share your reflections in the comments!
We are happy to offer this post to you for free, but it took a lot of time and effort :) If you found this post helpful (or hopeful), please consider supporting our work by becoming a paid subscriber, or simply show your appreciation with a like, restack, or share.
About the Authors
Peco Gaskovski is the author of Exogenesis and also writes a newsletter about being human in an age of acceleration. Ruth Gaskovski is a home educator and polyglot who loves long classic novels. Together they explore how to navigate the impact of technology on daily life at School of the Unconformed. As Swiss-Canadian dual nationals, they make their home on the borderlands of Mennonite country in Canada.
If the ideas we write about resonate with you, why not consider joining us for a most extraordinary extended conversation? Come and join us on a Pilgrimage out of the Machine as we walk the Camino in Spain from June 14th to 24th. Register today to save your spot! See here for details and download the brochure here.
You can watch Derek Muller’s full talk, delivered at the Perimeter Institute9, here:
Further Reading
Humane Learning in a Machine Age: A Professor’s Resolutions by Dr. Ben Reinhard in Hearth and Field
The Most Compelling Argument Against Tech in Schools
The Future of Education is Not Personalized and How a Certain Scheme to Improve the Human Condition Will Fail
Education Requires Human Connection
Big Tech Hubris and Greed Behind Digital Education Failure
Luddite Pedagogy: It’s OK to Ignore AI in Your Teaching by Brad East in The Chronicle of Higher Education
The Average College Student Today (this post has over 1000 comments and 1000 restacks, which probably means it’s worth your time)
What’s Happening to Students
What Students Lose When They Read AI-Adapted Texts
In One Year, He Watched 16,000 YouTube Videos – During Class!
How is this possible? The surgeon is the boy’s mother.
What place are you in? You are in second place.
What is the name of the fourth child? Emily.
Actually, the work of both Daniel Kahneman and his collaborator Amos Tversky. Kahneman won the Nobel Prize for their work in 2002, but Tversky had died a few years earlier and the prize is not awarded posthumously. Kahneman nevertheless commented, “It is a joint prize. We were twinned for more than a decade.”
See The Average College Student Today for a professor’s perspective that has clearly struck a nerve.
The Most Compelling Argument Against Tech in Schools: “The OECD found that most EdTech ‘has not delivered the academic benefits once promised’, and that ‘students who use computers very frequently at school do a lot worse in most learning outcomes’…The Karolinska Institute in Sweden recently published research concluding that ‘there’s clear scientific evidence that digital tools impair rather than enhance learning’. Sweden has taken note and been the first country to kick tech out of the classroom, re-investing in books, paper and pens. They had the courage to admit that EdTech was a ‘failed experiment’.”
According to a Forbes survey, 62% of educators noted that they are concerned that AI reduces human interaction in learning.
Bill Gates recently predicted that, within ten years, AI will replace teachers.
Muller pointed out that when writing is more difficult to read (e.g. on a very crumpled page) our thinking process slows down, making it less automated. Testing if this will work with handwriting here.