The Future of School
Technology has changed; so should school
School hasn’t changed much since I was a kid. My kindergarten self would feel right at home in my son’s current classroom. Mini-me might not understand what a smartboard is (apparently it’s the latest and greatest version of an overhead projector), but the math posters on the wall and the reading booklets in the bins and the handmade crafts hanging from the ceiling are more or less the same, if a bit sleeker and shinier. My daughter’s third grade class isn’t much different. Sure, they take standardized tests on a computer instead of with #2 pencils, and they have laptops in the classroom instead of marching down to a computer lab full of luggage-sized PCs, but the classroom and the curriculum and the pedagogy are the same. Maybe it’s different in high school. It should be.
The internet never quite got around to revolutionizing school the way it could have. We haven’t been able to escape the gravitational pull of memorization and multiple-choice. The purpose of school has always been to know things, not so much because it’s the best pursuit, but because it’s easy to measure the progress. Of course, it’s necessary to have a certain base of knowledge and skills, like the ability to read, and it’s nice to have more facts to work with, but true education is less about finding out what you know and more about knowing how to find it out. A drop of curiosity is more valuable than the date of every battle in history, and an ounce of critical thinking is worth more than the multiplication, periodic, and spreadsheet tables combined, because curiosity and critical thinking can get you all of that and more. But we don’t know how to teach curiosity or critical thinking, much less test for them, so we hope that if we force kids to learn enough stuff they’ll get the general idea at some point.
That used to work a lot better when facts were still hard to come by, when if you wanted to know something you had to search through a book or listen to a lecture and pick out the pieces you needed, when learning to do something meant asking someone to help you or trying and failing to do it on your own with the barest instructions. Then came the internet, and all the world’s information, by design, became free and easy. Just type in what you need. Just watch a video showing you exactly what to do. You can find an answer to anything without ever having to think about it. That’s an incredibly powerful tool, and it can do a whole lot of good if used correctly, but it will keep you from ever learning anything at all if you let it. Why bother putting a fact in your head when you have it in your pocket? Why waste time and energy thinking about an answer when someone else has already posted it online?
It’s a real threat, but only if we don’t account for it. Plato made a similar argument about writing, a few thousand years ago, but we’ve gotten along just fine since then. Our memory may not be as good as it once was, when humans regularly memorized epic poetry, but we’ve used the advantages of writing in retaining and transmitting knowledge to far exceed our individual capacity. The same can be true of modern technology.
Our educational system should have adapted to this reality long ago, but as far as I can tell, schools have treated technology more like an advanced textbook or notepad than the world-defining revolution it is. They’re fighting against the technology, trying to mitigate its disruption and fit it into the old frameworks more than teaching kids to use its unique advantages and overcome its limitations. Technology doesn’t assist with recall and research so much as obviate it. It’s so easy and simple as to be nearly irrelevant. A single-answer test, whether multiple-choice or fill-in-the-blank, is a moldering zombie. Google killed it. And AI is coming for the essay.
What kids (or adults) need today is not the retention of more information, which is becoming impossible as we accelerate its production, or even its summarization, which AI can now do, but the ability to analyze the vast amounts of data we encounter daily, recognize what we are missing, find what we need, discern what is good and valuable and accurate, and organize it into a coherent understanding. They must be able to do the quintessential human task of finding meaning, which is the way the abstract information in a computer interacts with the individual person and the material world. That’s always been important, of course, but before, you had to struggle with the facts first. Now we’ve gotten the worst of both systems. People’s ability to read and recall has naturally atrophied, but they haven’t yet learned to use the tools we have to make up for the loss. So we’re stuck pretending society’s problems are a lack of information. The opposite is true.
The last time I was in school was for a graduate-level seminar a few years ago. Since the group of students was small, the professor invited anyone from the community to join. The reading list was fairly hefty: over the course of the semester, maybe five or six full books (not heavy academic texts, but not exactly beach reads either), plus a few other assorted essays and such. The class met one night a week for a few hours. It was a real course that gave real credit to the actual students, and the professor made it clear he expected everyone to treat it as if they were taking the course for credit (if not for the final exams). About 50 people showed up anyway.
The course was great. The topic was relevant. The books were fascinating. The professor was engaging. The lectures were compelling. The discussion was…not. One thing became very clear very quickly: no one had done the reading. The professor would ask a question, often calling for someone to explain a particular point in one of the books, and the room would go silent. After a satisfying few awkward moments, I would slowly raise my hand. I hate being the one to know all the answers (that’s not true, I love it, but I hate other people thinking that about me), but I was the only one, apparently, who’d done the reading or could remember what I’d read. But then, I’m a reader. I’d read a couple of the books already, and I read them again for the course. So I’d give the answer, and that would get the discussion going for a few minutes, until the next question.
I could tell the professor was a little frustrated. We got a different kind of lecture one night, if you know what I mean. But I think the real problem wasn’t with lazy students so much as the changing technological landscape. The professor wanted everyone to have knowledge of the material in the books, to prove they’d read them and inform the discussion, so he asked questions to test that knowledge, but that’s not how information is used anymore. Information is no longer something you must memorize in order to keep handy at all times. Information is now something you can look up when you need it and forget when you don’t. The new learning environment should present the students with the facts first, rather than trying to test their recollection, and then ask them to analyze those facts.
From a certain perspective (such as that of an old-school professor), that’s a regrettable change. I wish people would read and remember the books, too—they’d be better off for it—but longform reading and remembering are unfortunately not the essential skills they once were. The most essential skill now is taking the information presented to you and understanding what it means in that limited context. Ultimately, then, education becomes about training the sentiment, the instinct for what is true and right and good, which can be applied to any set of facts without the need to reference vast stores of internal knowledge. When the professor asked for feedback, I emailed him this advice. I don’t know if he took it, but I think he’d be a lot less frustrated if he did. I can’t always be there to answer every question.
Thanks for reading mostlyDad!