Opinion | A.I. Can Be Really Dumb. But It’s Still a Good Tutor.
I understand why parents are unhappy with the proliferation of computers in school, as my Opinion colleague Jessica Grose documented in a recent series of newsletters. “One way or another,” she wrote in one of the pieces, “we’ve allowed Big Tech’s tentacles into absolutely every aspect of our children’s education, with very little oversight and no real proof that their devices or programs improve educational outcomes.”
I’m optimistic, though, that artificial intelligence will turn (some) haters into fans. A.I. can custom-make lessons based on each student’s ability, learning style and even outside interests. For example, imagine teaching ratios by showing a Yankees fan how to update Aaron Judge’s batting average. A.I. can also give teachers and parents the detailed information they need to help their young charges more effectively.
As I wrote last month, there’s a risk that A.I. will substitute for human labor and eventually render us all superfluous. How much better it would be for us to have A.I. teach us, developing our strengths so that A.I. becomes a complement to, rather than a substitute for, human labor.
There are no guarantees that this nice vision will pan out, of course. At worst A.I. could turn into just a more sophisticated drill machine or an electronic babysitter. My bottom line for it is the same as Grose’s, which is that the use of devices and apps in school “should be driven by educators, not tech companies.”
I’ve never taught, so I checked out this newsletter with two of my nieces, both of whom are science teachers. One teaches at a middle school in Connecticut, the other at a high school in the Los Angeles Unified School District. More from them in a minute.
I admit that A.I. is probably more of a headache than a boon to educators right now. Generative A.I. — which can write a perfectly grammatical essay on any topic in seconds — is ideal for cheating. That shouldn’t be much of a problem in the long run, though, because A.I. is really good at detecting material generated by A.I. (It takes a thief to catch a thief.)
Another problem with A.I. in education is that it sometimes gets the answers wrong. When a reporter for The Wall Street Journal tested Khan Academy’s A.I.-powered tutor, Khanmigo, earlier this year, the software miscalculated subtraction problems such as 343 minus 17.
Or take this incomprehensible response by ChatGPT when asked how a man and a goat who have a boat can cross a river:
The man takes the goat across the river first, leaving the boat on the original side. Then, the man returns alone with the boat. He leaves the goat on the other side and takes the boat back to the original side. Finally, he takes the cabbage across the river.
That farcical paragraph lays bare how large language models are really nothing but autocomplete functions on steroids. This one had clearly ingested a lot of puzzles involving boats, goats and, um, cabbages.
That’s OK, though. A.I. can be useful without duplicating human intelligence, just as a wrench or hammer or screwdriver can be useful without duplicating human features.
“It’s not that A.I. is going to answer questions,” Jonathan Grayer, who founded the education tech company Imagine Learning in 2011, told me. “What it’s going to do is change the process by which teachers teach, kids learn and parents help.”
Emma Braaten, the director of digital learning at the Friday Institute for Educational Innovation at North Carolina State University, told me that the lesson of the past two decades in ed tech is to focus on augmentation, not substitution, of teachers — “to do things that were previously inconceivable.”
This gets back to Grose’s idea that ed tech needs to be built around teachers’ needs. That often hasn’t been the case. “A lot of times ed tech is put into classrooms through whim, word of mouth, ‘Hey, let’s try this,’” Steven Ross, the director of the Center for Research and Reform in Education at Johns Hopkins University’s School of Education, told me.
When I wrote about Khan Academy’s Khanmigo a year ago, I didn’t realize it was going to have arithmetic difficulties. This week I spoke with Kristen DiCerbo, the company’s chief learning officer, about that and other challenges. She said the company has come up with four measures to get the math right. One is that when the software detects math is being done, it sends the problem out to a calculator to get the answer. (How human of it.)
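For readers who want to picture that routing, here is a minimal sketch of the general idea DiCerbo describes: when a question contains arithmetic, hand the calculation to ordinary, deterministic code instead of letting the language model guess. This is purely illustrative; the function name and the crude pattern matching are my own assumptions, not Khanmigo’s actual design.

```python
import re

# Illustrative sketch only -- not Khan Academy's code. It mimics the pattern
# of detecting arithmetic in a question and routing it to a calculator
# rather than letting a language model do the math.
def route_math(question: str) -> str:
    # Look for a simple "number operator number" expression, e.g. "343 - 17".
    match = re.search(r"(-?\d+)\s*([-+*/])\s*(-?\d+)", question)
    if not match:
        return "No arithmetic detected; let the tutor model answer normally."
    a, op, b = int(match.group(1)), match.group(2), int(match.group(3))
    results = {
        "+": a + b,
        "-": a - b,
        "*": a * b,
        "/": a / b if b != 0 else None,
    }
    result = results[op]
    if result is None:
        return "Division by zero -- ask the student to check the problem."
    return f"Calculator says: {a} {op} {b} = {result}"

if __name__ == "__main__":
    # The subtraction the Journal reporter tried: 343 - 17 = 326.
    print(route_math("What is 343 - 17?"))
```

The point of the design is simply that arithmetic is handled by code that cannot hallucinate, while the conversational model handles everything else.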
She also said Khan Academy tries to remind students that Khanmigo may be their amigo, but it’s not a human being. And not just because it might be wrong occasionally. “As a society we need to wrestle with what it means if people start forming relationships with technology,” she said. “Seems like a slippery slope.”
I’ll finish with some thoughts from my wonderful nieces. Abigail in Los Angeles isn’t all in on tech. “I would prefer to teach in a low-tech school that bans smartphones in the classroom,” she told me via email.
Still, she has found several uses for A.I. “I find that A.I. is really helpful for speeding up the annoying prep tasks that take me the longest, like generating a list of new stoichiometry practice problems for students to drill at home to be ready for an assessment,” she wrote. She also finds it useful “to reword scientific articles to help students with disabilities” and to help Advanced Placement students “brainstorm topics and research questions.”
Amy in Connecticut recounted how in the course of teaching a unit on plate tectonics featuring Mount Everest, she found an article about it in The Times that was above her students’ reading level. “ChatGPT helped me make it an accessible text,” she wrote. ChatGPT also helped her concoct a scavenger hunt for her seventh graders, complete with rhyming clues. “It turned something that would’ve been tedious into a task that only took several minutes.”
My nieces have figured out how A.I. can be a help, not just a hindrance. If they can do it, others can, too.
The Readers Write
Taking the system in isolation, holding down benefits at the top would certainly help. However, I like to look at things from a macro perspective. Federal deficits crowd out national saving and investment. I wonder if a reduction in benefits paid at the top would reduce spending in that group. If it doesn’t — and I suspect the effect would be quite small — the impact would be merely to shift saving from households to the government (in the form of a smaller deficit).
Charles Steindel
Glen Ridge, N.J.
I am very surprised that you and others who address this issue do not consider that taxing only the first $168,600 of annual earnings is a gift to those who receive very high salaries. Rather than taking benefits from the wealthy, let them pay their full freight for Social Security.
Virginia Orenstein
Lakewood Ranch, Fla.
Peter here: Dozens of readers made this point. Applying the payroll tax to all earnings would amount to a big tax increase for upper-income families.
Quote of the Day
“But the age of chivalry is gone. That of sophisters, economists and calculators has succeeded; and the glory of Europe is extinguished forever.”
— Edmund Burke, “Reflections on the Revolution in France” (1790)