
Opinion | Every Tech Tool in the Classroom Should Be Ruthlessly Evaluated


A complaint I heard from many public school parents who responded to my March 27 questionnaire and wanted a lower-tech environment for their kids is that they’re concerned about their children’s privacy. They couldn’t opt out of things like Google Classroom, they said, because in many cases, all of their children’s homework assignments were posted there. Molnar has a radical but elegant solution for this problem: “All data gathered must be destroyed after its intended purpose has been accomplished.” So if the intended purpose of a platform or application is grading, for example, the data would be destroyed at the end of the school year; it couldn’t be sold to a third party or used to further enhance the product or as a training ground for artificial intelligence.

Another recommendation — from a recent paper by the University of Edinburgh’s Ben Williamson, Molnar and the University of Colorado, Boulder’s Faith Boninger outlining the risks of A.I. in the classroom — is for the creation of an “independent government entity charged with ensuring the quality of digital educational products used in schools” that would evaluate tech before it is put into schools and “periodically thereafter.” Because the technology is always evolving, our oversight of it needs to be, as well.

Stephanie Sheron is the chief of strategic initiatives for the Montgomery County Public Schools, the largest district in Maryland, and all the district’s technology departments report to her. She likened the tech landscape, coming out of the Covid-19 pandemic remote school period, to the “Wild West.” School districts were flooded with different kinds of ed tech in an emergency situation in which teachers were desperately trying to engage their students, and a lot of relief money was pouring in from the federal government. When the dust settled, she said, the question was, “Now what do we do? How do we control this? How do we make sure that we’re in alignment with FERPA and COPPA and all of those other student data privacy components?”

To address this, Sheron said, her district has secured grant funding to hire a director of information security, who will function as the hub for all educational technology vetting and evaluate new tech. Part of the standardization the district has been undergoing is a requirement that, to be considered, curriculum vendors must offer both digital and hard-copy resources. She said her district tried to look at tech as a tool, adding: “A pencil is a tool for learning, but it’s not the only modality. Same thing with technology. We look at it as a tool, not as the main driver of the educational experience.”

In my conversations with teachers, I’ve been struck by their descriptions of the cascade of tech use — that more tech is often offered as a solution to problems created by tech. For example, paid software like GoGuardian, which allows teachers to monitor every child’s screen, has been introduced to solve the problem of students goofing off on their laptops. But there’s a simple, free, low-tech solution to this problem that Doug Showley, a high school English teacher in Indiana I spoke to, employs: He makes all his students face their computer screens in his direction.


