Exploring the ethics of artificial intelligence in K-12 education

November 3, 2021

Artificial intelligence is everywhere, from filtering emails to powering search engines. It is in the classroom, too, in tools such as personalized learning and assessment systems. In recent months, AI in the K-12 classroom became even more prevalent as learning shifted online due to COVID-19 and, in some cases, remained there. But what about its societal and ethical implications?

Two Michigan State University scholars explored the use of AI in K-12 classrooms, including possible benefits and consequences. Here’s what their research found.

Student work from the “YouTube Redesign” activity (MIT Media Lab, AI and Ethics Curriculum, p. 1); two offerings from the MIT Media Lab are among those Akgun and Greenhow examine in their article.

“Artificial intelligence can help students get quicker and helpful feedback and can decrease workload for teachers, among other affordances,” said Selin Akgun, a doctoral student in the College of Education’s Curriculum, Instruction and Teacher Education (CITE) program, and lead author on the paper, published in AI and Ethics. For example, teachers may use social media to encourage conversations amongst students or use platforms to support instruction in hybrid or mixed-ability classrooms. “There are a lot of affordances, but we also wanted to discuss concerns.”

Akgun and co-author Associate Professor Christine Greenhow identified four key areas teachers should consider when using AI in their classroom.

  • Privacy. Many AI systems ask users to consent to the program accessing and using personal data in ways they may or may not understand. Consider the “Terms & Conditions” often shown when downloading new software. Users may simply click “Accept” without fully reading and digesting how their data may be used. And even users who do read and understand the terms may not grasp the other, layered ways the program can use their data, such as tracking their location. Moreover, if platforms are required as part of curricula, some argue parents and children are being “forced” to share their data.
  • Surveillance. AI systems may also track how a user interacts with them in order to deliver a personalized experience. In education, this may include systems identifying strengths, weaknesses, and patterns in a student’s performance. While teachers do this to some degree in their teaching, Akgun and Greenhow say, “monitoring and tracking students’ online conversations and actions also may limit [student] participation … and make them feel unsafe to take ownership for their ideas.”
  • Autonomy. Because AI systems rely on algorithms, such as those predicting how a student may perform on a test, students and teachers may find it difficult to retain a sense of independence in their work. It also, the scholars say, “raise[s] questions about fairness and self-freedom.”
  • Bias and discrimination. These factors can appear in AI systems in a variety of ways, such as through gendered language translation (“She is a nurse,” but “he is a doctor”). Whenever algorithms are created, the scholars say, the makers also build on “a set of data that represent society’s historical and systemic biases, which ultimately transform into algorithmic biases. Even though the bias is embedded into the algorithmic model with no explicit intention, we can see various gender and racial biases in different AI-based platforms.”

“Artificial intelligence can manipulate us in ways we don’t always think about,” said Greenhow, a faculty member in Educational Psychology and Educational Technology at MSU.

The publication came as a result of the College of Education’s Mind, Media and Learning graduate course, which encourages students to develop a paper based on an area of research interest.

Selin Akgun

“We want to ultimately cultivate different pedagogies, materials and better support for teachers and students,” said Akgun, who is also a research assistant in MSU’s CREATE for STEM Institute on the ML-PBL project.

As one way to assist in that process, the paper outlines three free resources for teachers to use in the classroom. Akgun and Greenhow chose these three, from the many available, because they provide several options, including collaborative and hands-on activities for students. The paper also offers considerations and suggestions for where education can go from here.

“The questions this article raised became increasingly important during the COVID-19 pandemic,” Greenhow said. “More and more online tools were being integrated into the classroom—sometimes on the fly and with little time to think. This paper raised important ethical considerations to think about as we move forward with those applications.”

Christine Greenhow, center, interacts with a student via robot in an MSU course. (Photo: 2017)

More from our researchers

Akgun was featured on The Sci-Files podcast in February 2021, talking about educational research during a pandemic.

Greenhow recently answered questions about online and classroom learning during our second school year dealing with the pandemic. Read her insights, and check out her website for even more.