AI: Can it think like your students do?

Key points:

2023 was a breakout year for artificial intelligence, with explosive growth of generative AI tools.

Since researchers at Carnegie Mellon University helped invent AI in the 1950s, AI has been transforming how we learn, work, and play, and that change is now happening at breakneck speed.

Over the last 30 years, I have witnessed the evolving landscape of AI in education. Many early AI efforts focused on using computers to model human thinking as a way of confirming our understanding of how the human mind works. For example, Herb Simon and others studied how chess masters played the game in order to understand problem solving. They discovered that much of a master's skill involved developing perceptual abilities that allowed them to look at a chess board and immediately see promising moves, rather than searching all possible moves.

Over time, AI diverged into two tracks: replicating human intelligence, and expertly accomplishing tasks thought to be unique to humans. AI chess programs, like much of AI, focused more on playing the game well and less on playing it the way humans do.

In education, AI retains its focus on cognitive modeling. Unlike chess, where playing the game well is the point, education systems need to track students’ reasoning in order to help students build expertise. It’s not about speed or efficiency in arriving at the correct answer; it’s about nurturing a student’s comprehension and conceptual understanding.

The experience of creating AI that models human thinking is, perhaps, more relevant in education than in other fields. So, how do we ensure that AI supports the fundamental goal of fostering a student’s understanding, rather than simply focusing on speed, efficiency, or correctness?

Here are a few questions to consider when researching and evaluating AI programs for the classroom.

Does the AI think like a student?

Education is all about making connections with students. Because each student has different backgrounds, experiences, and interests, good teachers adjust their instruction to match each student’s needs. Good educational AI needs to do the same thing.

This is where empathy and data intersect. An effective AI program should grasp the student’s perspective, identifying where they stumble and why.

Take math, for example. Many students form common denominators when multiplying fractions, even though the operation does not require it. A good teacher will recognize this error as indicating a lack of conceptual understanding about what multiplying fractions means and how it differs from adding them. AI should do this, too. An advanced AI program will have a cognitive model that helps it understand why students might confuse the two operations, so it can intervene with hints, recognize common errors, and guide students toward a deeper understanding.
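To make the idea concrete, here is a minimal sketch of how a cognitive model might recognize the misconception described above, by comparing a student's answer against known "buggy" procedures. The function name, rule labels, and bug library are illustrative assumptions, not Carnegie Learning's actual model.

```python
from fractions import Fraction

def diagnose_fraction_multiplication(a, b, student_answer):
    """Compare a student's answer for a * b against known buggy procedures.

    Returns a label naming the likely misconception, or "correct"/"unknown".
    The rule names and bug library here are illustrative only.
    """
    if student_answer == a * b:
        return "correct"

    # Bug: the student forms common denominators (as if adding), multiplies
    # the new numerators, and keeps the common denominator.
    # e.g. 1/2 * 1/3 -> 3/6 * 2/6 -> 6/6 instead of 1/6.
    common = a.denominator * b.denominator
    n1 = a.numerator * (common // a.denominator)
    n2 = b.numerator * (common // b.denominator)
    if student_answer == Fraction(n1 * n2, common):
        return "common-denominator confusion"

    # Bug: the student adds the fractions instead of multiplying them.
    if student_answer == a + b:
        return "added instead of multiplied"

    return "unknown"
```

A tutoring system built this way can attach a targeted hint to each diagnosis, rather than simply marking the answer wrong.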

In this way, AI can also assist teachers by acting as a one-to-one coach for students. AI can adjust to every action students take to meet them where they are and help them progress at a very detailed, skill-by-skill level.

Does it provide teachers with critical data to help them guide students in real-time?

There are some things that technology excels at, like collecting data, and other things that teachers excel at, like teaching and motivating students. AI that is built with a live facilitation tool can provide teachers with in-the-moment data, such as when students are working or idle. Real-time alerts can indicate when students need extra support or when they’ve reached milestones.

When teachers have actionable insights into how their students are working and performing on specific skills or standards, as well as predictions of how far they are expected to progress by the end of the year, they can manage, guide, coach, and intervene more effectively.
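The kind of in-the-moment signal a live facilitation tool might surface can be sketched with a simple rule: flag students who have been idle past a threshold and celebrate those who hit a milestone. The field names, labels, and three-minute cutoff below are hypothetical, chosen only to illustrate the idea.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Illustrative cutoff, not a vendor default.
IDLE_THRESHOLD = timedelta(minutes=3)

@dataclass
class StudentActivity:
    name: str
    last_action: datetime   # timestamp of the student's most recent action
    skills_mastered: int    # skills mastered so far this session
    milestone_target: int   # skills needed to reach the next milestone

def classify(student: StudentActivity, now: datetime) -> str:
    """Return a simple facilitation alert for one student."""
    if now - student.last_action > IDLE_THRESHOLD:
        return "idle: may need support"
    if student.skills_mastered >= student.milestone_target:
        return "milestone reached"
    return "working"
```

A real system would refine this with per-activity expectations (some problems legitimately take longer), but even this crude rule shows how raw activity data becomes a teacher-facing alert.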

Does it allow students to track their own progress?

In addition to providing teachers with data, AI should enable students to see their own progress. As students see their proficiency improving in each skill, their confidence grows and they become motivated by their results. They begin to develop a sense of ownership in their learning and a sense of responsibility for their success.

Is the AI unbiased?

Despite its benefits, AI can also bring ethical challenges to education. For example, some AI tools have been shown to exhibit bias. Even if that bias is unintentional, it can amplify stereotypes about race and gender.

There are many ways to guard against bias in data sets. To start, organizations that develop and train AI models for education, or any field, should have diverse teams. They should also rigorously test their programs to identify potential bias and then continually monitor them.

Is the technology safe, secure, and effective?

As with any technology, AI programs should protect student security and privacy, and abide by all applicable laws.

Further, engagement with the program should result in improved outcomes and better support for students, including those who have been historically underserved. Like other education and edtech programs, AI-powered software should be built on evidence-based research, as well as research on how the brain learns, to give students the best learning experience possible. It should also be proven by research to measurably improve students’ learning, growth, and achievement.

Looking ahead

AI has immense potential to transform teaching and learning. It’s time that the realm of AI in education evolves beyond mere efficiency and correctness. The true revolution lies in using AI to empower and elevate the minds of our students.

Dr. Steve Ritter

Dr. Steve Ritter is the founder and chief scientist at Carnegie Learning. He earned his Ph.D. in cognitive psychology at Carnegie Mellon University, and is the author of numerous papers on the design, architecture, and evaluation of intelligent tutoring systems and other advanced educational technology.
