
LinkedIn Live: Navigating the Promise and Perils of Generative AI in Math Education Transcript

Recorded November 20, 2025

Featured Speakers

  • Ann Edwards, Director of Mathematics at WestEd
  • Drew Nucci, Research Associate, Mathematics Education Team at WestEd
  • Tanner Higgin, Senior Research Associate, Learning and Technology Team at WestEd
  • Becca Bussello, Associate Director of Digital Fluency at WestEd

Host

  • Max Bronson, Senior Social Media, Visibility, and Engagement Manager at WestEd

Max Bronson:

Hello everyone, and welcome to our LinkedIn Live event, Navigating the Promise and Perils of Generative AI in Math Education. We’re so happy you’re here today. I’m Max Bronson, and I’ll be your host. We’ve got a fantastic discussion ahead of us with WestEd experts who will share their perspectives about how mathematics educators can realize the promise of GenAI while navigating the ethical, practical, and instructional challenges that come with innovation.

Our featured speakers today are Drew Nucci, research associate with our mathematics education team, Tanner Higgin, Senior Research Associate with our learning and technology team, and Becca Bussello, Associate Director of Digital Fluency at WestEd. Our goal is for this to be an interactive conversation. And we will save some time for a Q and A. So I encourage you to share your thoughts and questions in the chat as we explore today’s topic.

Now before we move right into the discussion, I’d like to take a brief moment to introduce WestEd. WestEd is a nonpartisan organization that aims to improve the lives of children and adults at all ages of learning and development. We do this by bridging opportunity gaps, supporting positive outcomes, and helping build communities where all can thrive.

Now I’d like to pass the mic to Ann Edwards, Senior Director of Mathematics Education, to get us started. Take it away, Ann.

Ann Edwards:

Thank you, Max. I am so excited and honored to be a part of this conversation today among some leading experts at WestEd in EdTech and AI in education, in particular. In this time together today, I’m hoping we can do a deep dive into what you all are observing about what Max called the promise and perils of using AI in mathematics education, from your own research and from your work with educators and leaders.

But I’m gonna take a moment to start with sort of my own connection to EdTech and AI. So my own interest in EdTech started way back when in the early 90s when I was using graphing calculators and other digital tools to teach calculus. Just to give you a little taste of how long I’ve been in this game. And in fact, my very first presentation at a research conference was actually about using Mathematica to teach calculus at a distance using dial-up internet. Yes, it was very interesting. And this was back in 1996, so it’s been a while.

Fast forward to now and of course AI, and I’m still excited about using technology in teaching and learning. And in fact, I recently caught my 83-year-old mother and my 15-year-old daughter using ChatGPT together to learn about and make something that they just cooked up together. It was something they were curious about, that they wanted to look into and make. It was such a lovely example of how AI can be a catalyst for learning and discovery and creativity.

But you know, I’m also really deeply aware of and hearing a lot about the concerns and dangers of AI use. In my conversations with state leaders in K-12 and in higher ed, there is real excitement about the potential power of AI to do any number of things: to support system operations, of course to enhance teaching and learning, and to provide targeted supports for students and their families as they make decisions about their educational and career journeys. But alongside all those hopes and dreams and aspirations, there are many, many concerns.

For example, about cheating, and that’s particularly the case in mathematics. About bias, about equitable access, about how they can build AI capacity that’s needed to navigate all of this, about how to craft effective AI policies. And I’m also hearing a lot about how they can make sense of a rapidly expanding and confusing marketplace of tools and products.

So I wanna start there in our conversation about, you know, what you all are hearing and seeing in your own work. And so I’m gonna address my first question to Drew. So I know you’ve talked to a lot of teachers as part of the AmplifyGAIN R&D Center over the last year. And I know that you’ve also been traveling a ton to research and EdTech conferences, connecting with educators and leaders as well as researchers in this space. What are you hearing from them? What are their thoughts on the promises and pitfalls of AI in math education? And what do you think about them, Drew?

Drew Nucci:

Yeah, thanks, Ann. It’s super fascinating. It’s fast moving. I think that one thing I’m thinking about is if we situate this in terms of specific AI tools, the specific AI tools are moving too fast. So I just wanna, like, frame what I’m gonna say by thinking about this: generative AI is a language-producing technology, and we make meaning through language, so that’s really the basis on which we should be understanding this technology. And that means that AI is a new partner in discourse in mathematics classrooms, for both teachers and students.

One thing that we’ve learned is that AI use amongst math teachers is actually still pretty low. It’s lower than for science teachers or for English teachers. And there’s a lot of learning that has to happen for AI to be deployed, really, in a powerful way. As I talk to teachers about AI in teaching and learning, I think we could think about the promises and the pitfalls both in terms of student learning but then also in terms of teaching.

So for student learning, one of the big promises to me is that kids actually ask more questions of an AI than of a teacher, right? I taught for a long time. A kid would not ask me more than three questions, but they’ll ask, like, 30 questions of an AI. And the AI can adapt to the student’s language in a way that teachers sometimes struggle to do. Kids can get explanations of mathematical concepts even if those aren’t readily available in class.

But some pitfalls you already alluded to. For example, teachers are worried about students using AI as a substitute for thought as opposed to a partner in learning. And one of the concerns that I have is that with those very personalized, individualized instructional possibilities comes a question for me about how much teachers actually know about student learning trajectories when students are using individualized AI systems. That hyper-individualization of learning is also a pitfall because, you know, if you look at the mathematical practices, these are social practices, right? Modeling, justifying your reasoning, right? It’s a social activity. Schooling is a social activity.

We know from the learning sciences that learning is a social activity, and so the hyper-individualization of learning is actually a pitfall. Another question I have about AI shifts the focus to teaching. When I talk to researchers around the world, a lot of them say, “Here’s my AI system. Here’s what it can do.” Or they’ll say, “I use this AI system with students, and here’s what happened.” When that happens, I always ask the question, “Can you tell me how you’re thinking about the role of the teacher in this intervention?”

So let’s talk about teaching for a second. For teaching, there are some real possibilities here, because teaching math really well, Ann, is a very language-intensive activity. And so there’s some learning that teachers can do with AI to actually teach better. So here are some examples. Planning rich math tasks, right? I think when the “Building Thinking Classrooms” book came out, it was just really useful for a lot of teachers. There are ways in which AI can actually help teachers reverse engineer tasks, to take curriculum materials and make them into a more powerful thinking tool for their students.

They can plan for meaning and relevance. We want the kids to care about their learning. We want the kids to care about class. And so what if we had high-quality instructional materials with well-researched problems, but we could tailor those to the actual interests of the kids in the class? That’s a promise in teaching. Also, planning for discourse. This is notoriously difficult. We need teachers to be anticipating student reasoning and then coming up with questions that have the kids interacting with each other’s thoughts. That’s a very language-intensive process, and AI can actually help teachers do that work better.

And then there’s also sort of scaffolding, right? For English language learners or students with disabilities, implementing UDL principles into instruction. All of that is work that AI assistance can actually help teachers do better. And I think one thing that I’m thinking about here, Ann, is that, you know, AI is pretty new, and it’s pretty exciting. However, all that we know about the learning sciences and all that we know about excellent mathematics teaching and the pedagogical science, those things haven’t passed away. So how can we tap into AI to help teachers do that work better? I think that that’s a useful question.

And how can we do it in a way where teachers are still connected to each other in the same way we want students to be connected to each other? So those are my thoughts right now. Hopefully that’s very helpful to you.

Ann Edwards:

Yeah, I think one of the things, Drew, that you really are bringing home, that I think is so important for us to hold at this moment, is really, like, what is mathematical teaching and learning? What is the nature of mathematical practice, mathematical doing, that we think should be centered in our vision for how AI can support it? And then that makes me think, well, the folks who are developing these tools, what do they need to understand about this? What do they need to hold at the center about what mathematical doing looks like, what mathematics teaching and learning look like?

And so Tanner, I’m gonna ask you. I know that you’ve had a lot of experience in the design of and research on educational technologies and have experience working with those EdTech developers. And I’m wondering if you can tell us what you’re excited about and perhaps what’s giving you pause in what’s being developed right now, what you’re seeing in terms of how that development work is happening. Yeah.

Tanner Higgin:

Yeah, thanks, Ann. You know, right now, it feels very reminiscent to me of the early 2010s app store boom in the wake of the iPad. Feels like there’s tons of activity, but so much of it feels so similar. You know, you got your tutors and time savers, personalized content abounds, and so much of it just feels like reskins of large language models, right? And not really great tools that are developed with teacher and student needs in mind. And I worry about the long-term effectiveness of a lot of these tools when we have, like, ChatGPT coming out with their version for teachers just yesterday. And it felt like a dozen business models evaporated overnight with that announcement.

What’s exciting to me, I’m still really bullish on NotebookLM from Google. I think it’s a really interesting tool that tackles the hallucination problem really effectively by relying on curating a set of sources and then using those as an assistive thinking tool. It does dangle the danger of not actually engaging with those sources, which is, like, a huge problem in general right now with AI, and just culture in general, like, whether we’re still doing the reading.

In terms of startups, I’m excited about really focused tools. There’s one called ClassWaves that I think is really interesting because it’s attempting to solve a problem we hear from a lot of math teachers, which is, you know, wanting to do small group instruction, but then not being able to be present in all of those small groups. ClassWaves is trying to find a way, without recording students, to aggregate information from all of those different small groups and roll it up to the teacher so they can make decisions and follow up afterward. And what I love about it is it’s using AI to create more social interaction in classrooms instead of taking it away. And that, to me, is really promising and exciting.

What’s giving me pause though is that a lot of tools, and this happens in every big hype cycle in EdTech, a lot of tools just have not obviously been informed by deep work in classrooms. It seems like in 2023, a lot of people used ChatGPT. And then relying on their own experience as a student, they’re like, “I’m gonna revolutionize education.” This happens every time a new technology comes out. And I’m afraid that a lot of tools, and I’m sure the people in the audience are seeing this, are not really solving day-to-day issues for teachers. And that is very, very concerning.

And we at WestEd learned a ton recently from math faculty and students. We did a big study where we talked to over 120 faculty and students in gateway math in the post-secondary context and learned a lot about needs and experiences there. We did deep listening with these folks. And what we really found, surprisingly, was that technology and tools are not top of mind. It’s not the thing people think of or mention when you ask about, like, what they’re really struggling with. The things that they mention are much more personal. From students, it’s like, “I want more social connection and community in classrooms. I want to feel welcomed in my classroom.” Both students and faculty love written work. They want to get on whiteboards, they want to work on notebook paper. And they struggle to do this when digital tools are getting pushed on them or they have to upload work to a digital platform.

There’s a craving for real-world application. Prerequisite review is a huge need. So there are real needs there that I would like to see AI tackle. And I think developers, in particular, need to focus their design on being assistive and essential. For me, the mantra is like, “Tackle a specific need and solve it spectacularly.” This has always been true in EdTech, and it’s more and more true now with AI.

And I would love to see tools that are focused on giving students and teachers more agency instead of taking it away, right? And I do think AI has potential here. I think of a tool like Snorkl, spelled like snorkel but without the E. It’s a company founded by teachers, focused intently on one critical need, which is capturing students’ thinking around math through written and verbal explanations, and then trying to use AI to provide feedback on that. So I look at that as a shining light of what I would like to see at scale: solving those acute needs, making sure that students and teachers feel really supported and that they are given increasing amounts of agency, and having the tool maybe sit alongside and support those existing routines in classrooms.

Ann Edwards:

Yeah, Tanner, that’s so important. And I think, what is it, the thing with the hammer and the nail? I’m gonna get this wrong, but y’all know what I’m talking about. You know, we want the tool to serve the need, right? And we all know that teaching mathematics or any kind of teaching is a really complex activity.

And so there are so many different ways that technology could support them. And I’m so glad, Tanner, that you’ve had the opportunity to share with everybody that sort of central role that understanding plays. There you go, Drew: if you’re a hammer, everything looks like a nail. Yes. So I’m really, really glad that, you know, we are seeing the development of these technologies as being in service of a rich understanding of mathematics teaching and learning.

And then, so Becca, I wanna sort of segue to you, because I think so much of the work that you do really is around supporting the actual teachers in this space, and the people that they work with, to build the sort of foundational fluency to be able to make the best use of these tools and take the kind of agency that Tanner is talking about. And so I’m curious what you can tell us about that space at the intersection of learning and technology. What are you seeing as the key needs of educators and leaders, and how are you thinking about digital fluency to support that?

Becca Bussello:

Yeah. Thanks, Ann. So echoing so much of what Drew and Tanner have already said, a lot of what we’re seeing on the ground has shifted so much in the past two years, right? Ever since these tools have come to market, we are moving faster than the speed of light. And attitudes, practices, behaviors are shifting as well. And so these conversations that we’re having on the ground are so critical to shaping the work that we’re doing and, really, our approach to digital fluency and AI learning.

We work with a number of districts and regional partners across the country. And one of the patterns that shows up all the time is that teachers and leaders want to understand AI. They wanna become familiar with these tools, but what they really need is support that’s anchored in their actual daily work. And we hear feedback of, you know, “I know this is important, I know these tools are out there, I know everybody’s using them.

“I’m told that privacy and security and hallucinations matter. And also I don’t wanna be policing cheating all of the time. But translating all of that into what I’m going to do in my work tomorrow,” that is where the real need is. There’s so much information out there. And so the work that we do is sort of less focused on what I might call AI literacy, getting information out there, and more focused on helping people build capacity and building the skills that they need to do meaningful work today, tomorrow, the next day.

And so when we are crafting the professional learning that we do, the first question that we ask is not, “What do you know about AI?” It’s, “What are you struggling with right now? What is the task that is eating your time, that’s making your job harder right now?” And we work collaboratively to figure out how AI can help you do that better. You know, we know from adult learning theory that adults want authentic practice in their professional learning.

And we know that when we give that to people, they build real capacity surprisingly fast. And so this is the focus of the work that we do, and it has led us to developing a framework that we use in our professional learning. And this framework is called Friction by Design. And the main idea behind it is really that learning itself requires friction. Drew, you touched on the learning sciences and how they haven’t changed, right? We have these new tools in this space right now, but everything that we’ve learned over the past, I don’t know how many decades, about what real learning and teaching look like and how they can be effective, none of that has changed. We just have these new tools.

And so whether we’re talking about five-year-olds learning or adults learning, we know that struggle is required. You have to have some sort of effort in order to get meaningful learning. But we also know that not all struggle, not all friction, is equal. And especially in classrooms, there are struggles that we put in front of students that move learning forward and there are some that get in the way. And that has been a problem in education for decades. But it matters in the age of AI because these tools are specifically designed to remove friction. They’re built to make things easier.

And that’s fantastic. It’s why we love them. It’s why there’s so much promise there. But when we’re introducing them into educational spaces, we really need to be intentional about how we are using them to remove friction and what friction we’re having them remove. So Drew and Tanner, I heard both of you talking about the social nature of learning. And we all know that interacting with people is challenging. It’s friction, right? It can be hard, but it’s a really critical part of learning.

And so that’s one of the types of friction that, in our professional learning, we really wanna think about, is are you introducing AI in a way that is getting rid of this social sense-making that is really productive to learning? And we don’t wanna do that. And so we’re trying to build capacity and the ability to think about the design, our instructional design, what we’re doing in classrooms, why we’re doing it, where we’re introducing friction. Are we being intentional about it? And is it driving learning forward?

And if it’s not, if there are things that are happening that introduce friction that are making it harder for students to learn, can we use these tools to get that friction out of the way? And so I would say that putting those two things together, our practice-first professional learning model and this lens of Friction by Design, is what we see as a really promising path forward in using AI to advance learning.

Ann Edwards:

Oh thanks, Becca. I love this notion of seeing friction as a sort of key conceptual anchor for how we can think about both the ways in which we need to struggle to learn, but then also the ways in which friction can be a barrier, and then utilizing this hammer, so to speak, right, as a way to enhance the productive kinds of friction and reduce the unproductive kinds of barriers that certain kinds of friction create. I love that sort of framing for this.

So I’m curious, and I think it dovetails so, so well as a general framework in the context of mathematics teaching and learning, right? Mathematics learning and mathematics doing have often been framed as something that is very friction-producing for so many folks, and really deeply understanding what the productive forms of struggle are in mathematics teaching and learning is something we’ve known about for quite some time, as Drew pointed out. And how we can utilize AI to design with intention the learning experiences, the resourcing and tasks associated with driving those learning experiences, and also the instruction that drives those learning experiences, I think, is critically important. I love that.

I’m curious. Number one, I’ve been asked to engage those of you who are out there: please drop some questions in the chat, ’cause we would love for our speakers to be able to respond to you. But as we wait for people to drop some questions in, I’m curious, just broadly speaking for each of you, and I’ll start with Drew: where do you think the field should go from here? And you know, what are the important things to focus on? And then where is your work in that?

Drew Nucci:

Thanks, Ann. I think Becca’s brought forward, like, a really nice way to think about this. When we think about productivity, we think about doing things faster and doing things better. And when I talk to teachers, many of them are focused on the ability of AI to help them do their work more efficiently, mostly because they’re overburdened. But there’s also a side of productivity where we actually want to use the technology to improve teaching.

And so I think that the key here is that one understands AI as a technology and the other understands AI as an instructional technology. So I think the next frontier in terms of practice is this: we actually know that leadership has to build capacity around visions of high-quality math instruction and visions of student capabilities in order for interventions in math education to actually improve teaching and learning. I would say the same thing is true about AI. We have to build capacity with leadership to understand AI’s power as an instructional technology and how that fits into improvement goals.

So I think, in terms of practice, that is key work. In terms of research, we actually need a better theorization of what Technological Pedagogical Content Knowledge is and how people should be learning it. And we need to understand, as so many of us have been talking about, the social and critical dimensions of technology use and make sure we’re not thinking about technology use in classrooms solely as a matter of accomplishing tasks. What are the social ramifications, and how can we be critical in our deployment of and interaction with this technology?

Where my work fits in that is that I love helping teachers and leaders build that capacity, for sure. And we’re engaged in ongoing research projects to try to theorize that social element of AI in classrooms.

Ann Edwards:

Thanks, Drew. What about, Tanner, what about you?

Tanner Higgin:

For me, you know, thinking about EdTech developers and learning design and how we can improve, it really comes down to something low cost, not a ton of time, that developers could do early, which is just engaging in evidence-based design practice as early as possible. It is going to save tons of time, money, and headaches in the future by just getting these things started early. That’s just simple stuff, like creating a logic model where you think through your design and make sure what you’re doing is actually leading to learning outcomes. Conducting a literature review, so you learn about this rich body of learning science.

Doing rigorous discovery and usability studies, I think, is critical, especially usability, which is top of mind for those at the state or district level in terms of procurement. And then the hidden secret, to me, is something very few companies do, in my experience, which is competitive analysis: looking at what’s been tried before, understanding what other people are doing, and trying to do something else instead of reinventing the wheel. More developers engaging in that is just better for everyone involved. And you know, these are really fast, low-cost, and transformative activities that I would love to see more people doing.

And we can definitely help with that. At WestEd, we provide these kinds of services, but there are many other organizations that do as well. So that would be my big recommendation.

Ann Edwards:

Oh God, +1,000, Tanner, on all of that. And Becca, what do you think?

Becca Bussello:

Yeah, I mean, amen to all of those ideas that have already been said. I’ll just throw one out there, which is differentiated learning for teachers around using these tools, not just thinking that everybody needs a hammer. One of our partnerships right now is with a district that wants support in using AI for retention for two really distinct cohorts. One is veteran teachers that we’re trying to keep in the classroom to finish out their careers, and one is novice teachers. And the needs of those two cohorts, both in terms of learning and in terms of what’s draining their energy, are very, very different.

The way that a veteran teacher should be learning to use AI to support their teaching is not the same, in a capacity-building sense, as the way a novice teacher should. We don’t want these tools to take away from our opportunities to develop as professionals in the space. And so really thinking about differentiation with our audiences as we’re bringing these tools into our own professional learning, I think, is one of the biggest things we should be doing right now.

Ann Edwards:

Oh, thank you all. I’m personally taking away a ton. Drew, thank you for that really laser focus on AI in mathematics teaching and learning and the vision of mathematics that should inform the kinds of things that Tanner’s talking about. And Tanner, my big takeaway is everything you just said about what developers need to be doing and paying attention to, really centering on the core disciplinary practices and what we know from the learning sciences about teaching, and really knowing what’s already in the landscape so they can be additive.

I think that’s really, really important. And Becca, I so appreciate what you’re talking about with the nuanced way that we need to be attending to capacity building and what the actual needs are in the field, and attending to that in a way that is not the same hammer for all the nails, but, like, you know, different tools in terms of capacity building to really meet the needs of the different educators and the leaders who are out there. And with that, I’m just gonna hand it back over to Max. But I just wanna say thank you for the opportunity to be a part of this conversation. Max.

Max Bronson:

Wow. Amazing. Y’all, I was kind of glued to my seat even though I’m standing up right now, if that makes sense. But thanks to all of you for sharing these reflections with us. This has been a fascinating conversation.

And I just wanna mention that we will have two 30-minute “Leading Together” webinars coming up on December 11th and January 13th, 2026, where we’ll continue exploring key topics like this related to GenAI in education. And I’ll drop links to register in the comments of this event. And please stay connected with us by following our AI at WestEd showcase page here on LinkedIn and subscribing to our WestEd e-bulletin, where you’ll find upcoming events just like this with these wonderful speakers and more, and information about resources and services.

And yeah, we’d just love to stay in touch with you, so thank you for coming. And we’ll see you soon. Have a great rest of your day.