Google is one of the larger education technology companies adding generative artificial intelligence to products that are already popular in the K-12 world. The company's head of global education impact says new technologies can help solve some of the problems schools currently face, but she also acknowledges their limitations.
Last month, Google announced that Gemini, its generative AI model, will be available as an add-on for educational institutions using the Workspace for Education product. Educators now have access to the Gemini chatbot and generative AI capabilities in Google Workspace apps like Docs, Sheets, Slides, and Gmail.
Other education technology companies have also announced AI features in their education products. Microsoft announced that its AI chatbot, Copilot, is now part of Microsoft 365 apps like Teams, Word, and PowerPoint and can create materials for educators. Khan Academy has Khanmigo, an AI assistant for students and teachers. And OpenAI, the creator of ChatGPT, now has ChatGPT Edu, but that's mostly for people in higher education.
Google's announcement and other companies' technological developments are happening as more educators try out AI-based tools. At the same time, schools are struggling to figure out how to use technology in education and school operations, given the huge concerns about data privacy and academic integrity that come with the use of AI.
At the International Society for Technology in Education (ISTE) conference here, Education Week spoke with Jennie Magiera, Google's head of global education impact, about the role of AI in education, the technology's limitations, and educators' concerns about AI. Before working at Google, Magiera taught in Chicago public schools for 10 years. She also served as a district chief innovation officer outside Chicago for two years.
This interview has been edited for brevity and clarity.
How do you see Google's AI capabilities solving some of the challenges educators face?
What we're trying to do is empower educators and help them personalize learning for every student; we have been trying to do that for years. Now, as AI technology becomes more advanced and more prevalent across all of our products, those hopes and dreams are becoming more realistic than ever.
One of the products I really like is practice sets. When I was in the classroom, I spent all my time trying to close the instruction-differentiation-assessment-reteaching loop, and it took weeks. What the team has done with practice sets is allow young people to engage in activities or explorations and, in moments of cognitive dissonance, find out in real time exactly what didn't work and then relearn the material in the moment. As an educator, this not only saves you time, it accelerates your students' progress, makes them feel successful faster, and builds their confidence.
What problems are facing educators that these AI capabilities cannot currently solve?
I don't think generative AI can solve every problem educators have, and it shouldn't. We need the human element; we need teachers in the loop.
What is Google's philosophy on developing AI tools for K-12?
Technology is a means for educators to do their jobs better and help students achieve their goals. It's not about the technology itself, but about technology being an invisible, ubiquitous tool that anyone can access as needed.
We do this by listening to our users. We spoke to educators at all levels around the world to find out what they need. What opportunities for change do they want? We sometimes call it a magic wand wish: if you had a magic wand and could make one thing appear, what would it be? And our team works hard to find ways to make that a reality.
Is Google helping to provide educators with professional development in AI?
We do this through two main methods. One is our community of practice, which brings educators together to celebrate and uplift one another.
The other is to support educators by providing them with training and professional learning. We recently published two free online courses on AI. One of them, a generative AI course for educators, was created in collaboration with Grow With Google and MIT RAISE [Responsible AI for Social Empowerment and Education]. It's product agnostic, and the goal is for the course to be very accessible, with no barrier to entry.
Another course we offer is Getting Started with Gemini for Workspace. It covers, quite literally, what each button does, what the features can do, what their power is, and how to approach them.
Along with our Google for Education Champions, we also have a series of YouTube Shorts. These are some of the most passionate, imaginative, and creative educators in the world, creating short videos on how to use AI.
Some educators are concerned that AI could diminish the critical thinking skills of students and teachers. Some even say that human touch is better than AI. What is your response to this?
When I hear educators say they have fears about AI, or concerns that AI will replace humans or diminish human standards or creativity, I hear you. I thought that too. I felt that too.
As I've been learning more about AI and talking to more people, what I've found is that AI actually enhances my humanity. It enhances my creativity. The reason is that I was spending so much time on mechanical tasks that I wasn't getting to the creative part of my brain, the creative part of the work product, because I was doing all these other pieces.
If you're an educator, you're grading papers all day and writing yet another slightly different version of the same lesson plan. But imagine if AI could do that work for you and even give you prompts when your brain gets fried and you hit writer's block. It almost becomes like a spaghetti-on-the-wall thinking partner.
So in many ways, I have become a more creative and more human version of myself, because I entrusted my robotic tasks to AI and asked it to help stimulate my creativity so I can do more of that.
Some teachers say AI doesn't make their jobs easier because it requires more effort to rethink assignments and double-check the work of the creation tool. What do you think?
I think [about] setting expectations. If we expect AI to deliver the perfect lesson plan or the perfect prompt, that's not what we want. I don't want a world where AI can do that, because I want a world where humans come first and humans are the teachers.
It saves time if we take a step back and think of it as a starting point rather than expecting it to give me everything. I can't tell you how many times I've sat on a Sunday night in front of a blank screen, staring at it for hours, thinking about what I'm going to teach on Monday, and thought, “I just don't even want to start.”
But with AI, you can say: “Give me the shell of this lecture, the shell of that differentiation activity, the shell of that essay, and then I'll use my human brain, my human heart, my spirit, my experience to turn it into something truly wonderful.” I believe this not only saves a lot of time, but also makes the final product much better.
What should schools know and do to combat AI’s imperfections?
Just because the technology exists doesn't mean we have to use it for everything. So I think a big part of the work for schools will be supporting educators and asking important questions about how they apply technology: When do you use AI? How do you use AI?
And you have to think about it carefully. Just because it exists doesn't mean it should replace all of the practices we use in the classroom. We must consider where the opportunities lie and let the needs lead.
Is Google conducting research on the impact of AI on education and learning?
We're always looking for ways to learn more about how our products are used and what impact they have. We also try to stay humble about our expertise: What are we good at? And what is our role in studying the long-term impacts on learning?
We know there are other organizations doing very strong work there, with a long history and résumé of that kind of action research. We have recently joined several consortia that are deeply researching AI policy and practice, such as TeachAI and the EdSafe AI Alliance. We are also working closely with ISTE/ASCD and Digital Promise.
We strive to remain collaborative thinking partners and friends, sharing what we learn, learning from them, and incorporating those learnings into the way we work.
How do you foresee AI being integrated into K-12 education?
I hope this is an inflection point where we can approach the power of technology in schools in a more thoughtful, equitable, and intentional way than in the past.
I was part of a big one-to-one [computing] push at a school. Back then, it was about getting the device in front of the kids: open it up, use it, put it away. It was more like there's time for technology and time for work.
What I hope for with AI now is less about the technology and more about who has access to it and how they are using it. Are they using it for the right purposes? I see many organizations doing that, including ours, as we choose what features to build, how to build them, and how to make them accessible.