As the use of artificial intelligence proliferates in K-12 education, it’s important to examine the technology’s implications for historically marginalized populations, according to a panel of technology leaders, educators, and mental health experts.
AI experts have touted the transformative power of the emerging technology, but many have also raised warning flags. These tools can generate responses based on outdated information, or fabricate facts when asked about events that occurred after the cutoff date of their training data. They can also produce biased responses and amplify harmful stereotypes about people who are already disadvantaged.
Educators who bring these tools into the classroom must strike a balance between ensuring Black students have access to these technologies and protecting them from the pitfalls of tools that may not have been designed with those students in mind, said Leah Austin, president and CEO of the National Black Child Development Institute, during a panel discussion at the International Association for Educational Technology conference on June 26.
The panel discussed the ethics of AI and the impact of technology on Black children. Along with Austin, the panel included Winston Roberts, a teacher at KIPP New Jersey, Kiesha King, senior national education manager at T-Mobile, and Jalen Taylor, affiliate president of the Black Child Development Institute in Colorado.
Here are three key takeaways for educators from the panel discussion:
1. Recognize biases in technology design.
Panelists said that before educators can contribute to solutions that create more inclusive tools, they must first understand what the problem is.
“There are stark differences in some of the things we need to consider as Black parents, as people for Black children, and as people who actually teach and educate Black children,” King said.
Technology is shaped by the priorities, preferences, and biases of the people who create it, Austin said, and it often reflects their experiences. And the people who create the technology often look nothing like the students who use it.
Facial recognition is a prime example. Austin said these systems are often not built with the safety and security of Black people in mind, and there have been instances where the technology has misidentified or misclassified Black people.
2. Find out how technology impacts students.
AI is so pervasive that Roberts' students bring it up without even being asked.
He recalled one occasion when a student with learning differences, who had never been particularly interested in projects and presentations, suddenly became eager to do them. The student presented the project, and Roberts was amazed at its quality.
He asked the student if he had used AI, and the student said yes. Instead of reprimanding the student, Roberts said, he used it as a teaching opportunity: He asked the student what tools he had used and whether he could show the class how he used them.
Roberts said it's important to teach students about AI, because they will need to know how to use it effectively in the future.
“As educators, we all need to think about the world of tomorrow, not the world of today,” he said. “When my students complain about a rule or a lesson, I have to say, ‘I need to think about who you’ll be at 22 years old, not who you are at 10 years old.’”
Young people also have fears and anxieties about this technology, Taylor said, so it is important that they develop knowledge of and confidence in AI tools. They need to learn not only how to spot bias when using these tools, but also how to use them in ways that strengthen their skills.
3. Advocate for better designs and standards.
Panelists said educators have a responsibility to be aware of the impact technology has on the children they teach. That means local leaders and policymakers must support teachers in learning more about AI, they said.
Our society has been slow to recognize the negative impact social media is having on youth mental health, Taylor said. When it comes to AI, we need to monitor its effects from the start.
“At various levels [developers, district leaders, classroom teachers], we need to ensure good communication about AI implementations,” she said, “to alleviate the problems that are present.”
Developers and educators must ensure that AI systems are trained on diverse data sets and that learners from diverse backgrounds are involved in creating these tools, King said.
Policymakers should also create standards to guide the creation of these tools to ensure they do not harm subgroups, she added.
Austin said the job shouldn't fall solely on the shoulders of Black educators.
“We need everyone’s voice at the table,” she said.