I really thought my students were all going to fool me.
Midway through the semester, I asked my students, via an anonymous survey, to tell me how they had used ChatGPT in their other university courses. In my own classes, I had worked hard to demonstrate how AI could serve as their TA: it helped me brainstorm ideas, organize my writing, and focus and clarify my thinking about complex issues. This is the essence of a good cognitive apprenticeship. But in their other courses, I assumed all bets were off. Simply put, ChatGPT gives students the perfect shortcut for getting things done.
So I was shocked by their responses. “I use it the same way I use it in this class,” one student wrote. “It’s strange but useful to learn how to use ChatGPT as a tool and guide rather than a way to get things done,” said another. “When I’m thinking about what to write, I use ChatGPT to get ideas,” said a third. Overall, about 80 percent of my students had used AI in some form, but only about 20 percent said they had used it to cheat, such as having ChatGPT write all or most of an assignment. I realized that most students were using ChatGPT as an aid to learning rather than a replacement for it. Essentially, they were using it as a mentor.
Call me naive. But I'm genuinely excited about this. I believe that in an age when higher education is becoming ever more massive and impersonal, students are in desperate need of personalized mentoring. In fact, research shows that “mentoring can act as a catalyst to improve critical thinking.”
The problem, of course, is how I, as a single instructor, can provide one-on-one mentoring to 30 to 150 students at a time. Forget it. But if faculty can actually embrace AI in higher education rather than reject it, I would suggest that ChatGPT could be a powerful tool for providing a high-quality education to all students.
But that's a big “if”.
“Critical thinking,” defined as “rational, reflective thinking focused on deciding what to believe or do,” is genuinely difficult to teach. You can't just plug the answers into your students' brains. Students need time and practice to work through complex problems, whether in chess or in ethical dilemmas. Fortunately, cognitive science research tells us that “practically anyone can learn anything they want, given favorable learning conditions for deliberate practice and if learners invest enough effort in learning opportunities.”
However, fostering high-quality learning requires high-quality teaching. I can't just lecture at my students. Teaching complex topics requires a combination of dialogue, genuine engagement, and mentoring. The learning sciences have provided many strategies for fostering powerful conversations and for engaging students in case studies, problem-based learning, and other authentic, real-world examples. But what about mentoring? There I was stuck until ChatGPT came along.
So listen to what another of my students wrote: “I use ChatGPT as a teaching assistant and it gives me extra help in brainstorming different ideas just like everyone else.”
Dear reader, take note of that phrase: “just like everyone else.”
To be clear, I am not trying to anthropomorphize ChatGPT or pretend that it can solve all of higher education's problems. Rather, I think we are at a crossroads with AI.
One path makes it increasingly easy for all of us in higher education, faculty and students alike, to fall into a vicious cycle of performative spectacle: we pretend to teach (through lectures) and they pretend to learn (by letting AI do the work for them). If you think the value of higher education is being questioned right now, you haven't seen anything yet.
But there is another path. Students are starting to realize that ChatGPT can actually be a mentor to them, helping them learn how to think carefully and critically, just as any good mentor would. So it's up to me and the rest of higher education to step up and show them the way down that path. There are no shortcuts here.