Late last week, it was announced that Arizona State University had partnered with OpenAI in a deal to provide ChatGPT-4 for free to "approved" users (access currently costs individuals $20 per month).
The partnership fits ASU's self-branding as an innovative institution, though the specifics of how ChatGPT will be used were somewhat vague, perhaps because no one yet knows how to get started with this stuff. In a report published at Axios, ASU's Chief Information Officer said, "ASU plans to use ChatGPT to build personalized AI tutors and provide writing assistance to students in one of our largest classes, Freshman Composition."
When this story started spreading on social media, the sound you heard was thousands of first-year writing instructors gasping as if the Empire had just fired up the Death Star.
When it comes to innovation, one of ASU's most enduring goals appears to be reducing or eliminating the labor costs associated with teaching general education courses such as first-year writing.
In late 2014, ASU announced a plan to unilaterally increase the teaching load of all part-time writing instructors from 4-4 to 5-5 without any increase in pay. I did some poking around at this on my blog and elicited the following comment from ASU's Senior Director of Media Relations and Strategic Communications: "Recommended class loads fit the academic ideal, but are a luxury that universities educating tomorrow's growing population and workforce cannot afford."
Instructors were able to organize and secure some concessions on course loads and salaries, though class sizes still far exceeded disciplinary maximums. In 2015, ASU announced plans to outsource general education to edX and create an "all-MOOC freshman year." That initiative, the Global Freshman Academy, was quietly downsized in 2019. As Lindsay McKenzie reported at Inside Higher Ed, the online courses initially enrolled hundreds of thousands of students, but "after four years, few have completed the courses and very few have paid to receive college credit for their efforts."
Now that we have generative AI and ChatGPT, ASU clearly sees a new opportunity to replace human labor, this time through algorithmic automation.
I'm going to cut ASU some slack by assuming that using ChatGPT to tutor students in freshman writing is the kind of idea that surfaces in a brainstorming session among people who don't know, or haven't thought much about, what it means to teach students to write, and that there is currently no evidence this is anything more than a half-formed notion.
I would like to explain why, in my opinion, it is a bad idea and therefore should not be done.
ChatGPT-4 may seem astonishing, but it's important to remember that it is an automated phrase-generation algorithm. It does not think. It does not feel. It does not read. It does not even understand. It has no style, and no grasp of writing within a rhetorical situation. Some people like to claim that large language models are a form of intelligence, but at their core they are nothing like human intelligence.
The human-like characteristics we detect in the output of large language models are entirely a byproduct of humans projecting those characteristics onto the automation. Yes, you can prompt GPT to simulate any type of feedback, including what would appear to be instructor comments on student assignments, but the process by which those comments are derived is nothing like the process a human instructor uses.
This distinction matters if we want education to mean anything.
Comments generated by GPT are simulations. They are counterfeit; they do not truly communicate. I think we've all heard about the "hallucination" problem, where AI produces false or inaccurate information. But it's important to recognize that, from the AI's perspective, everything is a hallucination. It has no capacity to distinguish real from fake, true from false. That good advice may emerge from time to time, or even often, does not matter, because there is no genuine intention of meaning or communication behind the generation of those phrases.
That large language models can generate comments that pass as substantive at all is largely a reflection of how impoverished our ideas about what writing means have become in school contexts, especially in grades 8-12, where standardized assessments dominate and the chief interest is demonstrating a few authorial moves that fit a prescribed template.
This is academic cosplay, not writing. It is a phenomenon I explored at length, and I believe rather convincingly, in Why They Can't Write: Killing the Five-Paragraph Essay and Other Necessities. There would be absolutely no temptation to use ChatGPT as a tutor or teacher if we had not strayed so far from what is truly meaningful in a writer's development.
This desire to automate what could and should be a human response is also a further sign of the dismissal of the labor of the primarily part-time faculty who teach courses like first-year composition. As I've seen firsthand, these people often do excellent work under conditions that are generally hostile in terms of pay, workload, and job security. Even entertaining the notion that generative AI can do their work demonstrates the depth of contempt some administrators have for labor that should be considered core to their institutions.
Do I sound frustrated? I hope so.
I would also warn tenured faculty, who are unlikely to be immediately affected by AI tutors deployed in Freshman Composition, that the slope here is indeed slippery. Just as the spread of adjunct positions has steadily diminished the quality and autonomy of full-time positions, so too will allowing automation to replace human labor. If you allow the work to be devalued for some of the people who do it, it will be devalued for everyone who does it.
Concluding that generative AI tutors can and should be used to help students write is an announcement that you have abandoned the work of teaching. You are now in the business of processing students in automated batches.
Perhaps this is the inevitable future of higher education in this country. ASU is on more solid financial footing than many of its state-university peers precisely because it has embraced approaches like automated, per-student batch processing to increase scale and revenue. This does not mean learning doesn't happen at schools like ASU. I am sure many faculty are doing their best within the system, but it is undeniable that the values of such a system are anti-educational.
We'll hear that this technology is meant to supplement, not replace, human instruction, which is nonsense when it comes to teaching writing. Every use is a replacement. Declaring human labor a luxury is a choice to direct resources toward algorithmic automation rather than human interaction.
It's a bad choice, one that ultimately leads to the total abandonment of anything resembling a true educational mission.
ChatGPT can absolutely be a useful tool for writers, but if writing matters to you as anything beyond generating text for a grade as part of a larger credentialing process, you have to treat writing as a truly human activity, not merely an act of phrase generation.