Collaborating with ChatGPT: Some Key Insights
I learned that ChatGPT can do more things (but I'm not worried).
I give a lot of presentations about AI and writing these days as part of my work in writing across the curriculum (WAC). To prepare for one of my recent workshops, I spent some time mocking up an assignment you might see presented in a first-year composition (FYC) classroom to see how ChatGPT could be prompted to write that assignment.
In this post, I’ll share my experiment and some of the key insights I’ve taken away from it—including how I’m (still) not worried ChatGPT is going to take away my job.
The Assignment
While I didn’t write an entire assignment, I created the bare bones of an assignment called “Ethnographic Case Study of Everyday Writing” with goals to:
invite students to see themselves as writers and understand how writing mediates activity in their lives by studying their own writing
help students identify rhetorical strategies for future writing situations
Requirements of the assignment included:
1,000 words
Discuss a personalized writing log
Describe research methods for this essay
Reflect on your own writing and what it means to be a writer
Utilize headings
Cite readings from class about writing theory
Demonstrate effective grammar and syntax
The component parts that a student might be asked to turn in/complete along the way could be:
Daily writing log notes
Two written paragraphs on themes you saw in your notes
Quotes and notes from course reading materials that feel useful for the essay analysis
First draft of the essay
Peer response of essay
Revision & peer response reflection
Final draft for grading
ChatGPT’s 1st Draft
To start, I asked ChatGPT to write the essay with the following prompt (trying to estimate what a quick prompt from a freshman student could look like):
“can you write me an ethnographic case study (1,000 words) on how writing is used in my everyday life? please describe methods, include specific references to writing tasks, and use language that digs into my thinking process during this ethnographic case study”
I’ll link ChatGPT’s first draft here. A few thoughts:
The headings are confusing, though they do create an organizational structure that I’d be happy to see an FYC student provide.
The tone sounds mechanical, and it lacks substance and meaning.
Ideas and examples are generic and do not engage with the research and scholarship.
Missing a section about what it means to be a writer, as well as direct references to the writing log
So How About Draft 2?
What if I, as this hypothetical first-year student, noticed the mechanical tone in the draft and prompted ChatGPT to be more “personal” in its response, and to add that forgotten section about being a writer?
Here is the second draft, and some further commentary:
Headings are still a bit weird. I don’t know if “thoughts during the process” accomplishes what ChatGPT thinks it does (or wait…it isn’t doing so with some thought in mind! It’s just predicting what comes next)
The tone feels less mechanical now, even if it still has a formal flair that I don’t often see even from students further along in their undergraduate degrees.
The added paragraph about being a writer sounds a bit like something out of a bad Hallmark movie, and it’s vague, but it technically does address the prompt.
Did not address the writing log issue (yet—don’t you worry!)
Draft #3, Anyone?
The incredible thing about this tool (for better or for worse!) is its ability to regenerate and adapt based on what you tell it. What if, noting that the draft lacked any citations, you asked it to add some? You could prompt it to:
Quick note: the sources listed in this prompt are freely accessible on the internet, and are ones I might assign my students to read, since I believe in the power of open access publishing.
And here’s what we’ve got for this draft:
Correct APA in-text citations, with reference-list entries that are close but not fully correct
Accurate and appropriate references to the scholarship when discussing writing processes and procedures
Effective paraphrasing of sources (which is a very hard skill that takes time and practice to learn)
Still not addressing the writing log issue (but will in our next example)
Final Draft 4 with Instructor Feedback
But what if I now get instructor feedback on my draft and ask ChatGPT to respond to it and do a couple of extra things, like address those pesky headings and actually incorporate the writing log that I kept (and can then feed into ChatGPT)?
Here is our fourth and final draft for this activity. Some notable thoughts:
Better headings (less is more, actually)
More specific examples from the writing log: ChatGPT took even brief descriptions of journaling and added detail to them
Effectively grouped and categorized themed tasks from the writing log
Maintained that personal tone that feels more approachable
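For readers curious about the mechanics behind this draft-by-draft exchange: under the hood, a ChatGPT conversation is just a growing message history that gets resent in full with each new request, which is why draft 4 can “remember” the feedback from drafts 1 through 3. Here’s a minimal sketch in Python, assuming OpenAI’s `openai` client package; the model name and prompt text are illustrative placeholders, not what I actually typed:

```python
# Sketch of the iterative drafting loop: each revision request is appended
# to a running message history, and the whole history is resent every time.

def add_turn(history, user_prompt, assistant_reply):
    """Record one round-trip: the student's prompt and the model's draft."""
    return history + [
        {"role": "user", "content": user_prompt},
        {"role": "assistant", "content": assistant_reply},
    ]

history = []
history = add_turn(history, "write me an ethnographic case study on my everyday writing...", "[draft 1]")
history = add_turn(history, "make the tone more personal and add a section on being a writer", "[draft 2]")
history = add_turn(history, "add citations to these open-access readings: ...", "[draft 3]")
history = add_turn(history, "here is my instructor's feedback and my writing log; revise: ...", "[draft 4]")

# Each new request would send the full history, e.g. (not run here, and
# assuming the openai package):
# client.chat.completions.create(model="gpt-4o", messages=history + [new_request])
```

The point of the sketch is simply that nothing in this loop involves the model rereading the assignment sheet or remembering the student; the “memory” is the resent transcript itself.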
Overall Takeaways
I’ll admit that this activity really surprised me. Even just a few months ago, I didn’t believe that ChatGPT could seriously engage with sources in a way that I as an instructor would think was decently passable.
I was wrong.
The tool was easily able to scrape the accessed sources and plug them in where appropriate, building not on an actual understanding of the material but on where and how words tend to come together across the millions of data points it possesses. If I were to read this coming from an 18-year-old, I’d think it was good enough. I’d see them trying and engaging with text, finding connections and expanding from point to point.
Pre-AI, I probably would have been happy with this and would have assumed that my student had learned a great deal. This is of course all made possible by the sources being open access, but the truth is that a lot of us are moving toward assigning open access and freely available materials, and even resources behind a paywall could be sold (I’m looking at you, Taylor and Francis).
Am I worried about my teaching now? Nah.
For me, this final product isn’t the most important part.
Let’s go back to those component parts along the way, where students are keeping that daily log, submitting their notes, pulling out quotes from our class readings that seem useful, engaging in peer response, submitting drafts along the way, etc.
That’s where the learning is happening, between the steps and across time as a student notices that they’ve been writing a lot of text messages and that those text messages look a lot different from the emails they write in their internships (and how).
Students are reading and reflecting on what they’ve read in class, which is informing what they’re writing about, whether they realize it or not (and which we would explicitly discuss in class together).
If I’m scaffolding this assignment as thoroughly as I usually do, I’m not seeing the student’s full draft for the first time at the end. I’ve seen their initial ideas in the margins of their notes, in their reading response assignments, in their first drafts submitted and their revision timelines created.
These AI tools can do a passable job at creating this final product, for sure.
But those final products aren’t the only things that matter.
Students are learning and thinking over time as they partake in assignments and mull over thoughts and ideas. This is harder to measure and not always seen and valued by students, but that’s the real winning ticket here (and one that I think we as writing teachers need to do a better job articulating, both to our students and to our other stakeholders).
AI can do more than I thought….but it can’t do everything, and I’m still needed as a writing educator.
This is the main thing that I took away from this activity and exercise. AI is not a better writer or teacher than I am, and I may just need to get a little creative and tweak the way I do things in my classroom.
This might look like:
Having students submit notes and quick thought-pieces throughout a writing project
Asking students to submit audio files talking about their ideas and reflecting on their progress and process
Inviting students to write in class and share insights with their peers informally and more formally via peer response assignments
Assigning points throughout the process so that the entirety of the project grade does not rest on that final draft
Directly talking with students about AI and even working through a scenario just like the one I described here, showing where AI falls short as well as why it’s actually more useful to hear from the student themselves
There’s so much more I could say here, but to keep this already-long post nice and concise:
AI can do more things…but we can still help teach students effective processes.
Maybe this process includes AI….or maybe it doesn’t.
That’s the fun part we get to figure out in our courses and in our work places, and it’s worth experimenting to see what this might mean for us as educators, employers, and members of society.
Now, back to my regularly-scheduled programming of writing a research article (without using AI!).
Love how you emphasize that the final draft is not the only thing we’re measuring here. Learning is a lot harder to identify, but with all the touch points you mention, it’s easier to pinpoint when it is happening, which is helpful for both you as the instructor and your students.
And that learning is what will serve your students long-term, well after they’ve left your classroom.
I sometimes use generative AI for my fiction, but not for the actual writing. I’ll ask it to spit out some prompts, then it’s up to me to identify which ones are actually worth a damn. It’s pretty good at simulating audiences, too, like if I’m looking for feedback from a particular perspective.
I’ve also used it for nonfiction writing (website copy, email marketing ideas, social media posts), identifying trends in successful viral content, and that sort of thing. This stuff, in particular, is mind-numbing to me, so it helps a lot to have a tool that can make it easier.
I’ll never trust AI on its own, but as long as I keep its shortcomings in mind, it can help me overcome my own limitations, which I think is pretty cool.