
Writing in the age of AI

with Lizzie Callaway and Scott Black

Why learn to write in the age of artificial intelligence? Elizabeth Callaway, Assistant Professor of English at the University of Utah, talks with Scott Black about writing pedagogy with and about AI.

Episode edited by Ethan Rauschkolb. Named after our seminar room, The Virtual Jewel Box hosts conversations at the Obert C. and Grace A. Tanner Humanities Center at the University of Utah. Views expressed on The Virtual Jewel Box do not represent the official views of the Center or University.

  • [This transcript is automatically generated and may contain errors.]

    Scott Black: Is our writing class obsolete? Why bother to learn to write in the age of AI? Welcome to the Virtual Jewel Box Podcast of the Tanner Humanities Center.

    I am Scott Black, Director of the Tanner Humanities Center, and today I am joined by Dr. Elizabeth Callaway, Assistant Professor of English here at the University of Utah. Dr. Callaway is the author of Eden's Endemics: Narratives of Biodiversity on Earth and Beyond, as well as essays on environmental humanities, digital humanities, science fiction, and contemporary literature.

    She is also a wonderful teacher, and in the past few years has designed groundbreaking classes addressing AI, including an exceptional and innovative first-year writing class called Writing for Humans in the Age of AI. No one is thinking more creatively about the implications of AI in the writing classroom than Dr. Callaway, and I'm delighted to talk with her about writing in the age of AI. Welcome, Lizzie.

    Lizzie Callaway: Thanks, Scott.

    Scott Black: There's a lot of pressure on writing classes now. Both professors and students are wondering: what's the point? Could you give us some background on your class? How did you conceive it, and how is it designed?

    Lizzie Callaway: Well, I think, first of all, you started off with a question that you just have to address. I start off my class asking the question: why write? Why learn to write in the age of AI? And I start off with a picture that I love. I have a mini video lecture, and I start them off with a picture of my daughter traveling in Spain, and she's writing in her journal and sketching. She's looking out a window at a church tower. And I ask them: why would she spend hours doing this—because she does—when ChatGPT can do this? And I show them what ChatGPT can do. Along with DALL·E, it can make a beautiful sketch and a really nice journal entry about a day in the Catalonian Pyrenees.

    And we just start from that question, because ChatGPT will give you a piece of writing, and it will give you a piece of writing quickly. And actually, there are a variety of LLMs that can give you a piece of writing. But I think the fundamental question that I want my students to consider is: what do they get out of writing that’s not just a dead piece of writing at the end of it? So we get to talk about what this process of sketching, journaling, paying close attention to the world and to herself does for her—and hopefully can do for us—as we focus on texts that reward multiple readings, on our own thoughts, and on the world around us.

    So writing ideally is a process that does something to us and for us—not just something that produces a piece of writing at the end of it.

    Scott Black: Yeah, that's a really important point. If all we're doing is asking our students in our classes to produce an essay, we've missed the point that the essay is supposed to serve. And I think that's one of the interesting things about AI. It's asked us to reconceive—or maybe not reconceive, but rearticulate—what we want students to do when we want them to write an essay. And as you said, it is about the process. It is about the kind of attention we bring to our work—to whether it's the world around us or a text in front of us.

    So how do you do that in your class? Do you want to talk a little bit about the course itself? One of the fascinating things about the class is that it's both about AI but also practice in using AI.

    Lizzie Callaway: Yeah. Ideally—I say this at the end of the course description—ideally we use AI to help make our writing more human rather than merely mechanical, rather than writing that exists just to be a dead piece of writing at the end of it.

    And what I think is one of the biggest differences between this writing class and other writing classes that might also look at AI is the fact that we don't just look at LLMs. Of course we start with LLMs, because they're the innovation that brought this whole conversation to the forefront around November 2022, when ChatGPT was released.

    But we look at writing as a series of widening perspectives. So first we look at the writing itself and how LLMs can do the writing itself—what they might be good at, what they might not be good at, what we're cheating ourselves out of, what transformation we're missing for ourselves if we just let a machine do the writing for us.

    But we also look at writing in this larger context of AI. LLMs are not the only AI that is making us write mechanically like machines. If you just think of search engine optimization or trying to get something viral on what used to be Twitter or YouTube or TikTok—any kind of composition now is on a tilted table where you're writing towards and for an algorithm rather than other people.

    In addition, the algorithm curates your information scape. So as much as writing is about writing, it's also about research and learning and reading. And so we look at all the algorithms and the search engine AIs and the recommendation engine AIs that are curating our information scapes. And then we look even further out—at labor and the environment, and how different types of writing throughout time have had different impacts on labor and the environment.

    So we try to take a really broad look at AI, not just at LLMs.

    Scott Black: I think that's right, and I think it's really important for students to demystify—or for us to demystify—AI. Students sort of enter search strings in a prompt and then they get out a piece of writing. I don't know if many people, and certainly not our students, understand how that process works—both globally, literally globally.

    Many AIs have been developed in third-world countries, in Africa, by very low-wage workers. They've been trained in all kinds of different ways. There's a whole—there's literally a mechanical infrastructure, electronic infrastructure, and of course an energy cost—a huge energy cost—to running AI. All of these are aspects of what looks like this magical piece of software in our computers that really have effects that people have made choices about, and we should understand that as part of our being responsible citizens.

    But it also is important to understand what this tool can do, and what maybe it shouldn't do—which gets back to what we want people to do when we teach them to write.

    Lizzie Callaway: Yeah. When you were talking about the labor cost of AI and the development in the third world, I was just really thinking of how this reinforcement learning with human feedback is usually what they’re doing. That is a process that’s required for any AI to sound remotely human. A lot of times it’s like an A/B test: if you want it to be good at any particular task, it’ll produce two options, and then a human tells it—this one or that one—which is better, to help refine an LLM and get it better at any task.

    But this job has been stripped of anything that makes a job humane. Not only is it kind of exploitative in the way that you might think of sweatshop labor, but you don’t have any colleagues. You aren’t allowed to talk to anyone about what you do. You don’t know what you’re doing. You’re just following a set of arbitrary rules.

    There’s this great article that you actually brought to my attention, Scott—Inside the AI Factory by Josh Dzieza—where the journalist goes and does one of these trainings. He’s labeling pictures: is it clothes or is it not clothes? Is it clothes real humans can wear or not? So doll clothes are not clothes. A costume is not clothes. A picture in a magazine of someone wearing clothes is not clothes.

    So he goes, “Oh, I know what’s clothes and not clothes.” And then there’s a picture of someone with a reflection and he says, “That is not clothes. That is a reflection of clothes.” But it turns out that is clothes. So you just have to memorize all these arbitrary details.

    And in the same way, I do feel like we dehumanize ourselves when we write to the algorithm—the social media kind of writing, the rant that’s going to get shared a lot—versus the carefully considered, complex exploration of a topic.

    And I do have the students experiment with Quick, Draw!, which is an AI by Google. It gives you a prompt—it's Pictionary—you draw it, and it guesses in 20 seconds or less what you’re drawing. And then we have a discussion: did you start drawing to the algorithm? Did you start thinking, “What’s the quickest and most efficient way to draw this so it will get it,” instead of just drawing it the way you naturally would?

    Scott Black: Yeah. And did they find themselves actually trying to adapt to the algorithm?

    Lizzie Callaway: Yeah. And we connect it to autocomplete, which I find highly offensive—but other people use. It’s time-saving: maybe you weren’t going to phrase it exactly that way, but you can just press Tab instead of typing out the rest of your sentence. So there are all sorts of ways that our writing is becoming more mechanical without us even knowing it—even before LLMs.

    Scott Black: Yeah. And to be clear, there are times where that’s helpful. There are times where I don’t need to write a memo from scratch because it’s either boilerplate or just needs to be done. That doesn’t involve me. So I think it’s a mixed bag. And I think your class also has people use AI sort of in the middle of the writing process rather than at the end of it—which can be helpful. I think a lot of writers I’ve heard from, like creative writers, find it can be really helpful with early drafts. None of them are happy with those drafts, but that’s where the actual work, you know, comes in.

    Lizzie Callaway: Yeah. Have you read the Co-Intelligence book? I try to read things from people who love AI too, because I could be missing things. Ethan Mollick thinks it’s great because he doesn’t have to read bad writing—like bad proposals—anymore. He rightly points out that LLMs have now been able to pass, with flying colors, nearly every test we have of human creativity. And he also points out that most of its ideas will be bad, but that’s your job as the person and the editor and the one in charge—to go through all the ideas and get rid of the bad ones and know what’s a good idea.

    I think students have found it really helpful in things like brainstorming. I do wonder if it’s better at brainstorming monetization ideas or company ideas than it is for, I don’t know, a close reading paper. But it’ll come up with a lot of ideas, and something might spark your excitement or interest or creativity. And you don’t have to do the job of coming up with all the bad ideas—you can just sift for the good ones.

    Scott Black: And of course, when you brainstorm, when you actually draft papers, they’re full of bad ideas. So getting some help with that doesn’t sound terrible.

    Lizzie Callaway: Yeah.

    Scott Black: So okay, this is what AI can do. Why do students then need to write? And this gets back to what we started with. One of the things that AI has forced us to do as English professors—and I’m sure in most other fields—is ask ourselves, what exactly do we want out of a paper, if what we want out of a paper isn’t just a paper?

    And I think, for myself, and I think for a lot of our colleagues, we’ve been asking: what does this mean? What are our assignments for? Or more generally: why do we need to write—again from the very beginning—why do we need to write if we have this tool that will do it for us?

    Lizzie Callaway: Yeah, and I think that’s one of the best things that has come out of LLMs—the fact that we’re just asking this question. And then I have to go back to basics. I think just the fact that we’re thinking: what are we asking for? What do we want? What do we want students to get out of this process?

    Because writing used to just be a way—it used to be evidence of thinking.

    Scott Black: Right.

    Lizzie Callaway: So that was really nice, when we could use that as a proxy for thinking. We wanted them to be able to think long, deeply, with great concentration about topics—and we can’t just use writing as a proxy for that anymore. So I’m curious actually what you think, Scott. Do you have ideas?

    Scott Black: I have ideas. Do you want mostly bad ones? But we can brainstorm right here. Then we can ask AI to clean it up for us. So one of the things that I’m really interested in is making life more difficult for students. I really mean that. There are lots and lots of ways they can make their lives easier right now, especially in school.

    But part of the challenge of education is learning that what you produce isn’t actually the point. And I think this works all the way through school. I think in classes, all of us are giving assignments not so the assignment gets finished, but so that students learn something. And more generally, students are getting the degrees—ideally, I say—not because they need the credential (although they do), but because that credential stands for something they’ve learned.

    And I think reminding students that they’re in college and in class to be educated—not just to get the credential—is important. Of course, a lot of students don’t care about their education. Not a lot—I would say a few—but most of them understand they’re here to develop themselves in some way: to develop their habits of learning, hopefully develop their curiosity, their ways of grasping the world. And that’s something really—despite what we do in our class—they can only do for themselves.

    So we’re offering them occasions to develop themselves in important ways. I think writing is one of those tools that only really works if you’ve learned to do it yourself. And what I mean by “works” there isn’t necessarily to produce a great paper, but to develop the kind of close attention, thoughtful reflection, and honestly the stringing together of multiple ideas into one larger idea—rather than going for the simplest answer.

    I think those are all the skills that writing is uniquely trained to develop—but only work if you do them yourself or if you’ve learned to practice them for yourself.

    Lizzie Callaway: And do you think they need that skill?

    Scott Black: I think they desperately need that skill. I think we need that skill as a world.

    Lizzie Callaway: Yeah.

    Scott Black: I think people individually… People always do what they love. They practice what they enjoy. And one of the challenges for a writing class is we have to make it worth students’ time in doing it. I don’t think it’s hard, because writing can be about anything. And asking students to practice it on something they love is not a chore for them, but actually a chance to indulge whatever their particular mode of fandom is.

    Lizzie Callaway: I know. And it’s more fun to read as an instructor too.

    Scott Black: Absolutely.

    Lizzie Callaway: Because I’ve been learning so many things. I had them last week write a letter to their AI handler. I have a friend who calls the recommendation algorithms his handlers. He’s like, “Oh, the handlers really want me to know about cute cats on pianos these days.” So they had to write a letter to their handler about their feelings about what’s being recommended to them.

    And many of them had a similar structure: “Thank you for these things, and I have a few requests going forward.” But it was fun. People were very honest about the kinds of things they wanted to see more of. Someone mentioned barnacle-scraping videos, and I had to hold myself back from looking those up.

    Scott Black: Should I ask what a barnacle-scraping video is?

    Lizzie Callaway: I think it’s just scraping ship hulls, and it’s satisfying to watch them get cleaned of the crustaceans on them.

    Scott Black: That sounds fabulous—except for the poor barnacles.

    Lizzie Callaway: I know. But they were so thoughtful about asking, “I’d love you to slow it down and not show me interesting things after 11 p.m.” Or, “I don’t like how every once in a while you try to throw something really sensational in there or a hot-button issue. I’m only here for the kitten videos. There are better ways to keep me on platform than making me feel upset and scared.”

    Scott Black: And it also reminds us that valuable writing is personal.

    Lizzie Callaway: Yeah.

    Scott Black: I think this is something that as English professors we’re trained to forget—I mean, almost literally trained to forget—because we want them to sound professional, or whatever that version of authority is. And they’ve really internalized this. I had a professor in college who used to tell us, “I don’t want you to sound like your daddy”—that sort of image we have of what authority should sound like on the page. That’s not interesting. It wasn’t interesting before AI, and it’s even less interesting now.

    Lizzie Callaway: Yeah. And that is how default LLMs usually sound—if you haven’t prompted them to sound different.

    Scott Black: I actually think it’s how artificial intelligence sounds in general. It’s sort of very grandiose. It’s very general. Very vague.

    Lizzie Callaway: Exactly. And yet when you write about things from your own perspective, it’s very particular. It’s very enthusiastic. One way or another. It could be enthusiastic or really negative. But that’s what makes a person sound like a person.

    Scott Black: I know.

    Lizzie Callaway: I feel like it’s really hard because the book I mentioned earlier, Co-Intelligence, thinks AI is wonderful—that the LLM can write in Mollick’s style. But he’s a professional writer. When you’re a student and you don’t know what your style is yet, it’s hard to know whether the AI has gotten it right or wrong, or has a style at all. It usually doesn’t, really. It’s very bland and vanilla.

    Scott Black: And part of the point is that Mollick has a style. In other words, he could train an AI on what he’s written.

    Lizzie Callaway: But you’d have to have written something.

    Scott Black: Exactly.

    Lizzie Callaway: Exactly.

    Scott Black: But I think style—that’s actually interesting, because that’s precisely the thing that we’re asking students to develop: their own voice. And that also means to develop their own personality and develop their own self. And of course, I mean—sorry, 18-year-olds—but 18-year-olds don’t yet have a fully developed personality. And part of what they’re doing in school is working that out and learning what they think, but also how they sound.

    Lizzie Callaway: And I think we’ve just made it harder for them.

    Scott Black: Yeah.

    Lizzie Callaway: Because—

    Scott Black: They’re off-the-shelf models.

    Lizzie Callaway: Yeah, yeah.

    Scott Black: Yeah.

    Lizzie Callaway: That will sound authoritative—

    Scott Black: Right.

    Lizzie Callaway: —even when they’re wrong. I think one of the things I also like about writing assignments is that I’m the audience, and they know I’m the audience. And so it’s also a chance to write to something that’s not an algorithm. There is going to be a human being reading this at the end of it. And I feel like it hopefully frees them up to write in a style that’s them, and not a style that’s, I don’t know, hoping to get picked up by an AI and recommended.

    I also, sadly, do know that they’ve been writing to AIs their whole education. I have fifth-grade twin daughters, and their writing—their state writing exams, for example—is all graded by an AI. And the teachers are very open about it. They’re like, “So, when the AI’s looking at your writing, it’s going to be, number one, looking at length.” Which is so backwards.

    So I do feel like, by the time I get them as college students, they’ve been a little bit broken by this system, where they have been taught to write to an AI. There are AIs that can write for them. And I don’t know—hopefully, just having a human audience is something different for them.

    Scott Black: Now I’m really depressed. So much of what we’re doing in college English is actually trying to address what was done in high school language arts.

    Lizzie Callaway: Yeah.

    Scott Black: And it’s not the LA teachers—they’re brilliant. They’re heroes.

    Lizzie Callaway: Exactly.

    Scott Black: It’s the system that they’re having to work within.

    Lizzie Callaway: Yeah. And to be fair, my girls’ fourth-grade teacher was amazing. He and I—volunteering in the classroom—read every single piece every single person wrote and kind of edited it in front of them and taught them. At this point, they’re learning what’s a full sentence and what’s a sentence fragment. And he just wanted to be transparent with them: “When you take your state test, you’re writing to a machine. This is what the machine cares about. Here’s how to game that system with your writing.”

    And they watch their score actually as they’re doing it. They see—they add another sentence and their score goes up whatever percentage. But it’s not because it’s a good sentence or contributes something new or is unique. It was just more words.

    Scott Black: Wow.

    Lizzie Callaway: It’s very sad.

    Scott Black: It’s really sad.

    Lizzie Callaway: But my daughters write in their journals too, like I said.

    Scott Black: Yeah. So that actually—that returns us to the point. So why do they write in their journals?

    Lizzie Callaway: Other than me being a fabulous mother?

    Scott Black: Of course.

    Lizzie Callaway: They write in their journals so I can brag about it to my students, Scott. I’ve come up with all these ideas. It’s a chance to pause. If you’re writing about something, you notice things about the world that you wouldn’t if you didn’t take a break and actually go through the slow process of writing about it. It’s a chance to form memories. If you write something out—especially by hand, like she was—you’re dumping it into long-term memory. So 20 years from now, you might remember this trip to the Pyrenees.

    You’re also entering a conversation. I mean, a journal’s not a great example of that—I might read it, or she might read it again 20 years from now and remember her 8-year-old self writing in that journal. As a scholar, definitely one of the reasons I’m writing is to enter into a conversation—to say something back to people who have said things about the objects I care a lot about.

    And it’s also a chance to get to know yourself better—to think about what you like, what you don’t like, what is a worthy use of your time. I also think of her writing in a journal as making sense out of a day. A day is just things that happen until you put a narrative on it and really create a sense of what your day was, where it’s been, where it ended up—kind of form a personality. Like you were saying, a writing personality is sort of just a narrative of yourself.

    I came up with all these reasons that I think she writes—or both of them write—but I finally asked her. And I said, “You know, we make AI do dragon paintings for us sometimes, and you know it can write like this. Why do you write in your journal still?”

    And she said, “Well, there are things I want to say. And the AI’s not going to say the things I want to say. And I want to say them. And it can produce something that’s also a piece of writing—but it’s not the thing I had imagined and wanted out there on the page.”

    Scott Black: That’s great. It’s really great.

    Lizzie Callaway: Yeah.

    Scott Black: She should be an English teacher.

    Lizzie Callaway: Maybe she will be.

    Scott Black: I wouldn’t condemn her to that. But I think all of that is really, really important. And I really feel that the slowness of the process is really part of the point. What you are willing to take time over is what you care about—what is most worthwhile for you, and what’s most formative for yourself.

    Which is why, if you can get people to pay attention to the things they love, they will spend more time and slow down and really explain what they’re thinking—and frankly, have their own thoughts, as you said, as your daughter said.

    Lizzie Callaway: Yeah. I do feel like attention is the crux of it—what you’re willing to put your attention on. I also think there’s a war over attention, because all these other AIs I’ve been talking about are plugged into the attention economy.

    There’s this new book out, The Sirens’ Call—

    Scott Black: Mmhmm.

    Lizzie Callaway: —by Chris Hayes?

    Scott Black: Chris Hayes, yeah.

    Lizzie Callaway: And I’m reading it, and he just points out so many things that make so much sense when someone says them out loud. He was on this other podcast I like about energy called Volts, and he said, “You could go in a room with 500 people and ask any one of them, ‘Hey, can you get everyone’s attention?’” And anyone could. You just start yelling, or dancing, or taking your clothes off, he says—do anything. Anyone could do it.

    But there are very few people who you could say, “Okay, now I want you to hold these 500 people spellbound for an hour.” And it would take a very, very special person to be able to do that at all.

    So what we have is an attention system where it’s just grab, grab, grab—no holding. Just grabbing, over and over again. And we’re used to our attention being grabbed. And I think it’s making it harder to consciously and intentionally place where we want to put our attention—and hold our attention.

    Scott Black: Hold our own attention.

    Lizzie Callaway: Hold our own attention, yeah. And I think what we lose from that is everything we lose from writing—when we don’t do the writing ourselves.

    Scott Black: I think that’s really important. And I completely agree that attention is now the most valuable currency. And I mean that—well, I shouldn’t say currency—it is being monetized, but it’s also our most important resource for ourselves.

    And you know, for years the humanities has sold itself as “better information,” as if we’re participating in the discovery of new things, as I think science does and science should. But I’m not so sure that’s—or ever really has been—what makes us valuable. I think it’s exactly that we learn to do something other than grab new things or search for new things.

    We’re actually committed to the preservation of things—and the development of the ability, as you said, or as Chris Hayes said, to hold things. To be able to practice being ourselves in our own ways. Which, in other words, means not just being reactive or responsive, but actually deciding for ourselves what we care about, what we value, what we want to do with our time—or maybe at this point, just to have time for ourselves.

    Lizzie Callaway: Yeah. And yeah, where we want to put our attention. I love talking to Scott because every time I talk to you, I come up with a new idea about what the humanities do that I’ve never heard before. And I think I’ve always been like, “Well, knowledge production—just like everyone else doing knowledge production.” But I love the idea of attention training.

    Scott Black: I really— I mean, I strongly believe this. But also, it’s one of the problems with the way the university is set up, because we have to justify ourselves in terms that have to do with what we can create. And one of the things we can create is more knowledge. Honestly, there’s nothing but knowledge out there.

    Lizzie Callaway: I know. There’s a lot of knowledge about.

    Scott Black: And it’s mostly not knowledge.

    Lizzie Callaway: Right.

    Scott Black: It’s just information. So there’s the famous Walter Benjamin essay about the storyteller, where he talks about the difference between novels and stories. And he says novels are just information. What we actually need are stories. And I think that’s what we should be thinking about in why we do what we do. And now I’m talking about English, but I think it would work with lots of other humanities fields.

    We’re not just there to teach you new things—we’re teaching you how to understand for yourself. And central to that, as you just said, is the practice of close attention. And your own attention, as your daughter said.

    Thanks, Lizzie. This was really fun. It’s great to talk about why we want to write—and why we want to make things difficult for our students in the age of AI. Thanks very much for the conversation.

    Lizzie Callaway: Thank you, Scott. Bye-bye.