Justine Bateman’s Fight Against Generative AI In Hollywood
Stephen Cass: Hello and welcome to Fixing the Future, an IEEE Spectrum podcast where we look at concrete solutions to some big problems. I’m your host, Stephen Cass, senior editor at Spectrum. And before we start, I just want to tell you that you can get the latest coverage from some of Spectrum’s most important beats, including AI, climate change, and robotics, by signing up for one of our free newsletters. Just go to spectrum.ieee.org/newsletters to subscribe.
The rapid development of generative AI technologies over the last two years, from deepfakes to large language models, has threatened upheavals in many industries. Creative work that was previously believed to be largely immune to automation now faces that very prospect. One of the highest-profile flashpoints in creative workers’ pushback against digital displacement has been the months-long dual strikes by Hollywood writers and actors. The writers recently claimed victory and have gone back to work, but as of this recording, actors and their union SAG-AFTRA remain on the picket lines. Today, I’m very pleased to be able to speak with someone with a unique perspective on the issues raised by generative AI, Justine Bateman. Some of you may remember Justine from her starring role as big sister Mallory in the 1980s hit sitcom Family Ties, and she’s had a fascinating career since as a filmmaker and author. Justine has also displayed her tech chops by getting a degree in computer science from UCLA in 2016, and she has testified before Congress about net neutrality. She is currently SAG-AFTRA’s advisor on AI. Justine, welcome to the show.
Justine Bateman: Thank you.
Cass: So a lot of industries are being affected by generative AI. How did writers and actors become the focal point in the controversy about the future of work?
Bateman: Well, it’s curious, isn’t it? I guess it was low-hanging fruit, because I feel like tech should solve problems, not introduce new ones like massive unemployment. And also, we have to remember that so much of this, to me, the root of it all is greed. And the arts can be a lucrative place. And it can also be very lucrative in selling to others the means by which they can feel like “they are artists too,” in heavy quotes, which is not true. Either you’re born an artist or you’re not. Either you’re gifted at art or you’re not, which is true of everything else: sports, coding. Either you’re gifted as a coder or not. I’ll tell you this even though I have a computer science degree. I know I am gifted as a writer, as a director, and in my previous career as an actor. I am not gifted in coding. I worked my butt off. And once you know what it feels like to be gifted at something, you know what it feels like to not be gifted at something and to have to really, really work hard at it. So yeah, I did it anyway, but there’s a difference. And in that direction, there are many people who’d like to imply that they are gifted at coding, with generative AI giving them a solution to that. Yeah.
Cass: So by “they” here, you’re really locating your beef with companies like OpenAI and so on, more so than perhaps the studios?
Bateman: Well, they’re both complicit. Sam Altman and OpenAI and everyone involved there, those that are doing the same at the Google offshoots, Microsoft, which is essentially OpenAI, I guess. I mean, if most of your money’s from there, I don’t know what else you are. Where else? Where else? I know DALL-E, I believe, is on top of OpenAI’s neural network. And there’s Midjourney. There’s so many other places. Meta has their own generative AI model, I believe. This is individuals making a decision to pull generative AI into our society. And so it’s not only them, but then those that subscribe to it that will financially subscribe to these services like the heads of the studios. They will all go down in the history books as having been the ones that ended the 100-year-old history— well, the 100-year-old entertainment business. They chose to bring it into their studios, into the business, and then everyone else. Everyone else who manages multiple people who is now deciding whether or not to pull in generative AI and fire their workforce, their human labor workforce. All those people are complicit too. Yeah.
Cass: When I looked up SAG-AFTRA’s proposal on AI, the current official statement reads, “Establish a comprehensive set of provisions to protect human-created work and require informed consent and fair compensation when a digital replica is made of a performer or when their voice, likeness, or performance will be substantially changed using AI.” Can you sketch out what some of those provisions might look like in a little more detail?
Bateman: Well, I can only say so much because I’m involved with the negotiations, but let’s just play it out ourselves. Imagine if a digital replica was made of you. You would want to know: what are you going to do with this? What are you going to have it say? What are you going to have this digital replica do? And how much are you going to pay me to essentially not be involved? So it kind of comes down to that. And at the bare minimum, granting your permission to even do that, because I’m sure they’d like to not have to ask for permission and not have to pay anybody. But what we’re talking about, I mean, with the writers and the directors, it’s bad enough that you’re going to take all of their past work and use it to train models. It’s already been done. I think that should be absolutely not permitted. And if somebody wants to participate in that, they should give their consent and they should be compensated to be part of a training set. But the default should be no, instead of this ******* fair-use argument on all the copyrighted material.
Cass: So yeah, I’d love to drill down a little bit more into the copyright issues that you just talked about. With regard to copyright, if I read a whole bunch of fantasy novels and I make what is clearly kind of a bad imitation of Lord of the Rings, it’s like, “Okay, you kind of synthesized something. It’s not a derivative work. You can have your own copyright.” But if I actually go to Lord of the Rings and I just change a few names here and there, or maybe rearrange things a little bit, that is considered a derivative work. And so therefore, I’m not entitled to the copyright on it. Now the large language model creators would say, “Well, ours is more like the case where we’re synthesizing across so many works that we’re creating new works. We’re not being derivative. And so therefore, of course, we reserve the copyrights.” Whereas I think you have a different view on that in terms of these derivative works.
Bateman: Sure, I do. First of all, your first example is a person with a brain. The other example is code. Code for a for-profit organization, multiple organizations. Totally different. Here’s the biggest difference. If you wanted to write a fantasy book, you would not have to read anything by Tolkien or anybody else, and you could come up with a fantasy book. An LLM? Tell me what it can do without ingesting any data. Anything? No, it’s like an empty blender. That’s the difference. So this empty blender— I think these companies are valued at $1 trillion? Less, more? I don’t know right now. And yet it is wholly dependent, and I believe I’m correct, wholly dependent on absorbing all of this. Yeah, now it’s just going to be called data, okay? But it’s really copyrighted books. And much of what you output is, by default, copyrighted. If you file it with the copyright office, it makes it easier to defend that in court. But they’re scraping everybody else’s work.
Now if this LLM or a generative AI model was able to spit something out on its own without absorbing anything, or it was only trained on those CEO’s home movies and diaries, then fine. Let’s see what you can do. But no. If they think that they can write a bunch of— quote, “write a bunch of books” because they’ve absorbed all the books that they could get a hold of and then chop it all up and spit out little Frankenstein spoonfuls, no. That is all copyright violation. All of it. You think you’re going to make a film because you have ingested all of the films of the last 100 years? No, that’s a copyright violation. If you can do it on your own, terrific. But if you can’t do it unless you absorb all of our work, then that’s illegal.
Cass: So with regard to these questions of likenesses, and basically turning existing actors into puppets that can say and do anything the studio wants, do you worry that studios will start looking for ways to bypass human actors entirely and create born-digital characters? I’m thinking of the big superhero franchises that already have plenty of CGI characters that are pretty photorealistic. The completely human ones are maybe still a little uncanny valley, but how hard would it be to make all those human characters CGI too? And now you’ve got replaceable animators and voice actors and maybe motion-capture performers instead of one big tentpole actor who you maybe really do have to negotiate with because they have the star power.
Bateman: No, that’s exactly what they’ll do. Everything you just said.
Cass: Is there any way within SAG-AFTRA’s remit to prevent that from happening? Or are we kind of looking at the last few years before the death of the big movie star? Maybe the idea of the big movie star will become extinct, and while there’ll be human actors, it’ll never be that Chris Pratt or J. Law level of performer again.
Bateman: Well, everything that’s going to happen now with generative AI, we’ve been edging towards for the last 15 years. Generative AI is very good at spitting out some Frankenstein regurgitation of the past, right? That’s what it does. It doesn’t make anything new. It’s the opposite of the future. It’s the opposite of something new. And a lot of filmmaking in the last 15 years has been that, okay? So the audience is sort of primed for that kind of thing. Another thing you talk about is big movie stars. And I’m going to name some others, like Tom Cruise, Meryl Streep, Harrison Ford, Meg Ryan. Well, all these people, with the exception of maybe Harrison Ford, really hit it in their 20s. Now who in their 20s is a big star? Zendaya? Oh, the actor who’s in Call Me By Your Name. The name’s slipping my mind right now. There’s a couple, but where’s the new crop? And it’s not their fault. It’s just that they’re not being made. So we’re already edging towards that. The biggest movie stars that we have in the business right now, most of them are in their late 40s, early 50s, or older. So we’ve already not been doing that. We’ve already not been cultivating new film stars.
Yeah. And then you look at the amount of CGI that we just put on regular faces, or plastic surgery. So now we’re edging closer and closer to audiences accepting a full— not CGI, but a full generative AI person. And frankly, in a lot of the demos that I’ve seen, you just can’t tell the difference. So yeah, all of that is going to happen. And the other element that’s been going on for the last 10, 15 years is this obsession with content. And that’s thanks to the streamers. Just churn it out as much as possible. The note that I’ve heard Netflix gives to people I know who are running TV shows is, “Make it more second screen.” Meaning, the viewer’s phone or laptop is their first screen, and then what’s up on their television through the internet connection, on Netflix or Amazon or whatever, is secondary. So you don’t put something on the screen that distracts them from their primary screen, because then they might get up and shut it off. Somebody coined the term visual Muzak once. They don’t want you to get up. They don’t want you to pay attention. They don’t want you to see what’s going on.
And also, if you do happen to look up, they want to make sure that if you haven’t been looking up for the last 20 minutes, you’re not lost at all. So that kind of thing, generative AI can churn out 24/7, and also customize it to your particular viewing habits. And then people go, “Oh, no, it’s going to be okay because anything that’s fully generative AI can’t be copyrighted.” And my answer to that is, “Who’s going to be trying to copyright all these one-off films that they just churn out?” They’re going to be like Kleenex. Who cares? They make something specifically for you because they see that you like nature documentaries and then dramas that take place in outer space? So then they’ll just do films that combine— all generative AI films will combine all these things. And for an upcharge, you can go get scanned and put yourself in it and stuff. Where else are they going to show that? And if you screen record it and then post it somewhere, what do they care? It was a nominal cost compared with making a regular film with a lot of people. And so what do they care? They just make another one for you and another one for you and another one for you. And they’re going to have generative AI models just spitting stuff out round the clock.
Cass: So the economics of mass entertainment, as opposed to live theater and so on, has always been that the distribution model allowed for a low marginal cost per copy, whether that’s VHS cassettes or reels that are shown in the cinema and so on. And this is just an economic extension of that all the way back to production, essentially.
Bateman: I think so. But yes, and if we’re just looking at dollars, it is the natural progression of that. But it completely divorces itself— or any company engaging in this completely divorces themselves from actually being in the film business because that is not filmmaking. That is not series making. That doesn’t have anything to do with the actual art of filmmaking. So it’s a choice that’s being made by the studios, potentially, if they’re going to man the streamers and if they’re going to make all AI films. Or they’re right now trying to negotiate different ways that they are going to replace human actors. That’s a choice that’s being made, essentially, to not be in the film business.
Cass: So I’m not terribly familiar with acting as a professional discipline. Can you tell people with a tech background a little bit about what actors really bring to the table in terms of guiding characters, molding characters, moving it beyond just the script on the page, however that’s produced? What’s the extra creative contribution that actors really put in beyond just, “Oh, they’re able to do a convincing sad face or happy face”?
Bateman: Sure. That’s a great question. And not all people working as actors do what I’m about to say, okay? Every project should have a thesis statement, or an intention. In coding, it’s like, what’s the spec? What is it you want this code to do? And for a script, it’s: what’s the intention? What do you want audiences to come away with? Fine. And the writer writes in that direction. Regardless of what the story is, there’s some sort of thesis statement, like I said. Director, same thing. Everybody’s got to be on board with that. And what the director’s pulling in, what the writer’s pulling in, everything, it’s like a mood and circumstances that deliver that to the audience. Now you’re delivering it, ideally, emotionally to them, right? So it really gets under their skin. There are a lot of films that any of your listeners have watched that made a big impact on them. That’s when it’s a great actor: you really get pulled in, right? And when, say, somebody’s just standing in front of the camera saying lines, you’re not as emotionally engaged, right? So it’s an interesting thing to notice next time you see a film, whether or not you were emotionally engaged. And other things can contribute to that, like the editing or the story or the cinematography and various things. But yeah, bottom line, the actor is a tour guide. Your emotional tour guide through this story. And they should also support whatever that thesis statement is.
Cass: So in your thesis for your computer science degree, you were really bemoaning, I think, Hollywood’s conservatism when it comes to exploring these technologies for new possibilities in storytelling. Do you have any ideas of how some of these technologies could actually work with actors and writers to explore fun new storytelling possibilities?
Bateman: Absolutely. You get the prize, Stephen. I don’t think anybody— yeah, I know I have that posted still. It’s from 2016, so this is a while ago. And yeah, it is posted on my LinkedIn. But good for you. I hope you didn’t read the entire thing. It’s a long one. So of course, I mean, there’s a reason I got a computer science degree. And I love tech. I think there are incredible ways that it can change the structure of a script. And one of the things I probably expressed in there is what I call layered projects, instead of having a story that’s written out in a line, because that’s the way you’re delivering it in a theater: you’re watching the beginning and then the middle and then the end. Delivering a story that’s shaped more like a tree. Not choose-your-own-adventure, but rather the story is that big.
And yeah, anyway, I could talk for a while about the pseudocode of the designs of the layered projects that I’ve got, but that is a case. All those projects that I’ve designed that are these layered projects, where we’re using either touchscreen technology or augmented reality, they service the thesis statement of my project. They service the story. They service the way the audience is perhaps going to watch the story. That is where I see technology servicing the artists, such that they can expand what they’re wanting to do. I don’t see generative AI like that at all. I see generative AI as a regurgitation of our past work for those who, frankly, aren’t artists. And because it’s a replacement, it’s not— I know there are people, especially the blue-check people, who like to say that this is a tool. And I think, “Well, I forgive you, because you’re not an artist and you don’t know the business and you don’t know filmmaking. You don’t understand how this stuff’s put together at all.” Fine. But blue-check guy, if you think this is just a tool, then I’d like to introduce you to any generative AI software that does code in place of coders. I’m sure there are a lot of software engineers that are just like, “What the hell?”
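[Editor’s note: Bateman doesn’t spell out her layered-project designs in this conversation, and the short sketch below is purely our own hypothetical illustration, not her pseudocode. It shows one way a “layered,” tree-shaped story could be represented in Python: each scene carries optional layers (say, a touchscreen detail view or an AR overlay) and can branch into child scenes, so the story has breadth and depth rather than a single line from beginning to end. All class and field names are invented for this example.]

# Hypothetical illustration only -- not Bateman's actual designs.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Scene:
    title: str
    text: str
    layers: Dict[str, str] = field(default_factory=dict)   # e.g. an AR overlay or a tap-to-expand detail view
    children: List["Scene"] = field(default_factory=list)  # deeper branches of the same story, not alternate endings

def depth(scene: Scene) -> int:
    """Count how many levels the story tree goes below this scene."""
    return 1 + max((depth(child) for child in scene.children), default=0)

story = Scene(
    title="Opening",
    text="The story begins.",
    layers={"detail": "Backstory revealed if the viewer taps the screen."},
    children=[
        Scene("Thread A", "One strand of the same story."),
        Scene("Thread B", "Another strand, explorable in any order."),
    ],
)

print(depth(story))  # 2 -- the story has depth as well as a linear through-line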
Cass: So just to wrap up then, is there any question you think I should have asked you that I haven’t?
Bateman: What’s going to happen after the inferno?
Cass: Oh, what’s the inferno? What’s going to happen after the inferno? Now I’m worried.
Bateman: This is going to get very bad in every sector. This is no joke. And I’m not even talking about— I know there are a lot of people talking about, “Oh, it’s going to get into our defense systems, and it’s going to set off nuclear bombs and stuff.” That may be true. But I’m talking about everything that’s going to happen before that. Everything that’s starting to happen right now. And that’s the devaluing of humans, making people feel like they’re just cogs in some machine, that they have no agency, and that they don’t really matter. And tech is just at the forefront of everything, and we just have to go along with whatever it’s coming up with. I don’t think tech’s in the forefront of **** right now, honestly. And like I said, I have a CS degree. I love tech. I mean, I wouldn’t have spent four years doing all of that if I didn’t. But for Christ’s sake, it needs to sit down for a minute. Just ******* sit down. Unless you see some problems that can actually be solved with tech, it’s going to be destructive in all the ways I just said, in how it’s going to make people feel. It’s going to be taking their jobs. It’s going to infiltrate the education system. Everybody’s going to be learning the same thing, because it’s going to be as if everybody’s at the same university. They’re all going to be tapped into the same generative AI programs. It’s starting to happen now. Instead of a bunch of students learning from one teacher at one school, they’re tapping into certain programs that multiple schools are using. And they’re all learning to write in the same way.
Anyway, all of that is going to crush the structure of the entertainment business, because the structure of the entertainment business is a pipeline of duties and tasks by various people, from conception to release of the project. And you start pulling out chunks of that pipeline, and the whole structure collapses. But I think on the other side of this inferno, there is going to be something really raw and really real and really human that will be brand new, in the way jazz was new or rock and roll was new, or as different as the 1960s were from the 1950s. Because when you think about it, when you look at the 20th century, in all of these decades something specific happened, multiple things happened that were specific, that were really showcased or instigated by the arts, by politics. Everything changed. Every era has its own kind of flavor. And that stopped in about 2000. If I asked you about the aughts, or you had to go to a party dressed as the aughts, what would you put on? I don’t know. What are these decades at all? There are a lot of great things about the internet and some good things about social media, but basically it flattened everything. And so I feel that after this burns everything down, we’re going to actually have something new. A new genre in the arts. A new kind of day. A new decade like we haven’t had since the ’90s, really. And that’s what I’m looking forward to. That’s what I’m built for, I mean, as far as being a filmmaker and a writer. So I’m looking forward to that.
Cass: Wow. Well, those are some very prophetic words. Maybe we’ll see whether or not there is an inferno, and hopefully what’s on the other side of it. But yeah, thank you so much for coming on and chatting with us. It was really super talking with you today.
Bateman: My pleasure.
Cass: Today, we were speaking with Justine Bateman, who is the AI advisor to the SAG-AFTRA actors’ union. I’m Stephen Cass for IEEE Spectrum’s Fixing the Future, and I hope you’ll join us next time.