
The Synthesis: Valerie Veatch Connects AI and Eugenics in ‘Ghost in the Machine’

“Back Up a Layer”

A CGI-rendered, Kool-Aid-style juice jug stands on a convention stage before an audience of men wearing lanyards. Ghost in the Machine. Courtesy of Sundance Institute

Valerie Veatch traces the links between AI and eugenics in Sundance-premiering Ghost in the Machine, a self-funded documentary treatise made with Zoom calls and conviction

Welcome to The Synthesis, a monthly column exploring the intersection of Artificial Intelligence and documentary practice. Co-authors shirin anlen and Kat Cizek will lay out ten (or so) key takeaways that synthesize the latest intelligence on synthetic media and AI tools—alongside their implications for nonfiction mediamaking. Balancing ethical, labor, and creative concerns, they will engage Documentary readers with interviews, analysis, and case studies. The Synthesis is part of an ongoing collaboration between the Co-Creation Studio at MIT’s Open Doc Lab and WITNESS.


Premiering at Sundance, Ghost in the Machine is a burningly urgent, self-funded documentary treatise tracing the historical and ongoing links between AI and eugenics. It draws on nearly forty Zoom interviews with historians, scholars, computer scientists, and human rights activists, collaged together with archival clips. The film draws a fat sharpie-marker line across time to make the lineages clear.


The film slaps key conceptual post-it notes on the wall: the term “Artificial Intelligence” was concocted as a marketing term to raise research funds; the concept of intelligence and the field of statistics were constructed to justify white supremacy; and the current promises of superintelligence are merely pseudoscience drawing on those legacies of hype and racism.

We Zoomed with the film’s director, Valerie Veatch (Me at the Zoo, 2012; Love Child, 2014), before she headed to Park City, to learn how she pulled this film together with duct tape, a boxcutter, and a lot of conviction. The following interview was edited for length and clarity.


DOCUMENTARY: How did you come to make a film that connects AI with eugenics?

VALERIE VEATCH: A dear old friend of mine signed me up for this artist group with OpenAI Sora. I didn’t even have a laptop, as I was heavy into parenting, living in a field in the middle of England. It was really exciting, because there were all these artists on this Slack channel, and everybody was experimenting with this new creative tool that’s democratizing storytelling. And then, the darkness came in.

It was immediately clear that there were huge racial stereotypes in the outputs. It makes prompting embarrassing and hard to do, because there’s a violence in the image you’re receiving, and you didn’t mean to make that depiction. Then there were hypersexualized depictions of women, unprompted. You could be prompting “a woman standing in a coffee shop,” and with each iteration of the prompt, she’s losing more clothes.

I started corresponding with the company’s artist liaison, and they said, “Here’s a document you can put all your research in.” I was finding all this interesting work from researchers such as Abeba Birhane on poisoned datasets and bias in multimodal models, and I was like, “Oh my gosh, OpenAI must not know about this! I’m gonna tell them.” And then there was an incident in the group that made it very clear to me that they did not care. And, in fact, it was cringe for me to be bringing this up. In the end, around January last year, I was referred to a third-party DEI specialist who deals with equity initiatives in tech.

I was overcome with this grief around how this group of people have so much cultural power. The truth is, it’s not filmmaking. It is not an intentional production of images, which is what we do. It’s the racist, sexist hallucinations of a dataset. And since I started making the film, the integration of generative AI into every piece of software that we have is shocking. If you look around enough, you begin to encounter this narrative of superintelligence that all of these huge AI companies are selling.

And that is the second crime that this moment holds. It’s this message, not only to us, but to our children, that a computer can think, and that it will replace humanity. That is a lie. As soon as we touch that idea slightly and push it over, it evaporates into pseudoscience and fantasy, which is what our film kind of explores. At the end, what we’re left with is not superintelligence, but rather this infrastructure that’s not going to be used for creativity or to create sentience, but to enable a certain kind of technological system that we all live inside. 

It’s this message, not only to us, but to our children, that a computer can think, and that it will replace humanity. That is a lie.

— Valerie Veatch

D: The film has a really urgent punk aesthetic, with a mash-up of tons of interviews, clips, and archives. Can you talk a little bit about this creative choice you made?

VV: I just didn’t get funding for it, because this story is not something that people want to fund. And maybe a year from now, it will be different. But this is a consistent theme throughout my work—it is difficult to get tech-negative projects funded, especially now that all of the distribution companies are tech companies.

So I grabbed my webcam and started reading white papers and contacting people, and I wasn’t sure if a Zoom call would make a compelling film. I trust my own filmmaking; I’m an editor, and I know I can make something good, but I didn’t know if it would be coherent, hang together, or feel like a movie. There’s a reality check inside this film. If you look at the Netflix style guidelines, along with Showtime and HBO… in order to fulfill those guidelines, you have to spend at least $1 million. It becomes this hurdle. If you look at the stories that have the funding and deal with this issue, their conclusion is to let the tech bros keep doing their tech. If that’s not your conclusion, then it’s too outsidery, and too scary, and too threatening.

D: What was your budget? 

VV: It’s been all self-funded at this point, with a bit of grant money to cover archival. I can self-distribute. I think there’s something that happens to a film when it becomes proprietary.

D: Throughout Ghost in the Machine, in the top right-hand corner of the screen, you have a label that displays either “NOT AI” or “AI” as a disclosure. It’s a brilliant feature… but at the end of your film, Dr. Jonathan Flowers encourages all of us to ask: “Why do we need AI in anything at all?” So, why did you need AI in your film?

VV: There are two ways to contextualize the use of AI in the film. First, this whole project started out with very open-hearted, pure engagement with the OpenAI artist group, learning a new technology. That betrayal left something unresolved creatively for me, so it’s used in the film as a punchline. When we’re talking about superintelligence, and then we see [AI-generated images like] a guy putting on two pairs of glasses and a jug of Kool-Aid dancing, it punctuates the horribleness, and the emptiness, and the wrongness of what we’re talking about. As a filmmaker, I see that immediately, and I’m like, that’s hilarious. 

I also understand that, as an audience, there’s so much information coming at you that this mechanism could be disorienting, or can get lost on some people, and I’m fine with that. To me, it was very obvious what was AI and what wasn’t, but my friends and family couldn’t tell. So I decided I’d label everything—first all of the “Not AI” stuff, then eventually the AI stuff too. I love that mechanism, because it addresses the cognitive effort of trying to decipher if something is real or not.

I would frame the use of AI in the tradition of feminist video art that used the aesthetic of the patriarchy to critique the patriarchy. At Sundance, we’re gonna have these little white buttons that say “NOT AI” that we will pass out to everybody at the screenings. We have tote bags that say “NOT AI” too. 

I would frame the use of AI in the tradition of feminist video art that used the aesthetic of the patriarchy to critique the patriarchy. 

— Valerie Veatch

D: You’ve said that you have shared the film with some family and friends and audiences that may not have a high level of AI literacy. How do you hope this film will connect with folks that don’t have that kind of context? And more broadly, what can a documentary do to contribute to the growing movements to expose the harms of AI? 

VV: People who are conservative, who are Trump voters, have watched the film. And I’m telling you, from the front row, their hearts are changed. When you depict white supremacy, when you trace this mechanism and show how it is interwoven with AI, you can’t separate these stories.

The story of the Kenyan data workers is something that I didn’t know before I started making the film. The whole ecosystem of AI is rotten. Even if the end products don’t bother you, and you’re okay with women being undressed non-consensually on X, fine—then we back up a layer. What about the people who are tasked every day with training these datasets, moderating this content, and organizing this data? It’s often super harmful and always done in very exploitative labor conditions.

There’s the environmental impact of these hyperscale data centers, which is extremely real, touching every community. I look at this film like it’s a mural. When you stand back and you see the whole thing, you’re like, “Ew, why would I use this technology?” Marshall McLuhan said the medium is the message. That’s the generative AI in the film. It sends its own message of grotesquery—that this is gross, and using it is gross.
