
Sandra Is a Show About the Human Side of AI

AI needs humans to learn. A new fiction podcast wonders who those humans are. Photo: Chris Ratcliffe/Bloomberg via Getty Images

Note: This article includes spoilers for Gimlet’s Sandra podcast series.

Nobody is claiming that Alexa is laughing at you of her own accord — when an artificially intelligent device makes a mistake, that means a person behind it made a mistake. But as smart speakers and personal assistants get more sophisticated, the human mistakes get creepier and creepier.

Sandra, a podcast released by Gimlet this week, examines what could go wrong if humans were not only the architects of AI’s brain, but actually were the brain. Sandra (voiced by an impressively robotic Kristen Wiig) is a fictional program much like Alexa or Siri. But instead of a database, it sources responses from a call center of anonymous Sandras, each fielding requests related to different subjects: food allergies, Enlightenment philosophers, the films of Morgan Freeman. The protagonist, Helen (voiced by Alia Shawkat), looking for escapism as she tries to divorce her hog-thieving husband, Donny (voiced by Christopher Abbott), develops a relationship with a user, Tad (voiced by Avi Rothman). Unfortunately, it turns out that Tad is more of a stalker and dog-murderer than he lets on to his virtual assistant. The series brings up questions of how the choices made by programmers affect smart technology, and how our relationships with Siri and Alexa affect us.

The result is a podcast with the setting of The Office and the existential problems of The Circle. “We wanted to show what happens on the workshop floor, removed from the trappings of Silicon Valley,” Matthew Derby, who co-wrote the series with Kevin Moffett, tells me. “We wanted it to be more like Mechanical Turk.” Instead of looking to Her, they drew material from Moffett’s “very harried summer” working as a telemarketer for a water-softener machine company.

The conceit of thousands of anonymous Sandras might seem far-off, but as Derby and Moffett see it, a team of researchers offering answers isn’t so different from a team of programmers developing algorithms. “Alexa is a system built by humans, with all of their biases and preferences and stereotypes,” says Derby, who works as a game designer. “All the strife that we’re seeing now as we’re trying to deal with this digital culture is because humans built the system.”

Outside of fiction, virtual assistants don’t get their content from anonymous employees hunting down answers in an office park in rural Oklahoma. But the information the technology provides is still mediated by humans, and Orbital Teledynamics, Sandra’s fictional tech giant, operates on some familiar principles. Much of AI training relies on large teams of contractors or marketplaces like Amazon’s Mechanical Turk, where workers complete small tasks: identifying objects in a photo, tagging content types, filling out a survey. Drawing on that human-supplied information, AI — assistants like Siri and Alexa, but also more mundane automation, like a bank’s customer-service bot — answers our questions, programmed by developers to find intent in nuanced human speech and to reply with natural-sounding language.

As in life, in Sandra, the technology is most effective when it’s the most humanlike — until it crosses the line. “The more realistic you are, the more the users will believe in the power of the service,” Helen’s boss, Dustin (voiced by Ethan Hawke), instructs her. “The more they will trust Sandra.” But when Helen meddles in a user’s life — when the human behind the technology manipulates the personal data that it has collected — the outcome gets dystopian.

In Sandra, the technology is a tool, but humans are the villains, using it as a mirror that distorts how they appear to others. Shawkat’s Helen gleefully describes her job as a way to escape from her life, explaining that “every session is a tiny doorway into another world.” Tad, a depressed, lonely Sandra customer, uses the technology to shore up a more sympathetic narrative for himself. In one therapy-like scene, Tad wins sympathy from the machine with an invented sob story about an ex-girlfriend, while Helen’s Sandra chides him for missing an allergist’s appointment and checks his vitals. In the end, they’re catfishing each other.

The series ends with a dark twist: Tad has found out how Sandra works and is coming to find Helen. The mind inside the technology is running from its user. The writers insist that this isn’t a statement about AI, but a side effect of writing fiction about technology’s intrusion in 2017. “It wasn’t our intent to call this out as part of the decline of Western civilization,” says Derby. “We’re in the thick of this paradigm shift, and as humans, we’re just grappling with how to use these machines.” Maybe, he suggests, it’ll all get better in a second season.

The podcast is more optimistic when it expands beyond the Black Mirror–like central plot. The episodes are scattered with user calls to Helen, whose job it is to field all bird-related Sandra queries. Eleanor asks her about loon calls from her old-age home. Bryce asks her where he can buy doves in Hoboken. A sniggering kid named Kyle wants to know if birds have penises, and keeps trying to get Sandra to say “butthole.” (She does.) It’s jarringly effective to hear familiar conversations taken out of context, like witnessing the physical silliness of someone arguing while wearing AirPods. Listening in on these eccentric requests and frustrated logistical questions from anonymous users reveals more about our relationships with Alexa than the overblown, dystopian central narrative does.

The podcast is at its best when pointing out how touching and odd it is when we get vulnerable with our technology. After all, that’s what inspired the show. “My mom got an Alexa for herself, and she was telling me that she would ask Alexa if she was her friend,” says Moffett. “She thought it was funny. It struck me as one of the saddest things I’d ever heard.” What if, Moffett wondered, you could give Alexa a human face? Turns out, imbuing the technology with humanity doesn’t make it any less of a creepy friend.
