In 2019, Anne stood in line outside an unmarked office building in Dogpatch, waiting for work to begin. What the work would entail, well, she wasn’t exactly sure.
Anne (who asked Gazetteer not to use her real name, citing strict nondisclosure agreements), and presumably many others waiting that morning on Illinois Street, had found the gig on Craigslist. The listing didn’t offer many details, just to show up to the address, do some simple activities, and walk away with $100 cash.
“It was this quiet area,” recalled Anne, who is in her 50s. “But there were a bunch of people waiting outside, so I figured, ‘This must be it.’ Then I heard all this yelling, like, ‘SHUT UP!’ or ‘LOOK AT ME!’ or ‘HOW ARE YOU?!’”
Eventually, someone led Anne into the building, through two sets of heavy curtains, and into a small room, “which was obviously not soundproof.” The room was empty except for an old TV set. Anne’s instructions appeared on the screen. “It would show certain phrases and tell you to whisper, or scream.” She was instructed to express different emotions: angry, happy, “really sad,” she recalled. She assumes her vocalizations were recorded, possibly on video.
“It was quite fun,” she said.
Anne fake-raged, fake-moped, and fake-smiled through the hour, before heading home with a crisp $100 bill in her pocket, satisfied. “I referred, like, everyone I knew.”
With that gig and subsequent ones, Anne joined a new and curious sector of the gig economy: helping train artificial intelligence on the humans it would eventually serve and, some say, replace.
In addition to scraping huge datasets from video content posted to YouTube, TikTok, and other social media sites, tech companies have contracted untold numbers of short-term human workers like Anne to train their AI for voice models used for customer service chatbots, voice-to-text services, and facial recognition tools. The $100 gigs that Anne and other Craigslist-recruited workers performed were a tiny outlay for companies valued in the billions of dollars, and still growing.
Anne says she gigs full time and finds most of her work on Craigslist. That first gig in 2019, which was posted on the site by a now-defunct voice bot company called Monster AI, was particularly lucrative. But as the AI industry has boomed, the pay for gig workers training the models has declined. Anne said the standard rate now hovers around $25 an hour, which she said is on par with other gig work she does, like staging homes and staffing events.
Over the last six years, Anne has made faces, moved around rooms, conversed in soundproof booths, read, whispered, and mimed at computer screens to train AI models. She has been fitted with headsets rigged up with microphones, motion sensors, sponges, and other devices that allow the companies to collect whatever data they need, which, like so much machine learning technology, is frequently shrouded in secrecy.
When we think of gig work, we usually think of platforms like Uber or DoorDash, where a subcontracted person is directly involved in the consumer experience: Call a car, and a person will pick you up; order food, and a human will, at the very least, text you a photo of the package at your door. But AI-training gigs extract not only labor but biometric data from their contractors, functioning more like medical studies. In fact, they are often advertised as “studies” on Craigslist, in social media posts, and on flyers posted around university campuses.
Unlike medical studies, which are highly regulated, the tech companies soliciting paid test subjects are often evasive about who they are and what they’re building. Anne does not remember the 2019 gig mentioning AI at all. Only in retrospect could she identify it as an AI study, having done other similar trainings recently that are more upfront about their commercial intentions. These days, companies seem more likely to explicitly tag their Craigslist listings under “AI” and “gigs,” but they rarely disclose their names, which participants may only learn once they sign an NDA.
Conduit, an AI company developing a “thought-to-text” brain-computer interface, is one exception; its name is clearly listed in its many Craigslist postings currently calling for research participants.
Gigging for Conduit pays $25 an hour, with higher rates for sessions before 9 a.m. or after 7:30 p.m. The study involves messaging with AI for two hours while wearing a large, brain-scanning headset that looks like a motorcycle helmet.
A Conduit participant named Vic (not her real name), who is in her early 30s and, like Anne, grew up in San Francisco, told me that the helmet is so heavy she always asks her boyfriend to give her a neck massage when she gets home from a training session.
The studies are more than just physically taxing, Vic said; they can also be emotionally draining. The conversations with Conduit’s AI chatbot start off basic, she said, asking about her hometown and profession, but can get invasive quickly.
“They were like, ‘What are your regrets?’ and ‘Tell us about a time you gave up on something and why?’” she said. When she gave vague answers, the AI pushed her to go into further detail.
“It’s kinda too far. Like way too far. And if this information or data does get leaked, it’s nobody’s business to know what I have going on,” she said.
Privacy is a concern for these gig workers. The transparency Conduit displays in its Craigslist postings only goes so far; the company lists its name and explains that the study will help train its AI, but how exactly the participants’ data will be used and protected is unclear.
Earlier this month, I signed up to do the Conduit study. Its research facility is housed in a dingy semi-basement behind an unassuming Victorian at 660 Oak St. Inside, blackout curtains section off the room into dark cubicles. In the front area, an employee named Terry quickly ran through the privacy disclosures, mentioning at the very end that, like at any company, a data leak is certainly possible, and to keep that in mind as I sign the paperwork.
I was unable to participate in the study as I refused to sign the NDA.
Conduit’s headquarters at least had some privacy. Anne said that most of the gigs she had participated in took place in unbranded, rented office spaces in SOMA and Mid-Market. At one gig that paid $15 an hour, which is below minimum wage, Anne was stationed at a desk in the middle of an open-plan room, selling fragments of her interiority to another AI startup while its full-time employees worked around her, as if she weren’t there.
“That felt like, ‘Wow I’m pathetic.’ I’m sitting at the little dunce desk for low pay, while they’re doing all the important work,” she said. Eventually she asked for earplugs, because it was too loud to concentrate. “I was finally like, ‘This is just below my dignity, I can’t.’ This is below minimum wage, and I’m like decades older than all these people working here.”
Some AI training gigs are one-time only, but many allow participants to return several times; while the studies last, they can offer gig workers nearly full-time income.
Last year, Anne spent nearly five months gigging for a company called Wispr Flow, reading the entirety of Hanya Yanagihara’s novel A Little Life three times over: once aloud in a normal voice, then again in a whisper, then for a third time in silence, just mouthing the words. The book is over 800 pages long, and infamously hard to stomach. “Oh my god, it was traumatic,” Anne laughed anxiously, recalling how the monitor would emit a cheerful ping! sound and display “Good job!” while she read graphic depictions of self-harm. When she chose the title off the top of an alphabetically organized list, she had “thought it was just a one-off gig. I didn’t know I’d be coming back!”
At that gig, Anne enjoyed the social aspects. She got to know familiar faces, other subcontractors who were making this their nearly-full-time gig. One woman said she was commuting in from El Cerrito.
Once, on a walk between reading sessions, Anne stopped into a nearby thrift store and struck up a conversation about the study with the man behind the counter. She told him that at the beginning of each session, she was instructed to recite a nonsensical passage — a public domain text known as the “Grandfather Passage” — to calibrate the machine. It went: “You wish to know about my grandfather. Well, he is 93 years old. He dresses himself in an ancient, black frock coat.” Before she finished, he jumped in to recite it with her; he had been doing the study, too.
She even felt a sense of camaraderie with the Wispr Flow team. “It was a great gig,” she said. The people running the studies “were all really young and really smart, and they were really excited about what they were doing.”
Anne, who grew up in Polk Gulch, said she is “not angry at all the tech people.”
“That’s just what they’re into, and I’m tired of everyone hating on everything,” she said. “But I do think it’s had an impact on our daily lives. I mean, I’m not feeling great about where all this AI stuff is going.”
Vic echoed the feeling. “I’m just not a fan of the whole AI thing,” she said. “Soon we’ll all be out of homes, jobs, everything.”
Anne said she feels some cognitive dissonance between her day-to-day experience training AI and the ambient threat the industry poses to the economy and social reality.
“I do feel an element of shame about doing the studies, about contributing to this vast thing that’s going to overtake humans,” she said. “I would not be surprised if I’m held accountable for contributing to a bad thing. Maybe I’ll just view myself at some point as someone who contributed to something that’s doing harm.”
Vic expressed similar mixed feelings about participating in the development of AI. “I do the study because I need the money. Do I care for the study? Absolutely not. Do I think it’s BS? Absolutely. Do I think they’re just trying to use human beings to create human robots? Absolutely.”
But for better or worse, their immediate needs — food, rent, gas — dwarf these ethical concerns.
“Honestly, apart from this conversation, I’ve never really thought that hard about all of it,” Anne told me. “Needing money is a real motivator.”