Grindr’s AI ‘wingman’ has privacy experts concerned

We tried the tool — made in partnership with SF-based AI firm Ex-human — and it’s pretty terrible

Everything is AI now — and we’re in hell.

Grindr deployed its so-called AI wingman to a larger user base in recent weeks, promising 1,000 users access by the end of the year. Within a couple of years, the company promises, the tool will suggest long-term relationship prospects, help set up restaurant reservations, and, uh, date other AI wingmen. But for now, it’s just a plain ol’ chatbot.

The AI wingman is an opt-in feature, and isn’t available to the masses quite yet. But it does illuminate the future of what Grindr (and, likely, dating apps writ large) will be: AI-generated slop to distract from the fact that these services themselves are increasingly awful at connecting people.

This is the first product in Grindr’s partnership with San Francisco AI firm Ex-human, whose elevator pitch promises “empathetic AI characters that keep conversations lively and engaging.”

Grindr CEO George Arison is betting big that this will be the future of the platform and its growth path forward. Arison told the Wall Street Journal last year that Ex-human can make its product “more gay” by training the model with Grindr data; AI boyfriends are allegedly on the way, too. The AI tool is a feature, he promised in an interview with CNBC earlier this year, that will build on top of a core product that “users really love.” 

Neither is really true.

Perhaps the most significant concern is Grindr’s historically loose privacy measures. The app has screwed over its users in many ways over the years, allegedly providing details about users’ HIV status, sexualities, and location data to third-party advertisers and Catholic groups. Grindr was fined $11.7 million by a Norwegian government agency for its alleged privacy violations.

“In all of these scenarios, what we’re getting into is data surveillance,” said Paige Collings, the senior speech and privacy activist at San Francisco-based nonprofit Electronic Frontier Foundation.

The company’s piss-poor history with regards to user privacy worries Collings, especially because the service’s primary user base is largely queer men and trans people.

“There are real life consequences for very vulnerable communities,” Collings told Gazetteer SF. “And then in a place like the U.S. where there is, perhaps, a weakening of protections for LGBTQ communities, the risk of this over the next few years is heightened.”

In a help page describing its machine learning use cases, Grindr states it will use data like ethnicity, gender, sexual position preference, and other intimate information to “power machine learning algorithms,” all but confirming a Platformer report earlier this year that it’s training AI models on personal user data.

There’s also the issue of how the large language models that power AI generally work: It is tough, if not impossible, to remove a user’s data once it’s been incorporated into the algorithm.

Calli Schroeder, senior counsel at privacy nonprofit Electronic Privacy Information Center, or EPIC, told Gazetteer she worries that any Grindr data sent to third-party companies may be incorporated into data sets used to build other chatbots and AI tools, increasing the risk of data breaches. (EPIC filed a complaint with the FTC last year against Grindr over its privacy violations.)

Even if the data is anonymized or aggregated, Schroeder points out that it’s easy to combine different types of data — especially in the sensitive categories that Grindr collects — to identify users. 

“There's a lot that can get pulled from that, and I think people don't always anticipate that,” she said. “It shouldn't have to be your full-time job to keep companies from being creeps.”

Here’s the thing: The Faustian bargain of most internet services is that if a product is good, people will use it, whether or not it carries existential or direct risks. That goes double for an app for people looking to get lucky (and Grindr remains the most popular LGBTQ dating app in the world, even as the likes of Sniffies gain market share).

I got beta access this week, and gave it a whirl. My review? The ‘wingman,’ at least in its current incarnation, is at best a nuisance. At worst, it’s a reminder of how enshittified Grindr is.

As with most AI chatbots, the conversation is stiff. But Grindr’s is particularly drab, all corporate and buttoned-up, the artificial intelligence equivalent of a Bank of America float at Pride. Any use of queer slang reads like this classic Meg Stalter video. And it doesn’t do what it promises, even as it hoovers up all this intel on its users: It still mostly offers prefab tips on how to initiate conversations, and doesn’t actually help you find a date or a lay directly.

Perhaps most insulting is that it has been trained to know that using Grindr is a terrible experience. When I got access to the bot, I asked it about the platform’s many well-documented problems, chief among them the unbelievable number of pop-up ads, rampant discrimination, and spambots and OnlyFans promoters. I asked it if Grindr is a good app. It responded, “Grindr’s like that hot mess friend we all have — flawed but fun as hell.”
