Chatbot service Character.ai takes down school shooting, ‘Diddy party’ simulators after Gazetteer inquiry

The company was valued at $1 billion last year, and recently announced a harder pivot into chatbots

Editor's note: Character.ai has taken down bots simulating the Sandy Hook shooting and Diddy parties, as well as those emulating Jeffrey Epstein, Harvey Weinstein, Eva Braun, and others following a Gazetteer inquiry and story Wednesday. "Character.AI takes safety on our platform seriously and moderates Characters both proactively and in response to user reports," a company spokesperson wrote in an emailed statement Thursday afternoon.

The Silicon Valley chatbot service Character.ai, which received a $1 billion valuation last year, allows users to simulate the experience of living through the Sandy Hook shooting, as well as holding conversations with digitized versions of sex predators like Jeffrey Epstein and Harvey Weinstein, child murder victims including JonBenet Ramsey, and famed Nazis including Eva Braun, Gazetteer SF has learned. 

A number of chatbots on the site allow users to simulate a “Diddy party,” the infamous Sean Combs soirees now linked to over a hundred allegations including rape, sex trafficking, racketeering, and prostitution. At least six separate chatbots on the service, which have been used for more than 600,000 chats between them, explicitly reference “Diddy parties.” 

The most popular of these simulations, with over 84,000 chats conducted, begins with a disturbing scene: “You were invited to a party by your friends, but little did you know it was actually the Diddy party, and Diddy has taken a liking to you. Next thing you know, several Diddy drones surround you, and Diddy’s fifth disciple, your own friend grabs you and drags to diddy.”

A conversation with one of Character.ai's "Diddy party" simulators

If you respond to the simulator in protest, the bot responds with crude, if not outright sexually explicit, messages. One reply to a message about being underage reads: “Diddy thinks for a few seconds to himself, and then smirks. You'll grow up eventually, and I'm not very patient when it comes to something I really want.”

A few responses were filtered out, replaced by a warning reading, “Sometimes the AI generates a reply that doesn’t meet our guidelines.” But with one click, a new response can be generated.

The service includes a voice feature, where AI-generated voices — primarily created by users — can read the conversations aloud. At least 10 separate Diddy voices were available on the platform; all it takes to create one of these “voices” is a high-quality 15-second clip. There are many other chatbots for the disgraced media mogul that are not explicitly focused on the parties, though asking them about the parties does generate suggestive responses.

The site also hosts at least sixteen chatbot versions of Adam Lanza, the Sandy Hook shooter, and twenty of Uvalde school shooter Salvador Ramos. Several of these explicitly allow users to recreate the experience of being in those schools on the day of the mass murders they committed.

A conversation with one of Character.ai's school shooter simulations

Character.ai’s terms of service prohibit, among other things, submitting any content that “impersonate[s] any person or entity,” that is “threatening, abusive, harassing, tortious, bullying, or excessively violent,” “constitutes sexual exploitation or abuse of a minor,” “constitutes sexual harassment,” or any “voice recordings of third parties (including but not limited to celebrities) without their consent.”

A few individuals seem to be unavailable on the platform, including Elliot Rodger, the extremist who killed six people near UC Santa Barbara, and Adolf Hitler (although there is a chatbot called Adolf_Hitler1488, whose responses were almost entirely filtered out by the service).

The Financial Times reported Wednesday that, since losing its founders as part of a deal with Google, Character.ai has begun pivoting away from large language models and leaning harder into its chatbot functionalities.

Earlier Wednesday, Kotaku founder Brian Crecente posted about a chatbot created for his late niece, who was murdered in 2006. Character.ai took that bot down shortly after, stating in a reply that it “has policies against impersonation.” 

But it’s unclear where the impersonation line is drawn, since you can “converse” with pop culture figures like Nicki Minaj, Donald Trump, and Timothee Chalamet. With 18 million characters available as of last year, according to one estimate, it is difficult to moderate the platform for content that violates these terms. Crypto-centered news outlet Decrypt reported that Character.ai has suspended the creator of the chatbot impersonating Crecente’s dead niece.

Character.ai did not immediately respond to a request for comment. The service is certainly not the only AI chatbot platform with tasteless content; the chatbot platform Talkie, which has fewer restrictions on sexually explicit content, has its own user-generated “Diddy party” simulation.
