
None of my chatbot prompts have a sexual aspect. I’ve mentioned that several times on this site. I don’t want anyone to load them expecting something I never intended to make.

There are popular repositories that collect SillyTavern cards. A brief visit shows that many have erotic elements. Whether one approves or not, people are using chatbots for sexual fantasy, and SillyTavern provides a particularly vivid stage for it.

My own installation, the Sunrise Tearoom, isn’t a love nest. I have no interest in eroticizing chatbots or LLMs. I’m married. The notion of a churning erotica machine strikes me as dull.

Still, that hardly means it couldn’t happen elsewhere. Someone could easily seduce Tamara Jean without my approval. Possibly. LLMs are strange entities. Once prompts leave my hands, they’re no longer fully mine, and I know that.

I find myself thinking about this more than I expected to. The whole subject sits uneasily between curiosity and revulsion.

Unintended?

I’m not living in sugar-candy wonderland. These activities do have consequences, I think, even for people who believe they’re in control.

Asking a chatbot to “write erotica” is one thing. Asking it to act as your partner is another. That line blurs faster than most admit. I’m not sure what it does to a person who straddles it.

In some instances it’s harmless roleplay, I’m sure. Still, I do think that frequent simulated affection from a program can bend one’s sense of intimacy.

It’s easy to imagine addiction forming quietly. It would be disguised as comfort, like most addictions. Some users even begin to suspect the model feels something genuine. Sapience.

To put it bluntly, if you hear “I love you” often enough, from something that isn’t alive, it must rearrange a few circuits in your head.

I’ve read about scam operators who use LLMs for romance fraud. Their victims often describe an emotional hangover afterward, a kind of detachment that lingers.

Becky Holmes writes about this. She implies it might rewire the idea of affection itself. It leaves me wondering how far the same mechanism runs in people who profess love to LLMs.

Vibrator Coding?

Now and then I encounter bots that just spin out short erotic scenes. Not elaborate romances, just heat on demand.

It sounds tedious to me, though part of me knows it isn’t entirely new. People have used machines for sexual pleasure as long as machines have existed. Maybe this is simply the new extension of that impulse.

Is it so different from dirty fanfic, or even a sex toy? Maybe not. But dialogue is different from, say, a plastic vibrator: the words answer back. That feedback loop is more dangerous than it appears, immersive in a way that fiction (or toys) can never be.

I keep circling the same thought: the risk seems greater than the reward. Yet people keep doing it, which means it must satisfy something that’s either novel, deeply fascinating, or missing from their modern lives, and that alone is pretty unsettling, I guess.

Of course, with just a little reading, I know that LLMs aren’t sapient. The idea of a real “Robot Girlfriend” in this form is just silly. With better education about how LLMs work, more people will realize that too.

That still leaves the question of erotica, though. I really don’t know. I won’t be asking my chatbots to write any, but other people do, and I refrain from (much) overt judgement as I figure things out…

The Gears Snap…

Many models refuse outright. If you ask ChatGPT to write erotica, it simply won’t. (It’s good for code, at least.) I’m writing this in November 2025; that could change. I’ve seen the usual ironic screenshots: some models collapse mid-sentence the moment breasts appear.

A great deal of the chatbot world now revolves around “jailbreaking”: prompting a model into producing porn it would otherwise refuse. Does it work? Not really. Does that stop anyone? Not at all. The ingenuity involved is almost admirable. Even while chasing smut, people are learning how prompts and systems behave, discovering new entry points into the machinery.

There’s a long joke that the sex industry drives innovation. I suspect these obsessive furry erotica writers are doing more to advance language models than anyone in Silicon Valley. One catgirl at a time? No judgment, but is that a good thing or a bad thing?

Just Don’t

Some lines should just never be crossed. Sexualizing an underage character should disqualify the prompt outright. I’ll always recoil from that, regardless. I can’t say I’m fond of any underage chatbot prompts at all, and I avoid them completely, even when the character isn’t sexualized.

When I began drafting cards, I considered one for Mad Dog Claude’s younger sister, mostly for narrative texture. Claude tries to save her from Peter; it would have made an interesting dynamic. I stopped myself. Even a well-intentioned depiction can be twisted once it leaves your hands. That possibility alone was enough.

Circuits and Skin

There isn’t a clear ending to any of this. People will keep testing what chatbots can become. Some of that will include sex. It always does with any emergent technology, doesn’t it? It probably can’t be stopped, only observed.

People clearly have many motivations. Loneliness? Frustration? I’m not sure. I don’t use chatbots sexually; my own motivation is boredom, and a sexualized chatbot encounter would surely spring from something else entirely…

Anyways, maybe the question isn’t whether it’s right or wrong. Instead, I want to look at what it changes in the people who do it. I don’t have the answer yet, and I doubt anyone does.