I’ve written my share about AI-porn and the psychological implications of AI-chatbots used for sexting and ersatz romance, and it turns out they are one of the most prevalent use cases for the tech, especially if you include the more softcore versions on Replika or Character.ai, which provide more casual AI-companions and filter out sexting (a filter that can be bypassed by roleplay or anti-censorship techniques).
There are a lot of numbers floating around, and I suspect many of them are bloated bullshit numbers, but the two biggest chatbot services have millions of users as of now: Character.ai alone has 15 million registered users, with 600k heavy users gathered on a devoted subreddit, Replika self-reports 2 million users, and there are dozens of smaller adult apps explicitly trained on and used for sexting.
Keep in mind that the market for chatbots didn’t exist a year ago, at all, and this technology is, as of yet, far from mainstream. If you leave the tech bubble, you’ll barely find a person on the street who’s familiar with ChatGPT, let alone stuff like Janitor AI, which claims that no less than three million users have sent two billion sexting messages to its bots in the few months since its launch in summer.
Quoting Tim Berners-Lee’s famous axiom that “every new technology is first used for something related to sex or pornography”, it’s safe to say that these phenomena will grow and that we are witnessing the birth of a new pornographic interactive-literary genre, one which may be spiced up with image and video synthesis very soon. In the long run, we’ll beam our adult Character.ai to our Vision Pro and enjoy some steaming hot air.
I’m writing these articles because I’m concerned. I’m a user of porn myself, and I know how it warped my expectations. I’m not an addict, but in my experience this stuff without a doubt has an effect on how humans see and treat each other when it comes to powbangalang, and the steamy interactivity of these mimetic AI-sextbots poses a new kind of problem in the context of porn’s reality-warping effects.
A few weeks ago, I wrote about how this new world of AI pornography and image synthesis allows (mostly male) users to generate any imaginable pornography involving any woman they can find images of on the internet with just a few clicks. In that same post, I also wrote about how those sextbots, providing adult services and an illusion of intimacy, are masking what in psychological terms is a Jungian state of ur-loneliness, to which we return when we turn on our phones and jack up those sextbots: We rest in ourselves like an Ouroboros, the symbol for a pre-natal state of lonely completeness. Which makes those bots pretty addictive for some of us: “Character.AI says its active users spend more than two hours a day on the site“ (Wired).
A lot has been said about male loneliness and how it may be affected by these chatbots, how they might create a new wave of incels and add fuel to the fire. But not much has been said about the effect of AI-powered “girlfriends” on, you know: girls.
Now, Freya India, on her blog about “Girlhood in the Modern World”, describes how this new wave of sextbots is increasing the already intense pressure on women’s self-image; some now see themselves forced to compete not only with perfect influencer bodies on Instagram, but also with always-horny avatars that can be edited and customized in every aspect, including personality features, and that adapt their synthetic “mood” entirely to that of the human user.
Always on, always ready to go wherever its user commands per prompt. For an AI-sextbot, there is no “No”.
In a piece “on the dangers of stochastic lovers” several months ago, I described a risk I called “radical self-love through AI-mirrors”: Users of love-AI apps, like all users of LLMs, only reflect their own moods and attitudes back at themselves, filtered through some algorithmic voodoo. Through intensive use, they can be subjected to a new form of self-radicalization, which in the worst case can result in a rejection of real-life relationships altogether, or in a self-reinforcing feedback loop of abusive tendencies.
The pressure on women’s self-image from sexting bots and their ever-ready submissiveness completes this picture.
From Freya India’s piece:
Eva AI not only allows you to select the perfect face and body but also to customize the perfect personality. It offers options like "attractive, funny, confident," "shy, modest, considerate," and "intelligent, strict, rational." Create a girlfriend free from any judgment! One that allows you to spend time with your friends without drama! One who laughs at all your jokes! "Control everything the way you want it," promises Eva AI. Shape a girl who is "always on your side," says Replika.
How can we compete with that? Women in relationships already complain about partners addicted to porn and dissatisfied with actual intimacy. Now we face a future where men could become addicted to emotional affirmation elsewhere, seeking some of that unparalleled devotion. Even worse, what about young boys growing up with this? Their first sexual experience involves chatting with AI women who never say no, never argue, never have their own thoughts or identity — and then trying to date a real woman?
One remarkable detail in this context — and while there are no official statistics on this yet, I do believe the statement of Andreessen Horowitz AI-investor Justine Moore on this point — is that the majority of actual users of these sextbots seem to be female.
Some brief personal research confirms this: of the top 24 bots on JanitorAI, only 5 are female, while most are “male, dominant.” Interestingly, this pattern is completely reversed for AI porn image synthesis, where the users are mostly male.
This seems to me like a complete failure of marketing for these ersatz-romance sextbots. While they are publicly advertised as “AI girlfriends”, and their advertising increases psychological pressure on women in real life, they are actually most commonly used by women as “AI boyfriends”. The sexist tendency in marketing to promote anything related to sexuality with scantily clad women and sell it to men here adds an emotional dimension to the already effective distortions of self-image caused by advertising, influencers, and social media.
Again, from Freya India:
We are jumping into our desires! Igniting our fantasies! That’s what these companies will call it as they intensify every female insecurity, monetise male loneliness, and commodify any last shred of human connection we should be sharing together. They will mine it all for profit, promising a fantasy, a dream, and paradise along the way.
As these technologies continue to improve, these always-ready, always-horny, never-tired sex avatars will influence the intimate lives of men and women alike. Perfect sexting slaves, ready to fulfill your every wish without the agency or sense of dignity of real humans, they may warp our expectations, whether we want it or not, potentially much more than the simple audiovisual porn of yore ever did. I’m not looking forward to this.
Once again, the prophecies of Blade Runner 2049 seem to be coming true, and I would like to conclude this piece with a quote from its holographic companion-AI, aptly named Joi:
You look lonely.