So, "Trolls" spread AI-porn of Taylor Swift on TwiX, with one image gaining "more than 45 million views, 24,000 reposts, and hundreds of thousands of likes and bookmarks before the verified user who shared the images had their account suspended". 404 Media "traced back the viral and abusive images of Taylor Swift to a specific Telegram community". The images are not deepfake porn, where faces are superimposed on existing porn images or videos, but pure image synthesis done in Microsofts AI-image generator called Designer.
Taylor Swift is considering legal action, and the "Swifties", her fanbase, are waging an online war against those who keep spreading the images.
I like Taylor Swift: I respect her for being a voice for artists and fighting for fair compensation, and I even like some of her songs. And as dehumanizing, demeaning and harmful as this purposefully psychopathic spread of those images is, I also think that a grown woman, the biggest pop star on earth, who has had to deal with fake porn of her likeness for as long as she has had a career, is the last person to worry about when it comes to nonconsensual deepfake and AI-porn.
I'm much, much more worried about two groups: activists and teenage girls. AI-generated porn is leveled against both of them at various scales with the goal of applying psychological violence; in the case of the former, to silence them, as when Rana Ayyub was targeted with a disinformation campaign including deepfake porn for speaking out against sexual violence in India. Mind you, this was in 2018. Other, more prominent activist and political targets of deepfake porn include Kamala Harris, Alexandria Ocasio-Cortez, Angela Merkel, Greta Thunberg, Emma Watson and many more. It doesn't take much imagination to picture what a viral harassment campaign like Gamergate would look like in 2024 if you're Anita Sarkeesian.
But even more worrying is the situation for teenage girls. They already bear the brunt of the psychological consequences of social media and are the demographic most vulnerable to the harmful consequences of rampant gossip and extensive social comparison, both of which are sociopsychological cornerstones of social media behavior. I have been closely following the psychological implications of the digital revolution since the emergence of clickbait in 2012, and I am not surprised by the teenage mental health crisis at all.
Now, on top of that, we get open-source mimetic AI algorithms that can be fine-tuned on any photo, any face, any voice, and in consequence we get Telegram bots that can undress anyone, and worse. What do you think happens when you make this technology available to everyone, including horny 14-year-old boys? Well, you get "cool kids" sharing fake porn of classmates in private chatrooms, that's what you get. And indeed, a poll by researchers quoted in a 2020 article about undress bots found that the majority of users, 63%, were mostly interested in "undressing" women they know in real life. Only 16% were interested in digitally "undressing" celebrities.
Omnipresence of the Swarm-Gaze
I'm a heterosexual man and can only empathize with the female experience to a certain extent. I know at least some of the discourse on the male gaze, and while I have some irrelevant opinions on that, I absolutely get that it must be exhausting to walk around conscious that men are imagining you naked, again and again, every two seconds or so. I get that. But I also think that my horny thoughts and fantasies are mine, and they should be free, and so are those of 14-year-old boys.
However, this sorta-kinda equilibrium between male horniness and female annoyance changes when you ramp up the volume with frictionless open-source wish-fulfilling machines, democratized publishing tools and pathologies of swarm psychology.
Writing in the Guardian about her own experience of being targeted with deepfake porn, Helen Mort quotes John Berger’s Ways of Seeing:
A woman must continually watch herself. She is almost continually accompanied by her own image … From earliest childhood she has been taught and persuaded to survey herself continually. And so she comes to consider the surveyor and the surveyed within her as the two constituent yet always distinct elements of her identity.
With AI-porn being generated at scale, this surveyor constituting an element of a woman's identity multiplies into a whole anonymous male group-gaze, able to undress her at any time. Suddenly, women don't just have to deal with the experience of a single omnipresent surveyor, but with the constant high probability of becoming the target of the psychopathological sexual groupthink of a whole digital swarm, with all kinds of mimetic AI tools available to visualize anything that swarm can come up with, including violent sexual imagery, spread for a mean laugh in encrypted Telegram chats.
This new and constant psychological pressure in the back of the head of every young woman must be horrible for grown-ups, but I think it is sheer devastation for the developing minds of teenage girls: a new paranoid constant in any young woman's experience of the digital, one that puts them under so much psychological pressure that it sometimes ends in "tragic" outcomes bordering on collective murder by bullying, psychoterror and AI-porn.
No skills necessary
In October, "New York Gov. Kathy Hochul (...) signed into law legislation banning the dissemination of pornographic images made with artificial intelligence without the consent of the subject." In the same month, the UK outlawed distribution of deepfake porn under it's new Online Safety Act.
Deepfakes are subject to the EU AI Act too, and here's a very long commentary on its regulation. The AI Act does not classify deepfakes as "high risk", but maybe it should. The commentary correctly states that the "problem with deep porn or discrediting materials is the non-consensual use of someone else’s image and the psychological and reputational harm it creates." The reputational harm only applies to the distribution and publication of deepfake porn, but the psychological harm arises from the mere possibility of its creation alone.
The problem here, as so often in digital-ethics conundrums, is once again scale and ease. When I see an attractive woman in the street, I should be free by law to draw pornographic images of her for private use. That would be creepy, of course, but I should be free to be a creep in private in a liberal society. Only a smallish subset of the populace has the skills necessary to create such images, and the workload required to produce an image that qualifies as nonconsensual porn is just too high for it to have a society-wide effect. This changed to some extent with the mainstreaming of digital photo editing, and it is exploding right now.
Any 14-year-old can grab Instagram images of a classmate, send them to an undress app or a Telegram deepfake bot, or fine-tune a Stable Diffusion model and generate dozens to thousands of AI-porn images in a day. No skills necessary. Deepfakes still require some skill, but that will change too as those algorithms improve. These things are already happening worldwide, with scandals in Spain, New Jersey and the UK, and they are only the tip of the iceberg, as anyone who has ever been 14 years old can tell you.
This means that a whole generation of young women will grow up not only with an omnipresent male gaze, but with an omnipresent digital porn swarm that can target them at any time and bend their image into any position it likes, and that swarm will do so without a second thought if women dare to have an opinion, speak up or become publicly visible in any way.
This is not tolerable.
Shake it up
As Daniel Dennett correctly said:
it is possible for anybody to make counterfeit people who can pass for real in many of the new digital environments we have created. These counterfeit people are the most dangerous artifacts in human history, capable of destroying not just economies but human freedom itself.
Deepfake AI-porn of Taylor Swift is one outcome of technology that can plausibly create fake people and believable images and recordings of them. And it is destroying the freedom of women to live without the constant danger of being targeted with porn-psychoterror by a digital swarm of "trolls", which is just internet newspeak for psychopaths.
Maybe the right of a 14-year-old to produce horny images of his classmates ends with the scale and ease of AI tools and digital publishing, and maybe we should outright ban the open-source distribution of mimetic AI algorithms that can reproduce real people in image and voice. Because I'm not willing to trade the freedom of women to walk around at least somewhat free from the more pathological versions of a whole digital swarm's male gaze for the freedom of horny, mean assholes generating fake porn en masse, for the lulz at best and for bullying, blackmail and political silencing at worst.
So, I hope this episode is a wake-up call for regulators to get up to speed on open-source mimetic AI tools and, you know, shake it up.
Updates 27-01-2024:
Satya Nadella says the explicit Taylor Swift AI fakes are ‘alarming and terrible’: “Microsoft CEO Satya Nadella (…) calls the proliferation of nonconsensual simulated nudes ‘alarming and terrible,’ telling interviewer Lester Holt that ‘I think it behooves us to move fast on this.’”
White House calls for legislation to stop Taylor Swift AI fakes: “Legislation needs to be passed to protect people from fake sexual images generated by AI, the White House said this afternoon (…) Jean-Pierre called the incident ‘alarming’ and said it’s among the AI issues the Biden administration has been prioritizing.”
Update 31-01-2024:
Lawmakers propose anti-nonconsensual AI porn bill after Taylor Swift controversy: “The Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act would add a civil right of action for intimate ‘digital forgeries’ depicting an identifiable person without their consent, letting victims collect financial damages from anyone who ‘knowingly produced or possessed’ the image with the intent to spread it.”