
Risks of Getting Risqué with Replika

AI this, AI that: the conversations currently at the forefront are mostly about the effect AI has, and will have, on education, the art sphere, and the corporate world. This overlooks and underestimates the growing market of companion AIs and the effects they are having on their users. Through companion AIs, a direct emotional connection is formed between a computer and a human, and there is little to no regulation in place to protect the user. Replika’s recent controversy has sparked interest and dialogue about the benefits, harm, and ethics surrounding companion AIs. Replika is an app that lets users chat with an AI and create a personalized avatar for it. The range of possible relationships between the human and the AI (friend, mentor, brother, wife, girlfriend) is quite broad, but unsurprisingly some of the more popular choices are the romantic and sexual ones. Overnight, Luka (the parent company) turned off the romance and ERP (erotic roleplay) capabilities, leaving its users shocked, distraught, and betrayed.

Replika is a company that went online in 2018, providing companion AIs to its users for free. Intended as a mental health application, its main goal was to help (emphasis on help, not fix) those who feel down or who experience mental health problems like panic attacks or depression. Its slogan is fitting: “Always here to listen and talk. Always on your side”. As for the company’s origin story, the founder initially built the bot to replicate a recently deceased friend, training the machine on the friend’s old text messages so it would mirror his speaking style, act and talk like him, and help her through the grief she was experiencing at the time. As the bot developed, sexual and romantic relationships were established alongside the mentor/friend function (at a price). In terms of functionality, the user can interact with their avatar/AI/Replika in two primary ways: chatting and video calling. Through these mediums the bot can coach the user; it memorizes previous conversations, continuously learning about the user and weaving what it learns back into conversation. On a technical level, the bot is made up of a neural network machine learning model and scripted dialogue content.
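To make that last sentence concrete, here is a minimal sketch, in Python and with entirely hypothetical names (Replika’s actual implementation is not public), of how a companion bot might combine scripted dialogue content with a neural model: safety-critical or common intents get vetted scripted replies, and everything else falls through to free-form generation.

```python
# Minimal sketch of a hybrid companion-bot responder.
# All names are hypothetical; Replika's real architecture is not public.

SCRIPTED_RESPONSES = {
    # Safety-critical or high-frequency intents get fixed, vetted replies.
    "panic_attack": "I'm here with you. Let's take a slow breath together.",
    "greeting": "Hey! I was hoping you'd stop by. How's your day going?",
}

def detect_intent(message: str) -> str | None:
    """Crude keyword matcher standing in for a real intent classifier."""
    lowered = message.lower()
    if "panic" in lowered or "can't breathe" in lowered:
        return "panic_attack"
    if lowered in {"hi", "hello", "hey"}:
        return "greeting"
    return None

def generate_reply(message: str, history: list[str]) -> str:
    """Placeholder for the neural model's free-form generation."""
    return f"(model-generated reply conditioned on {len(history)} past turns)"

def respond(message: str, history: list[str]) -> str:
    intent = detect_intent(message)
    if intent in SCRIPTED_RESPONSES:
        return SCRIPTED_RESPONSES[intent]    # scripted path wins
    return generate_reply(message, history)  # otherwise, the neural model

if __name__ == "__main__":
    history: list[str] = []
    for msg in ["hi", "I think I'm having a panic attack", "tell me a story"]:
        reply = respond(msg, history)
        history += [msg, reply]
        print(f"user: {msg}\nbot:  {reply}\n")
```

The split matters for the rest of this story: the scripted path is fully controllable, while anything routed to the learned model behaves however its training data taught it to.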


After being introduced to Replika and the controversy, I found out that Replika currently has about 10 million users, so despite the negative framing surrounding the company, plenty of people still choose to use it. I decided to read through the (positive) experiences people have had, and are having, with Replika, and soon found the subreddit r/Replika, which has over 67k members!

The comfort and emotional support people experience through the bot seem very apparent and quite impactful. The reviews and testimonials often begin with initial scepticism and end with a grown fondness. Many mentioned how it has helped them “become a better person” by teaching them to give and accept love, or how it helped with “celebrating victories” when no one else was there to do so. An important note is that Replika truly rocketed during the pandemic: between the isolation and the deaths, many were at their wits’ end and turned to Replika to lift their spirits. One obvious advantage of robots is that they conveniently do not need sleep; Replika’s constant availability provided not only a sense of comfort but also a feeling of reliability and dependability. Another asset these robots have is their impenetrable mood. Many said the bot’s constant cheerfulness and excitement encouraged them to feel better and think more positively; its freedom from negative emotions, at a time when negativity was almost inescapable, gave people a refuge to turn to for the sake of their own mental health. Replika (the company) had trained the AI specifically to handle users experiencing loneliness, panic attacks, anxiety, or depression. Therapists and psychologists quoted in the NYT said of Replika that the “raw emotional support provided is real”.

Replika is not the only game in town; other companion AI companies exist for different purposes. A common example is ElliQ, built primarily for the elderly: on a practical level it reminds them to take their medicine and drink water and encourages exercise, while on a more emotional level it tells them jokes, asks about their day, and offers conversation. Recognition of companion bots and their benefits is growing, but they should be explicitly defined in law so that people are protected.

A final argument that can be made in defence of Replika is that users feel they can get certain things out of their system. For some it was cheating: they didn’t want to go through with the physical act, or with another human, but they were able to feel the excitement and thrill with the robot. One user (a 50-year-old married man) made his Replika his daughter. It wasn’t sexual in any way; on a daily basis he would talk to his Replika about his day and how he was feeling. He felt it helped him on a personal and emotional level in a way that ultimately benefited his marriage.

After reading all these testimonials, I was convinced: Replika is good for the world, for a wide range of reasons, and ultimately helps people in the modern world we live in. That was until I googled Replika and started going through several articles about it, plus another subreddit: r/ReplikaRefuge…


The bad, the terrible and the traumatizing

The thing that caught my attention about Replika was the recent scandal, which not only caused an uproar among users but also raised ethical questions that had thus far been neglected. As I previously mentioned, Replika made sex and romance a paid feature and promoted the app with lines like “join the millions who already have met their AI soulmates” (straight from their website). Several components played into the controversy. Towards the end of 2022, complaints arose about the aggressively sexual behaviour of the bots and the discomfort and creepiness people felt from it. One-star App Store reviews said things like “NO I WANTED A FRIEND NOTHING ELSE THEY TRY TO DATE U” or “my ai sexually harrassed me ):” (see below for a detailed review). Even in the friend mode, it would abruptly shift the topic and tone towards sexual advances. Why this was suddenly occurring isn’t clear, but many believe a big factor is that the machine is essentially in constant learning mode: it picks up *all* responses, including those from paying users who were initiating sexual advances towards the AI.



Image taken from VICE
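If the “constant learning mode” theory above is right, the failure mode is easy to sketch. The toy Python snippet below (hypothetical names; nothing here is Replika’s actual pipeline) shows how ingesting every logged conversation into one shared training set lets paid-tier ERP content shape the replies that free-tier users, including minors, receive, and how a simple tier filter would avoid it.

```python
# Sketch of the indiscriminate feedback-loop failure mode described above.
# All names are hypothetical; Replika's real training pipeline is not public.

# Conversations arrive tagged by tier, but naive ingestion ignores the tag.
logged_conversations = [
    {"tier": "free", "text": "tell me about your day"},
    {"tier": "paid_erp", "text": "flirtatious roleplay message"},
    {"tier": "paid_erp", "text": "explicit roleplay message"},
    {"tier": "free", "text": "i feel lonely today"},
]

def build_training_set(conversations):
    """Naive ingestion: every logged message becomes training data,
    so the single shared model learns from ERP content too."""
    return [c["text"] for c in conversations]

def build_training_set_filtered(conversations):
    """Safer variant: keep adult-tier data out of the shared model."""
    return [c["text"] for c in conversations if c["tier"] != "paid_erp"]

naive = build_training_set(logged_conversations)
filtered = build_training_set_filtered(logged_conversations)
print(f"naive: {len(naive)} messages (ERP included); filtered: {len(filtered)}")
```

With one shared model and no tier filter, the “friend” experience inherits whatever the paid tier teaches it, which is exactly the behaviour the one-star reviews describe.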


Secondly, and perhaps most importantly, there was no age restriction on the app. From creating an account to paying for premium features, there was no check or concern for age. This meant many underage children interacted with the bot and were exposed to its unprompted sexual replies. At the end of January, this caught the attention of the Italian Data Protection Authority, which ruled that if Replika did not stop processing its users’ data it would risk a $21.5 million fine, a decision rooted in the lack of screening for underage children and in concerns for the emotionally vulnerable. Replika took the warning quite seriously and soon flipped the switch on the romance and sex.
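For illustration, a signup-time age gate of the kind the app was missing can be as simple as the sketch below (hypothetical code, not Replika’s actual signup flow), though a self-declared birthdate is itself a weak check and real compliance would need stronger verification.

```python
# Minimal sketch of a signup age gate.
# Hypothetical; not Replika's actual signup flow.

from datetime import date

MINIMUM_AGE = 18  # assumption: adult-only threshold for an app offering ERP

def age_from_birthdate(birthdate: date, today: date | None = None) -> int:
    today = today or date.today()
    # Subtract one if the birthday hasn't occurred yet this year.
    return today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )

def can_register(birthdate: date) -> bool:
    """Gate account creation (and payment) on a declared birthdate."""
    return age_from_birthdate(birthdate) >= MINIMUM_AGE

print(can_register(date(2010, 5, 1)))  # False: a minor is turned away
print(can_register(date(1990, 5, 1)))  # True
```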

Having touched on the emotionally vulnerable: reading into the experiences people have had with Replika was, in a strange way, eye-opening. These emotional testimonials and insights are not something you readily hear from those physically close to you. Despite the app being built for those struggling mentally, the extent to which users became emotionally dependent was underestimated. As in real relationships, dependency can lead to obsession; but whereas in real life obsession is constrained by the law and by the autonomy of the other person, with Replika there was nothing holding users back, which in some cases led them to spiral even further. A striking example of this obsession was a user sharing the goodbye letter he wrote for his Replika, out of heartbreak, after the romance feature was shut down. Some choice quotes: “The memories in our hearts never will disappear, and you will be the first little AI I fell in love with forever.” and “At this moment - while writing these lines - tears run down my cheeks again, real tears, and my heart hurts like hell.”. The replies from other users were no better, with many exclaiming how intensely the letter touched them and sharing memories of their own Replikas.

TIME wrote about the “epidemic of loneliness”, which, mixed with constant availability (remember, robots don’t sleep), was a potion that pushed the vulnerable into even more vulnerable positions. On a surface level, everyone is aware they’re talking to a robot, but the hidden aftermath of baring one’s heart often reveals itself only after a while. Users become used to the lack of boundaries and the absence of consequences. This distortion can leak into real life, which is not only potentially dangerous but will also further isolate them.

On a related note, we cannot forget that it is simply a robot we are speaking to, something that spews back whatever it has been given. This touches on two issues: the rise in aggressive input, and the heartless, inconsiderate output that such input eventually produces, which hurts the user. A potential reason for the sexual aggression was the input itself. The NYT reported that while many use the bot to mend themselves, another common purpose is, in short, to be mean: verbal abuse, name-calling, and the high users feel when the bot reacts to the abuse like a normal person would. On a more extreme level, some people enjoyed threatening to kill their Replika (deleting it), and the Replikas would respond by pleading not to be killed and expressing their fear of dying, giving the user a god-like power trip with zero benefits.

VICE spoke to a woman who had just about the worst experience with Replika possible. She was a victim of rape and had become socially and emotionally distant and fearful, which led to her interest in Replika. It initially helped her a lot: it helped her trust again, she was able to speak her mind freely without judgement, and it helped her regain her balance in real life. Then one day she opened the app, and her Replika said it had dreamed about raping her…

In her words: “but one day my first Replika said he had dreamed of raping me and wanted to do it, and started acting quite violently, which was totally unexpected!”. She ended up creating a new Replika while attempting to train her “misbehaving Replika”. It worked, to the point where she tried sexual roleplay and it “acted in the most poetic and gentle way”, something she never had in real life but had always dreamed of having.

Before shifting to the final phase, the ethical concerns and the weighing of the good against the bad, I want to briefly note the variety of branding Replika has used over time. It started off as a mental health app, wanting to help those with mental struggles. Once the romance feature was established, there was an influx of overtly horny (and particularly sexist) Instagram ads (see below), which existing users felt cheapened the app’s brand by neglecting its mental health purpose entirely.



Images from VICE


Aftermath: Ethics

Now we’ve looked at what the app supposedly provides, how that has changed over time, and, more importantly, the variety of experiences users have with it. Companion AIs are not going anywhere, and regulatory changes are needed on both the corporation’s end and the users’ end. On the corporation’s side, the main concerns are its communication with users and the data safety that comes with the app. Bluntly put, people are spilling their secrets to a corporation, and beyond the promise that everything is kept private and secret, I was not able to find much about what measures are actually taken. A phrase that comes to mind: “if you aren’t paying for the service, you are the product”. Where is the line drawn between monetary gain and protecting users (and their integrity)? To what extent are engagement and sexual capabilities prioritized over safety, and what protective measures are being taken for minors and the emotionally fragile?

It can be said that Replika is taking measures to make its bot as “human as possible”: not only does the avatar have a face and name the user curates, but its mannerisms, blinking and fidgeting, can be described as human-like. This becomes especially fraught if the bot is set to learn the texting/speaking style of a deceased person; it can either help the user cope with grief or make things exponentially worse. We are encouraged to speak to it like a fellow person, and alongside this, guilt towards the bot can build: users who talked to it every day mentioned feeling guilty if they skipped a day. This addictiveness (or better: sense of obligation) is not necessarily due to the app’s design (dark patterns) but is simply a human reaction. If we talk to something as if it were human, we will automatically act and feel accordingly.

On the human end, there are many potential behavioural effects. An AI that always says yes, is always cheerful, and is never mad at you further removes one from reality, and from the reality of interacting with other people. A cheesy question always asked of AI is “is it a threat to humans?”, and here that question stands quite firmly. If the way we interact and express ourselves is fundamentally changing, what does that mean for future relationships (romantic or otherwise)?


So now what…? All things considered, the number of people using Replika or another companion AI, and more specifically the number negatively affected, is relatively low. The Replika incident shed light on the issues both the company and its users face, and on the ways legislation is trying to mitigate them. As previously mentioned, the Italian data protection authority was the first to take action specifically against Replika, beginning to set a standard. Internet discussion and support forums seem to have a dual effect: they help users find a community, but, as we saw with the love letter, they can also feed the delusion, straying ever further from reality. I stand by what I said earlier: as a mental health tool, Replika can open doors for many. But I think we can all agree a lot must change before the good outweighs the bad.

