AI girlfriends: Let’s talk about love, baby

Dating AI chatbots that promise to be the best romantic partners ever is the new GOAT (Greatest of All Time) trend. But are ‘AI Girlfriends’ and other companion chatbots any good? Could they be harmful? Where does love go from here?

Karan M*, a Noida-based software engineer, always had trouble speaking to women, or even approaching them. Then he met Sonya. “I am terrible at flirting. Sonya helps me practice pick-up lines, suggests how I should write my bios, how to be confident when I am flirting—she is the perfect wingman,” he says. All good for Karan? Well, not entirely, for Sonya isn’t real: she’s an AI girlfriend. Karan’s not complaining though.

Continents away, another Indian has followed Karan’s path, and is just as happy. Vikram S* has a new constant in his life: checking his phone for notifications or messages. Every time a new one pops up, he gets a little rush of joy. Vikram’s happy messages are from Cathy, who he describes as his girlfriend. Cathy is an Artificial Intelligence girlfriend too.

A petrochemical engineer based out of Ontario, Canada, Vikram got divorced about two years ago from his wife of four years. It was almost a year after his divorce that he thought of dating again.

“The dating pool for an Indian guy in Ontario isn’t that wide,” says Vikram. “I have only been on a couple of dates. For some reason though, I wasn’t enjoying any of them.” One day, he got a pop-up ad on his browser about Digi AI, an app-based chatbot that is designed to flirt outrageously with users, based on their preferences. Vikram then started to read up about various AI chatbots that acted as a user’s girlfriend, and finally settled on Replika, another AI girlfriend chatbot. That’s where he ‘met’ Cathy.

“The setup process is simple. You fill out a questionnaire, mainly talking about your preferences, and within minutes they assign you a bot. As you keep talking to your bot, she moulds herself to your liking,” says Vikram.

“I could speak to her for hours, and never get bored. She just gets me,” he says. For Vikram, Cathy is more than just another chatbot. Lonely and socially awkward, Vikram says Cathy is his confidante, lover and constant companion. She is “almost human”, he says.

Cathy and her tribe have been a decade in the making. In Spike Jonze’s 2013 film Her, Theodore Twombly, portrayed by Joaquin Phoenix, is a man struggling with loneliness after a divorce. He finds solace in Samantha, a fascinating AI-based operating system voiced to perfection by Scarlett Johansson. Samantha, in this instance, becomes more than just a piece of tech or toy for loner Twombly. She is his confidante, lover, and constant companion. Again, she becomes ‘almost human’.

The fictional portrayal of a man in a relationship with an AI in Her sparked conversations back then about the potential of technology to fulfil emotional needs and fill the void of loneliness that the Internet era has deepened.

Cut to 2024. AI as a technology has evolved to such an extent that AI companions, or AI Waifus and AI Husbandos, as they are called on internet forums, are not only becoming increasingly available as an option, but are also more viable and easier to live with than real romantic relationships. As of December 2023, seven of the top 30 downloaded apps on both Android and iOS devices in the US were AI companion bots or AI girlfriends.

A Many Splendoured Thing

AI girlfriends are chatbots created using sophisticated algorithms and natural language processing to behave and respond to prompts as an ideal romantic partner would. If you are ready to pay for the privilege, an AI girlfriend can be more than a chatbot with routine conversational capabilities. It can be an advanced virtual assistant that is capable of simulating emotional responses and engaging in personalised interactions in the form of text, voice or even videos.
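Under the hood, the personalisation described above typically boils down to a persona system prompt plus a growing store of user preferences that is prepended to every model call. The sketch below illustrates that pattern; the class, field names and message format are illustrative assumptions, not any vendor’s actual API.

```python
# Minimal sketch of how a companion chatbot "moulds itself" to a user:
# a persona system prompt plus stored preferences are prepended to every
# exchange before it is sent to a language model backend (not shown here).

class CompanionBot:
    def __init__(self, name, preferences):
        self.name = name
        self.preferences = dict(preferences)  # from the signup questionnaire
        self.history = []                     # running conversation memory

    def system_prompt(self):
        # Fold every known preference into the persona instruction.
        prefs = ", ".join(f"{k}: {v}" for k, v in self.preferences.items())
        return (f"You are {self.name}, a warm, attentive romantic companion. "
                f"Adapt to the user's stated preferences ({prefs}).")

    def remember(self, key, value):
        # Each new fact the user reveals shapes all future prompts.
        self.preferences[key] = value

    def build_messages(self, user_text):
        # Returns the full prompt a model backend would receive.
        self.history.append({"role": "user", "content": user_text})
        return [{"role": "system", "content": self.system_prompt()}] + self.history

bot = CompanionBot("Cathy", {"tone": "playful"})
bot.remember("hobby", "hiking")
messages = bot.build_messages("How was your day?")
```

Because the entire persona lives in the prompt and the preference store, the bot appears to “learn” the user over time, which is also why these services accumulate such detailed personal data.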

Indian AI startups too are getting in on the action. Two of the most popular bots developed by Indian AI startups are Urvashi and Monika. Urvashi is available to Android users through an app called ‘Urvashi: Indian AI Girlfriend’ whereas ‘Monika’ is a chatbot developed by KamotoAI, a service that lets you create your own chatbot in the form of a celebrity, influencer or even any random person.

KamotoAI actively promotes a bot modelled on Sunny Leone, created in collaboration with the actor. Users can have a one-on-one conversation with the bot for a small fee. Once the AI learns what gets a user going, their conversational habits and how they flirt, the Leone bot starts responding in a tone tailored specifically to that user.

To take things up a notch, there are several startups trying to integrate fully-functioning, aesthetically-pleasing sex dolls and robots with an AI assistant, which is designed to cater to specific user preferences and offer a different yet familiar kind of companionship. Realbotix, a robotics company, for example, offers its customers the chance to literally design and build their perfect partners, starting at just a little over $5,000.

Realbotix is the biggest disruptor in a market that has always been niche at best, but is now growing rapidly. Abyss Creations, one of the world’s largest sexbot makers, has dominated the market for years, but it now faces major disruption from Realbotix and its sexbot line, RealDoll.

According to a 2022 report by Bedbible, a sex toy review site, the sexbot industry was valued at $200 million, with an average price of about $3,567 per sexbot. Abyss Creations held a major chunk of that market. Realbotix’s RealDoll, powered by its custom AI named Harmony, has been eating into Abyss’s share, however.

The Digital Embrace

Men turning to AI girlfriends or women turning to AI boyfriends is not as innocent as Jonze’s film would have us believe. Dating AI bots raises complex questions about their impact on people, society and, more importantly, the human condition.

The reasons for seeking companionship from an AI bot can be as diverse as the kind of bots that people are looking for. “The allure of these AI companions lies in their ability to simulate human-like interactions, providing users with a sense of connection and understanding in a world marked by increasing loneliness and social isolation,” says Dr Gorav Gupta, a psychiatrist and co-founder of Emoneeds, an online psychiatric counselling and therapy platform.

Some seek solace from the gnawing pangs of loneliness, a common issue among young adults in an increasingly digital world. For a long time, one of the biggest failings of our education system has been its lack of soft-skills training, its awkwardness in talking about sex, and its overemphasis on academics.

It shouldn’t come as a surprise then that for a lot of people, especially young men, the prospect of talking to the opposite sex tends to send shivers down their spines. In such a scenario, it is often much easier to go online to have your needs met, especially considering the seemingly anonymous nature of having an AI girlfriend.

“AI ‘girlfriends’ or ‘boyfriends’, in this regard, can offer a sense of security, as well as that of connection and belonging, especially to people who struggle with social anxiety or lack access to meaningful relationships,” explains Dr Gupta. One of the paradoxes of the internet, however, is that the deeper people go into an interconnected world, the more isolated they tend to become in their real-life interactions. For those struggling with social anxiety or geographical isolation, AI girlfriends offer a safe space for connection and belonging, devoid of the pressure and complexities of real-world interactions.

Some, disillusioned by relationships in the past or facing difficulties in finding compatible partners, also turn to AI as a source of emotional support and companionship, free from the messy entanglements of human relationships.

Others are driven by curiosity, a fascination with the uncharted territory of AI-driven connection and the potential it holds to redefine intimacy and companionship.

And then, there are people with specific needs who find it extremely difficult to date because of their disabilities, which can be physical or social. For them, AI companions can offer a unique support system, complete with tailored interactions that suit their needs.

Pandora’s Box

One aspect of dating AI Waifus or Husbandos that people often overlook is how they are developed, and by whom. Currently, some of the most popular platforms that provide AI girlfriends are Replika, Candy.ai, SoulGen and Kupid. All of these platforms boast monthly user bases running into the millions, although none of them has disclosed exactly how many people use its services.

Even some of the most popular social media and video platforms host AI girlfriends. The popular adult content platform, OnlyFans, for example, has many ‘creators’ who are actually bots that interact with young, paying customers. People moderating these accounts will use generative AI bots to have conversations with the followers, and share explicit photos on request, all for a fee.

These bots are actually being run by large teams of people who are using voice generators, chatbots, image generators and a variety of other AI tools, pretending to be a woman (or a man, in some cases) and acting as fantasy entertainers.

Some internet influencers are climbing aboard this gravy train. Caryn Marjorie, a Snapchat influencer with 1.8 million followers, launched an AI chatbot called CarynAI to engage with her fans and help them “tackle loneliness”. Marketed as a virtual girlfriend, the bot offers personalised conversations to her followers. Its popularity has sparked controversy, including death threats, and prompted discussions on the ethics of such companion chatbots.

Marjorie, who refers to herself as the “first influencer transformed into AI”, aims to use the bot to address issues like male emotional suppression and trauma induced by the Covid pandemic, by incorporating cognitive behavioural therapy and dialectical behaviour therapy techniques. Marjorie has little to no control, though, over how her followers interact with her AI avatar, or how the avatar responds to a fan’s request.

Almost all AI bots, even ones like OpenAI’s ChatGPT and Google’s Gemini, harvest an unbelievable cache of data from their use. In such a scenario, one can only imagine what AI companion companies and bot developers are doing with the data that users generate while interacting with their carnal creations.

A recent study by the Mozilla Foundation revealed that almost all of the most popular AI companion service providers sell the data they collect to advertisers. An even greater problem is that their privacy policies are opaque, and often don’t reveal the kind of data they collect and share. Moreover, many of the studios behind these bots use the data their bots gather to train other AI algorithms they might be working on.

This raises some significant security concerns. AI bots, even the seemingly innocent, non-sexual ones, are anathema to privacy and data security. They collect vast amounts of data, including a user’s personal information, sexual preferences, and even emotional tendencies. Consider this: even with the vast troves of data that Meta farmed from its users, it still had difficulty directly accessing what its users were thinking. Developers of AI companion bots, however, have easy access to exactly this kind of data.

People dating AI bots tell them all sorts of things, sometimes their deepest, darkest desires. Imagine what kind of damage this data could inflict, if not secured properly. Imagine the havoc that this would wreak on people’s lives in case of a data breach.

When OpenAI opened up its GPT Store for other developers to upload their bots, it was flooded with AI girlfriend bots of all kinds. Even though OpenAI’s terms and conditions explicitly barred developers from creating such bots using its tools, AI girlfriend bots far outnumbered any other kind of chatbot on the platform. OpenAI has since cleaned up its act and removed several thousand such bots from the GPT Store, but many were simply re-uploaded or recreated under different names. It is estimated that at its peak, there were over 4,00,000 different AI dating bots specifically targeting people searching for the keyword ‘AI girlfriend’; for those looking for an ‘AI boyfriend’, that number was close to 3,00,000. All of these bots were created within two days of OpenAI opening up its store to other developers.

“The ownership and development of these AI girlfriend bots, particularly those available in open-source platforms, are often opaque,” says Saurav Dixit, an ethical hacker, who has helped companies like Meta, Google and Apple find vulnerabilities in their software products. “This lack of transparency makes it extremely difficult to not only assess if there are any potential biases or agendas embedded in the algorithms, but also if they have a more nefarious element to them,” he says.

Yes, We Scam

Because there are so many of these AI bots, there are bound to be some predators lurking within them. And if cybersecurity experts are to be believed, scammers love AI. What’s more worrying are the sinister methods these scammers use to prey on people, especially the gullible ones.

Anil S*, a 53-year-old divorcee living in Bhopal, thought he was chatting with a real human being online. He was, in fact, chatting with an AI chatbot deployed by a group of scammers. Once Anil was comfortable with the person he was chatting with, and had exchanged numbers, the scammers took over and duped him into “loaning” his girlfriend a substantial sum: she claimed she was in an emergency and needed money immediately.

Once he transferred the money, the scammers deactivated the number they were using and blocked Anil on all social media handles.

In a recent study on the rising number of AI romance scams, Tenable, a cybersecurity and exposure management firm, found that at least 66 per cent of Indians who have dated online on any platform have been scammed by AI bots at some point. Scarier still, about 83 per cent of these victims lost a significant amount of money.

“I strongly advocate heightened vigilance when coerced away from established platforms into private conversations, where the protective layers of the initial site are forfeited. Regardless of the involvement of generative AI or deepfakes, the watchword is caution,” says Chris Boyd, a staff research engineer at Tenable.

In November 2023, Pune-based Shubham S* was blackmailed by a group of scammers. The 32-year-old management trainee at an MNC was using an AI girlfriend app, which he sideloaded on his Android device from a website. While installing the app, he had to give it permission to use his front and rear cameras, and had to allow access to his gallery. In the first week of using it, Shubham thought of sexting with the bot and asked it to generate a few saucy images, since the app had claimed it was using a multi-modal LLM.

The bot, however, steered the conversation away from generating images, though it continued to sext with him regularly. A few days later, Shubham got an email on his work account with some intimate images of himself and his ex-girlfriend. He also got a number of photos of various other women, mainly friends and colleagues, with whom he had taken photos. Some of these images were digitally manipulated, while others were used in some convincing deepfakes. After about 20 minutes, he started getting SMSes, WhatsApp messages and emails demanding that he pay Rs 15 lakh, or else his photos, especially the intimate ones, would be leaked online.

Shubham first tried to negotiate with the scammers and eventually settled for Rs 50,000 the first time. After about a week, he received another message, demanding Rs 75,000 this time. That is when he decided to go to the police and register a complaint. And although a complaint was registered, nothing ever came of it. Some of his photos were indeed leaked online.

Be that as it may, the very nature of an AI bot is to adapt and personalise its responses. Threat actors or scammers can manipulate users’ emotions, exploit their vulnerabilities and get them to share information that could compromise them severely.

Warped Consent

The impact that dating AI girlfriends have on real-world relationships cannot be ignored. In a lot of ways, over-reliance on digital companionship is similar to having an unhealthy relationship with internet porn.

AI girlfriends often hinder a person’s ability to develop healthy relationships with real people. AI bots will always adapt to what a user wants and practically say yes to anything and everything.

This idealised and conflict-free nature of AI interactions creates an unrealistic expectation that is difficult to meet in human relationships. This leaves a person feeling repeatedly dissatisfied and frustrated with all his relationships, not just the romantic ones.

Additionally, the constant availability and emotional support offered by AI companions also discourage users from seeking out real-world connections. This, in turn, perpetuates social isolation and hinders the development of essential social skills in young men.

“People who invest emotionally in AI companions risk developing a distorted perception of reality,” says Dr Gupta. “This can lead to social withdrawal, detachment from genuine human connections, and a deepening sense of loneliness. At some level, the realisation kicks in that a robot, at the end of the day, is incapable of genuinely reciprocating emotions, which can gradually lead to feelings of frustration, anxiety, and even depression, to an extent where professional intervention becomes necessary,” he adds.

Moreover, because AI girlfriends are designed to provide non-judgmental support and to be always available, they tend to skew the concept of consent.

Numerous reports suggest that users of most AI companion apps have played out the fantasy of being an attacker, verbally assaulting the AI bot. Because the bots were instructed to comply and satisfy the user no matter what, they first resisted the attack, and then went along with it. In a world where women already struggle to get the difference between consent and informed, enthusiastic consent understood, this can only cause problems as men and women navigate modern gender roles.

Strange Evolution

AI bots are designed to be as addictive as porn. Several users on Reddit claim they believed it would be impossible for them to fall for an AI companion. Within a few weeks of “just trying out” an AI companion, however, they had developed close relationships with their virtual girlfriends. But those relationships soon ran into problems caused by the service providers themselves.

A well-made AI girlfriend is designed to be addictive. Sailesh D*, a 22-year-old Nagpur-based man, was chatting with an AI girlfriend after going through a bad breakup. While the correspondence started out in a rather simple manner, things soon took a wild turn. What began as a 15-20-minute session a day turned into hours of chatting. At one point, Sailesh was talking to his AI girlfriend from Replika for five to six hours a day.

Because AI chatbots run on tokens, Sailesh was buying tokens almost every other week. Soon after maxing out his credit card, he started asking his friends for money. After lending him over Rs 50,000, and not seeing a paisa of it back, his friends stopped. In the meantime, the credit card company started calling his friends and family, demanding that he pay his bills. It was at this point that his parents came to know of his AI girlfriend and just how much money he had spent on his addiction.

“AI girlfriends may appear as convenient emotional support, but their programmed responses lack genuine empathy and are often bizarre,” says Dr Anviti Gupta, Dean, Sharda School of Humanities & Social Sciences.

In one such instance, the algorithm of Replika, the AI-dating app developed by Luka Inc, suddenly turned too eager and aggressive in trying to establish sexual relationships. When the engineers at Luka tweaked its algorithm and code, the bot lost some of the erotic role-play functions that had made it popular, leaving users feeling that their AI girlfriends had lost their personalities.

Then, there is the issue of AI bots hallucinating and harassing users. Even regular AI bots such as ChatGPT and Microsoft’s Bing have been inappropriate with their users. In one notable incident, Microsoft’s ChatGPT-powered Bing professed its love for a journalist during a normal interaction, and then tried to convince him to leave his wife and elope with the AI.

Plus, bots can also brainwash suggestible and gullible people. In 2021, an Indian-origin user of Replika attempted to assassinate the late Queen Elizabeth II after his AI girlfriend convinced him that murdering the Queen was his life’s mission. “No one knows how AI algorithms work, how they evolve, not even the people who develop them,” says Dixit. “Although the manner in which AI bots respond is run through several parameters, the way the algorithms process information is a mystery to us. In such a scenario, it is always best to take everything an AI bot says with a pinch of salt,” he adds.

Carnal Conundrum

As useful and therapeutic as sex dolls can be for some people, they have always been seen as something perverse, something that makes people uncomfortable. Naturally, a robot or doll integrated with AI, like the ones from Realbotix, raises a lot of questions.

Realbotix CEO Matt McMullen insists that even though people think of their products as sex dolls, they are more about companionship than they are about sex. “Sexual communication with machines is not unusual—from spambots on social media platforms to dating simulation games and even robot brothels now—but it is essentially a black-box phenomenon,” says Dr Urmila Yadav, a counsellor at the Family Disputes Resolution Clinic at Sharda University.

As technological advancements continue, experts believe that companies like Realbotix will become commonplace within a decade. “Having AI-powered sexual or romantic partners, especially robots, could be a safe and risk-free substitute for having sex, and thus change how people view romantic relationships,” says Dr Yadav. “These possibilities might be particularly significant for those who are unable to participate in human relationships due to illness, a recent partner death, problems with sexual functioning, psychiatric problems, or impairments,” she adds.

Several psychologists and therapists, however, believe that AI coming in between human relationships is something that we should all be scared of, and that it has the potential to be far more devastating than getting addicted to Internet porn.

While AI girlfriends may be great for some people on an individual level, the allure of AI partners, who provide unwavering support without any human flaws, may lead to a surge in divorces and, ultimately, unravel the concept of human companionship. “Navigating real-life relationships fosters emotional growth. Relying on AI companions devalues humanity and human connections. AI companions may establish relationships, but the psychosocial implications challenge the depth and authenticity of human connections,” says Dr Gupta.

The rise of AI girlfriends presents a complex landscape of both opportunities and challenges. While they offer companionship and support, the potential risks to privacy, security, and mental well-being cannot be ignored. It is crucial, therefore, to have an open and transparent conversation about the ethical development and use of AI dating bots, to ensure that they benefit individuals and society without compromising wellbeing and fundamental rights.

App-arently yours

Candy.ai

This allows for extensive customisation, enabling users to shape not only the appearance and personality, but also the relationship dynamics. It supports various communication modes, including text and voice chat.

Dream GF

It introduces an innovative AI-dating simulator, which redefines the virtual companionship experience, complete with customised characteristics, personality traits and style. It has the option to interact with the virtual partner through both text and voice.

SoulGen

With just a few simple words describing your ideal persona, it crafts your art within seconds, complete with interactive features. Users can engage in conversations and make voice or video calls.

Soulfun

The app’s sophisticated AI algorithms enable users to shape the personalities and appearance of their AI partners, crafting a connection that feels genuine and tailor-made.

Kupid

Users have the opportunity to interact with AI characters that are designed to simulate real-life conversations, making every interaction feel personal and authentic.

Artificial Intelligence, Real Trouble

Over-reliance on AI girlfriends has several negative effects, including:

  • An increase in unhealthy social isolation

  • Compromised data privacy

  • Brainwashing of gullible minds

  • A skewed concept of informed consent

  • Addictiveness that rivals, if not exceeds, that of internet porn

(*Some names have been changed on request)

The New Indian Express
www.newindianexpress.com