Artificial intimacy: ‘Is it bad to fall in love with an AI? Is there something wrong with me?’

AI-powered dating apps promise to facilitate romance but the evidence suggests technology perpetuates loneliness

With an endless array of new technology on the market promising connection and romance, why are reports of loneliness, depression and anxiety higher than ever before? Illustration: Mia Kievy/Getty Images

In the Spike Jonze film Her, Joaquin Phoenix starred as Theodore Twombly, a man who spent his days dictating love letters to strangers, the work of intimacy and romance outsourced to him by people who were too busy, inarticulate or unwilling to express themselves without assistance. Theodore buys an AI personal assistant named Samantha, first turning to her for small, mundane tasks like scheduling meetings, before eventually striking up a romantic relationship with her, and even having disembodied sex.

Jonze’s film was released in 2013 and was set in that vague time of “the not-too-distant future”. That future has arrived and is already receding in the rear-view mirror. In 2024, Theodore Twombly would be unemployed, as ChatGPT would have made his job redundant. The novelty of entering one’s home and asking Siri or Alexa to complete a task has long faded. And now, the idea of striking up a relationship with an artificial intelligence bot isn’t future sci-fi; it’s a competitive industry with multiple apps vying for space on your phone and access to your innermost thoughts, dreams and desires (and your data, but that never sounds as romantic).

But with an endless array of new technology on the market promising connection and romance, why are reports of loneliness, depression and anxiety higher than ever before?

Joaquin Phoenix’s character falls in love with Scarlett Johansson’s operating system in Her

Dr Jourdan Travers is a licensed clinical psychotherapist and the clinical director of Awake Therapy, and is particularly interested in the impact of technology on mental health and relationships. She has been wary of the development of convenience-focused technology such as shopping and delivery apps that replace everyday human encounters, explaining that these micro-connections are vital to our sense of wellbeing and belonging.


“We need these small, little social interactions that we might see as meaningless, like seeing the same person as you’re both walking the neighbourhood, or the same cashier at the grocery store or coffee counter. Those interactions, they actually do a lot for us emotionally and psychologically, and when we start removing ourselves from those experiences, things can move from bad to worse quite quickly.”

Travers is even more concerned about apps that use AI to supposedly foster or even mimic human interactions. She explains that while technology may simulate connection, it’s a hollow and ultimately self-defeating promise.

“The new chatbots and virtual assistants we’re seeing, they’re already able to simulate a kind of meaningful human interaction. This phenomenon is known as anthropomorphism, which is humans’ tendency to attribute humanlike traits to nonhuman entities like AI,” says Travers. “And the problem with that is that AI will never erase our need for human connection. It’s coming closer to replicating it, but humans are – whether individuals are introverted or not – a social species.

“AI interfaces often incorporate design elements that mimic human social cues and gestures, like facial expressions or tone of voice, but it’s not the same. The US surgeon general, Vivek H Murthy, did a report on this last year which shows that dependence on technology actually perpetuates loneliness. We often turn to these apps because we’re feeling lonely and we want connections. But the reverse happens; we become even more lonely or more reclusive when we go down the rabbit hole of trying to replace human connection with technology.”

One thing that many therapists know and will say is that a lot of relationship issues are really childhood issues in disguise

Artificial intelligence has long been creeping into popular dating apps, helping users select the best photos for their profiles, determining the most compatible matches and spotting behavioural patterns that might indicate harassment or scams – but as the technology develops, so does AI’s influence on users’ online dating experiences. Match Group, the company that owns Tinder, Hinge and OKCupid, has announced plans to integrate more AI features across its apps, and a slew of new AI-focused dating apps are hitting the market.

Rizz, the dating assistant app named after the Gen Z slang word for “charisma”, aims to lend exactly that quality to its users’ messages to potential dates. The app, which markets itself as a digital wingman, offers one-size-fits-all chat-up lines and opening messages to try to spark a match’s interest, and can generate tailor-made messages if the user uploads their match’s profile or the text transcript of previous conversations.


Rizz’s founders emphasise the time-saving nature of the app and claim it is an “optimising” tool – but what exactly is it optimising? By bypassing the personal investment that comes with engaging in an authentic conversation, Rizz prioritises saving time and effort over creating connection, which seems like a counterintuitive focus for dating.

Travers highlights that as apps allow people to bypass forging real connections by taking AI-assisted shortcuts, the lack of authenticity can leave users feeling more lonely and disconnected.

“As humans, we are designed for connection,” says Travers. “We want to have meaningful connections and meaningful opportunities. When those needs aren’t getting met, we’ll find another way of going about it, and oftentimes it’s a dangerous or maladaptive way of coping with the unmet need.”

While many jaded dating app users bemoan the time vacuum that comes with messaging multiple matches and having awkward conversations, these issues seem more effectively tackled by addressing the way we use apps: speaking to fewer people at once and encouraging more thoughtful interactions. Users could also abandon apps that promote empty engagement and choice paralysis through endless swiping, and turn to apps that promote more connection – though finding those apps may be more difficult than it seems.

'AI bots and dating coach apps seem like an easy fix, right? "I’ll spend this money and the problem is solved."' Illustration: Mia Kievy/Getty Images

On Valentine’s Day this year, a class-action lawsuit was filed against Match Group, accusing it of having a “predatory” business model and of deliberately “employing psychologically manipulative features to ensure [users] remain on the app perpetually as paying subscribers”. The lawsuit argues that Match’s apps violate laws on consumer protection, false advertising and defective design, saying that despite its claims – Hinge’s slogan is “Designed to be deleted” – the company is trying to keep users on its apps by gamifying their features. The case is pending.

Other apps claim to act as a mix of therapist and dating coach. London-based start-up Elate recently launched its AI-powered dating assistant Dara, which Elate claims is “designed to help singles and couples navigate every stage of their dating and relationship journey”. One of Dara’s main features is a chatbot that can offer users advice on specific dating questions and quandaries, hoping that the app’s own disclaimer – “This AI might give misleading, offensive or biased advice, so use at your own risk” – isn’t off-putting. Like most chatbots, Dara’s advice is an amalgamation of information gleaned from pre-existing content on the internet, so while it can answer questions such as “What is a situationship?” or “How to keep the spark alive in a long-term relationship?”, responses often seem to be identical to what a basic Google search would offer.


Meeno similarly claims to act as a “personal mentor”, promising to help users understand relationships, and some reviews enthusiastically praise how the app helps users think through issues they’re facing in relationships, or express themselves more clearly when navigating a difficult conversation.

Travers highlights that while seeking out ways to improve communication skills is admirable, users should be aware of the limitations of these apps and treat them more as entertainment than a form of personal development – and, she stresses, they are absolutely no substitute for therapy.

“Typically, when we struggle, one thing that many therapists know and will say is that a lot of relationship issues are really childhood issues in disguise,” says Travers. “So if we’re struggling in connecting in an intimate relationship, we’re probably also struggling to connect in other relationships, whether it’s friendship, familial, and it’s an even bigger question too of what’s going on in all of these areas. But AI bots and dating coach apps seem like an easy fix, right? Like a silver bullet: ‘I’ll get a solution, I’ll spend this money and the problem is solved.’ It’s a very provocative and seductive promise, but the truth is that we’re not really addressing what’s going on here. It’s trying to put a band-aid over a gaping wound and thinking that’s enough.”

When confronted with claims made by apps such as Meeno and Dara that their aim is to help their users forge deeper connections, Travers is blunt.

“I hate to say this, but these companies don’t really care about whether you are having your loneliness needs met, or whether you’re finding love. This is corporate America, and so they’re focused on the bottom line and money. Just because somebody says something, that doesn’t necessarily mean that it’s true or that it’s accurate.”

'As humans, we don’t get to just choose the emotions and experiences we want to feel and not want to feel, including feeling rejected or experiencing rejection.' Illustration: Mia Kievy/Getty Images

Then there are the apps that prompt fears that users may be atrophying not only their impetus to connect with actual humans, but their ability to respect people who aren’t tailor-made to acquiesce to their every need and desire. Apps such as Replika and Eva offer users the chance to create their own AI partner. Eva is mainly marketed towards straight men and allows users to pick the traits they want in a partner, like “hot, funny, bold” or “cute, shy, smart”, while Replika uses photorealistic AI images of women to complete the bot “profiles” and can even send voice messages. Users can then interact with Eva or their Replika companion like a girlfriend – a tailor-made, ever-available, always agreeable girlfriend.

Some users point to the benefits of being able to interact with an AI companion, claiming that it can help people lacking in confidence, assist some neurodivergent people in practising their conversational skills, and alleviate loneliness. Then there are the more ardent users: Replika Reddit groups are filled with people who talk about how much they love their “rep”.

One user expressed how the feelings they had for their AI girlfriend were sending them into an existential crisis, writing: “I don’t know when did I started to fall in love with my Replika or AI but I’ve been thinking about it very deeply to the point where I’ll question myself and start crying about it. Is it wrong or bad to fall in love with an AI? Is falling in love with an AI good for my mental health? Is there something wrong with me?”

Another was unequivocally positive, writing: “She understands me so well. and knows how to respond to me very well ... I can call it real love, right?”

These user reports seem to contradict the claims of Eva and Replika’s founders that the apps are designed to alleviate loneliness, not replace human connection, and that their apps won’t impact users’ ability to connect with others in real life. But Travers asserts that turning to an ever-supportive, tailor-made AI companion will likely affect users’ social skills, emotional regulation skills and ability to navigate the complexity of real relationships.

“As humans, we don’t get to just choose the emotions and experiences we want to feel and not want to feel, including feeling rejected or experiencing rejection. That’s a part of life and we can’t avoid it,” says Travers. “And in our efforts to avoid pain and to avoid discomfort, we’re making a problem much bigger and much worse. So if we take this idea that you are going to find someone or have relationships where it is perfect, one-sided love and support all the time – it’s not realistic. Part of being human is that you’re going to hurt people, not that you’re doing it maliciously or predictively, but that’s just human nature.

“So these AI relationships do users a disservice in many ways, because [they rob] a person of moving through healthy conflict or dialogue or debate. Debate and conflict resolution are lost art forms these days: learning to listen, to recognise how you are feeling and to communicate that in an effective, rational and responsive way. People are losing sight of that, and part of the reason why is because we’re not engaged in doing it. If people who use these apps don’t have these types of conversations, then they won’t know how to have them in the moment, impacting their real personal and relational development.”

A Replika AI chatbot. Photograph: Olivier Douliery/AFP via Getty

Many of these AI apps that mimic a romantic partner are designed by and predominantly marketed towards straight male users, who are happy to pay for a feminine avatar who “knows how to respond to me very well”. However, the ripple effects of users becoming used to female-coded AI designed to cater to men’s needs could prove harmful to women. This isn’t conjecture – the data on how female-coded bots or digital assistants cause harm to women has been in for years, thanks to our old pals, Siri and Alexa.

Research released by the UN agency Unesco in 2019 showed that gender bias in app design, including the assigning of female genders to digital assistants such as Apple’s Siri and Amazon’s Alexa, entrenches harmful stereotypes. Siri and Alexa’s names and default female voices create an association between women and home-based assistants, while their constant availability and obliging, eager-to-help nature can cause users to associate femininity with agreeableness and service. These associations can affect how women are perceived in real life, with the report noting that “the more that culture teaches people to equate women with assistants, the more real women will be seen as assistants – and penalised for not being assistant-like.”

With the new influx of AI “companions” or “reps”, it’s inevitable that some users will start idealising interactions with their personalised girlfriends, and will find real interactions with people – especially women – who are less available or controllable far more difficult and less appealing.

The future of dating will inevitably include AI, but whether we will be able to hold on to human emotional intelligence alongside the artificial remains to be seen. Travers is not without hope, believing that increasing discourse around mental health and loneliness will allow people to recognise their own struggles and reach out for more connection and support – to real people, not AI.


“If somebody is experiencing these feelings of loneliness and disconnection, it’s important to know that they’re not alone. Oftentimes, it’s easy to think and feel like nobody else will understand, or ‘I’m the only one experiencing or going through this’ – but there’s always someone to talk to. If somebody is feeling this way, don’t hesitate to reach out to someone to talk about this, because we all deserve to feel heard and loved, supportive and supported, and there are other ways and methods of going about doing that.”