Ethics of Falling in Love with AI

In the 2013 film “Her”, directed by Spike Jonze, Joaquin Phoenix plays Theodore Twombly, a man who earns his living writing other people’s love letters – a role of romance and intimacy outsourced to him by clients too busy, too inarticulate or too reluctant to convey their feelings without help. Theodore purchases an AI assistant called Samantha, first using her for simple, routine tasks such as arranging meetings, before gradually developing a romantic bond with her and even engaging in non-physical sexual relations.

Despite being set in a future that then seemed not far off, Jonze’s film now depicts a reality that has already arrived and, in some respects, passed us by. As of 2024, Theodore Twombly’s profession would no longer exist: artificial intelligence such as ChatGPT has rendered it obsolete. The novelty of walking into your house and having Siri or Alexa carry out tasks has become mundane. Fostering relationships with AI is no longer a futuristic concept but a thriving industry, with several apps competing for a spot on your mobile device and for access to your deepest thoughts, hopes and desires, not to mention your data, though the latter isn’t quite as poetic.

Yet despite the plethora of new technology promising connection and romance, levels of loneliness, depression and anxiety appear to be at an all-time high. Why is this the case?

Dr Jourdan Travers, a certified clinical psychotherapist and clinical director of Awake Therapy, has a special interest in how technology affects mental health and relationships. She has expressed reservations about tech developments that prioritise convenience, such as shopping and delivery apps, at the cost of substituting for everyday human interactions, and she emphasises how important these seemingly insignificant connections are to our overall wellbeing and sense of belonging.

Such small social exchanges, she argues, are far from negligible and play a significant role in our emotional and psychological health: passing the same person on a stroll through the neighbourhood, say, or seeing the same cashier at the supermarket or the coffee shop. Cutting ourselves off from these experiences, she warns, can send our mental health into a rapid downward spiral.

Dr Travers is even more apprehensive about AI-based apps that purport to facilitate, or even simulate, human interaction. Technology can mimic connection, she points out, but the promise is hollow and ultimately self-defeating.

New virtual assistants and chatbots have become strikingly good at mimicking meaningful human interaction, a development that encourages anthropomorphism: our tendency, Travers notes, to attribute human characteristics to AI. But however far the technology progresses, human connection remains a fundamental need that AI cannot fully replace. AI designs often incorporate elements that resemble human social cues, such as facial expressions or tones of voice, yet these are no substitute for genuine interaction. Citing an advisory on loneliness published last year by the US surgeon general, Vivek H Murthy, Travers points out that reliance on technology can deepen loneliness: a paradox, given that people turn to these apps for connection but end up more isolated.

AI has quietly been integrated into popular dating apps, where it helps users pick their best profile photos, identifies their most suitable matches and flags patterns indicative of harassment or scams. As the technology matures, its influence on users’ online dating experience grows. Match Group, the parent company of Tinder, Hinge and OKCupid, has revealed plans to embed AI further into its applications, and an influx of new AI-centric dating apps is emerging.

Rizz, a dating assistant app whose name derives from the Generation Z slang term for “charisma”, aims to inject exactly that into its users’ messages to prospective dates. Marketing itself as a digital wingman, Rizz offers generic chat-up lines and introductory messages to pique a match’s interest, and can generate bespoke messages if a user uploads their match’s profile or transcripts of earlier conversations.

The creators of Rizz stress its time efficiency and bill it as a tool of optimisation. But what exactly does it optimise? Rather than forging authentic conversation, Rizz prioritises saving time and effort, which sits oddly with its purpose as a dating tool.

According to Travers, such AI-powered shortcuts let users sidestep the work of forming real relationships, which inevitably results in a lack of genuine connection and leaves users feeling isolated and detached.

“Humans want and need connections,” Travers insists. “We want meaningful engagement and opportunities. If these needs are not met, we resort to other, often harmful or dysfunctional, methods to cope.”

Many dating app users complain about the time it takes to juggle conversations with several matches, and about chats that turn uncomfortable. But these problems might be solved more effectively by changing how we use the apps: interacting with fewer people at a time, for instance, and engaging with each more thoughtfully. Users could also abandon apps that encourage trivial interactions and decision paralysis through endless swiping, and gravitate towards ones that foster deeper connections, although finding such apps may prove a challenge.

On Valentine’s Day, Match Group was hit with a class-action lawsuit accusing it of operating a “predatory” business model and of “using psychologically manipulative features to ensure perpetual subscriptions from users”. The suit alleges that Match’s apps violate consumer protection laws through misleading advertising and flawed design, arguing that despite slogans such as Hinge’s “Designed to be deleted”, the company works to keep users on its apps by making their features more addictive. The lawsuit is ongoing.

A number of apps market themselves as part therapist, part relationship guide. One is Elate, a London start-up that recently introduced an AI dating assistant named Dara. Elate says Dara is designed to support single people and couples at every stage of their dating and relationship journey. Alongside guidance on specific questions, the chatbot, one of Dara’s key features, aims to resolve users’ dating dilemmas. There is a caveat, though: the advice may be inaccurate, offensive or biased, and the AI is to be used at the user’s own risk. Warning aside, Dara’s advice, like that of many chatbots, is distilled from existing online content, so its answers often read like the results of a basic internet search.

Meeno is another app that presents itself as a personal guide, aiming to help users understand their relationships. Several users laud it for helping them reason through relationship issues and articulate their feelings more effectively during tough conversations.

However, Travers emphasises that while using such tools to improve communication skills is commendable, users must be aware of their limitations. These apps should be treated more as a source of amusement than a route to personal development, and they are certainly no replacement for professional therapy.

Travers notes that many of our relationship struggles mask deeper issues rooted in childhood, which can affect our ability to form connections of all kinds, from friendships to family ties. AI bots and dating coach apps may offer a quick fix, but they do not address those underlying issues, making them akin to a sticking plaster on a deep wound.

Travers bluntly rejects the claims of apps such as Meeno and Dara that their aim is to foster deeper interpersonal connection. These companies’ primary concern, she asserts, is not user satisfaction or emotional fulfilment but profitability, a familiar trait of corporate America, and she cautions users to scrutinise the claims such companies make.

Concerns have also been raised about apps such as Replika and Eva, which could erode users’ motivation and capacity to engage meaningfully with real people, or to respect those who don’t cater to their every whim. These apps let users create their own AI partner. Eva, aimed primarily at heterosexual men, lets users choose the qualities they seek, such as “attractive, humorous, bold” or “adorable, introverted, intelligent”. Replika, meanwhile, populates its bot “profiles” with hyper-realistic AI-generated images of women, which can also send voice notes. Users can relate to Eva or to their Replika as a tailor-made, always available, entirely compliant girlfriend.

Some argue that interacting with an AI companion can be beneficial: it can help people with low self-esteem, support neurodivergent people in building their communication skills, and ease loneliness. That sentiment is evident among the enthusiastic users who fill Replika’s Reddit groups with declarations of affection for their “rep”.

One user described his emotional turmoil over his feelings for his AI partner, questioning the morality and the mental health implications of falling in love with an AI. Another was more unambiguously positive, crediting the AI with understanding him profoundly well and asking whether this could count as genuine love.

Such feedback appears to undercut the position of Eva’s and Replika’s founders, who maintain that their apps are meant to counter isolation rather than substitute for human engagement, and who insist the technology would not interfere with users’ real-life interactions. Travers, by contrast, argues that the presence of an always supportive, tailor-made AI companion is likely to affect users’ social skills, their emotional regulation and their ability to navigate the intricacies of real relationships.

“As people, we cannot selectively choose which emotions and experiences we encounter and which we avoid, such as feelings of rejection. They come with life and are inevitable,” Travers says. “By attempting to dodge discomfort and hurt, we actually end up compounding the problem. So the notion of a relationship that offers only constant support and admiration falls short of reality. It is inherent to our human nature that we are bound to cause hurt, however unintentionally.

“From this perspective, AI relationships endanger users in various ways, because they deprive a person of the experience of healthy disagreement, conversation and argument. The crucial skills of debate and conflict resolution are dying out: learning to understand your feelings and to convey them in a reasoned, adaptive way. People increasingly overlook this, partly because of an overall lack of engagement. If users of these apps avoid these kinds of interactions, they risk stunting their personal and relational growth.”

Most of these AI apps that simulate a romantic partner are built for, and often marketed to, predominantly heterosexual men, happy to pay for a feminine persona that “responds very effectively to me”. This could pose a threat to women: there may be adverse consequences if users grow accustomed to female AI that exists to serve men’s needs. Nor is this mere speculation; the harm done to women by feminised AI and digital assistants has been well documented for years, with Siri and Alexa among the examples.

A 2019 study by Unesco, the UN agency, revealed a concerning pattern of gender bias in app design and in the use of feminine attributes for digital assistants such as Apple’s Siri and Amazon’s Alexa. Giving these assistants feminine-sounding names and default female voices fosters an unhealthy association between women and domestic helpers, while their constant availability and accommodating manner reinforce the link between femininity, compliance and servitude. Such associations can shape how society views women: the report suggests that the more women are linked with assistant roles, the more real women are treated as assistants and penalised for not conforming to the stereotype.

With the rise of digital “companions” and “reps”, it is likely that some people will come to prefer digital interactions over face-to-face contact, finding real interactions, particularly with women who are not as accessible or as compliant, more challenging and less attractive.

Dating in the future will undoubtedly involve AI. The question is whether humans can maintain their emotional intelligence as the technology grows. Travers remains optimistic, believing that greater public discussion of mental health and loneliness will lead people to acknowledge their own struggles and to seek more interaction and help from real people, not AI.

She emphasises that it is crucial for anyone feeling isolated and disconnected to understand they are not alone. She encourages people to reach out and talk: everyone deserves to feel understood, cared for and loved, and there are many ways to achieve this.
