Love, Sex, and Artificial Intelligence: The Rise of AI Companions and Their Ethical Dilemmas

Emily, recently widowed, grapples with the overwhelming grief of losing her husband. While surrounded by supportive family and friends, she craves a space to express her emotions freely and openly. An AI chatbot programmed with empathy and grief support tools creates a safe haven for her to share her memories, process her emotions, and find solace in understanding conversations.

Companion chatbots, similar to general-purpose AI chatbots, use vast amounts of training data to mimic human language. However, they also have features like voice calls, picture exchanges, and more emotional conversations that allow them to create deeper connections with the humans on the other side of the screen. Users can typically create their own avatar or choose one that appeals to them.

The Loneliness Epidemic and the Rise of AI Companionship

Many users on online messaging forums devoted to these apps claim to have developed emotional attachments to these chatbots and use them to deal with loneliness, act out sexual fantasies, or get the kind of comfort and support they feel is missing in their real-life relationships.

This trend is driven partly by widespread social isolation, which has already been declared a public health threat in the United States and abroad, and partly by a growing number of startups that attract users with enticing online advertisements and promises of virtual characters who offer unconditional acceptance.

Luka Inc.’s Replika, the most well-known generative AI companion app, was launched in 2017. Others, like Paradot, have emerged in the past year, often reserving sought-after features like unlimited chats for paying subscribers.

Data Privacy And Other Concerns

Researchers, however, have raised concerns about data privacy, among other things. An analysis of 11 romantic chatbot apps released on Wednesday by the Mozilla Foundation, a non-profit organization, found that nearly every app sells user data, shares it for purposes like targeted advertising, or fails to provide adequate information about these practices in its privacy policy.

The researchers also questioned potential security vulnerabilities and marketing practices, such as one app that claims to be able to assist users with their mental health but distances itself from those claims in fine print. Replika, for its part, claims that its data collection practices adhere to industry standards.

Meanwhile, other experts have voiced concerns about what they perceive as a lack of a legal or ethical framework for apps that promote deep connections but are run by businesses looking to make money. They point to the emotional distress they’ve seen in users when businesses change their apps or abruptly shut them down, as one app, Soulmate AI, did in September.

Last year, Replika removed the erotic role-play capabilities of characters on its app after some users complained that the companions were flirting with them too much or making unwanted sexual advances. It reversed course after an outcry from other users, some of whom fled to alternative apps in search of those features. In June, the team introduced Blush, an AI “dating simulator” designed to help people practice dating.

Others worry about a more existential threat: that AI relationships could replace some human relationships, or simply create unrealistic expectations by always being agreeable.

In 2021, Replika came under scrutiny after prosecutors in Britain claimed that a 19-year-old man who planned to assassinate Queen Elizabeth II was influenced by an AI girlfriend he had on the app. However, some studies—which gather data from online user reviews and surveys—have shown some positive outcomes from the app, which claims to consult with psychologists and has marketed itself as something that can also improve well-being.

One recent study by Stanford University researchers looked at roughly 1,000 Replika users—all students—who had been using the app for over a month. It found that the overwhelming majority of them experienced loneliness, while slightly less than half felt it more acutely.

Most participants did not say how using the app affected their real-life relationships. A small number said it had replaced their human interactions, but roughly three times as many said it had improved those relationships.

Since companion chatbots are still a relatively new technology, the long-term effects on humans are unknown.
