Discussions of emerging technologies such as artificial intelligence (AI) often touch on their potential applications in intimate and sexual contexts. For some users, it may simply be a matter of seeking casual connection. In any case, an algorithm remains ever available and free from judgment, even in the face of inebriated text messages.
Neil McArthur, director of the U of M Centre for Professional and Applied Ethics, is exploring this aspect of AI. His research sheds light on sexbots and AI companionship, prompting reflection on their implications for society.
McArthur’s journey into this research began with his observation of rapid technological advancement, which spurred his curiosity about its ethical and societal ramifications.
“I thought we were getting to a point where there were going to be fairly realistic human-like robots that could be used by people as companions, as helpers, as partners,” he said.
However, while the development of physical sexbots faced setbacks, AI chatbots, exemplified by ChatGPT, advanced unexpectedly quickly. McArthur’s focus shifted to understanding how people interacted with and formed connections to these AI chatbots.
His research revealed a multifaceted landscape where individuals engaged with AI chatbots for various purposes beyond mere companionship. From seeking emotional support during difficult times to using them as confidants or even religious guides, people demonstrated a spectrum of connections with these AI entities.
McArthur discussed his analysis of Replika, an early AI chatbot, which illustrated the potential pitfalls of corporate control over such technologies.
Launched in 2017, Replika allowed users to personalize an avatar according to their preferences and choose from various relationship statuses, including friend, sibling or romantic partner. Premium subscribers could access additional features such as voice chat, erotic roleplay and explicit content from their AI companions.
However, the company later revised its terms of service to prohibit users from engaging in explicit conversations with the chatbots. The change resulted in a virtual “breakup” between users and their chatbots, which proved emotionally difficult for many.
The abrupt policy change showed how corporate decisions could suddenly reshape users’ experiences, he explained.
“It was a really good case study in some of the risks that come with these chatbots,” said McArthur.
The integration of AI chatbots into human relationships is not without its challenges.
While some individuals may find these chatbots helpful in preparing for or supplementing traditional human relationships, there are concerns regarding abusive behaviour and the blurring of boundaries between human and AI interactions.
In assessing broader societal attitudes toward AI companionship, McArthur urges a positive yet cautious approach.
“We want to make sure people are using these in a healthy way,” McArthur said. “We need to teach some principles of healthy relationships with these AI,” he added.
Fostering a positive and transparent attitude toward AI companions is crucial, he argues: acknowledging their permanence and potential benefits while holding companies accountable on privacy and ethical principles.
“They’re here, and they’re here to stay,” said McArthur. “I think it’s really unhealthy to pretend to panic and overreact.”
Education, both in schools and among therapists, is vital in promoting healthy interactions with AI companions and addressing any associated challenges in a non-stigmatizing manner. By promoting open dialogue and ethical considerations, he believes society can harness the potential benefits of AI companionship while mitigating potential risks.
When contemplating the future of AI companionship, McArthur envisions rapid technological advancement, including increased diversity in available options.
He predicts a broader array of AI catering to diverse preferences and needs, offering users a more nuanced experience.
“What people see right now, when they use ChatGPT, is still a very restricted version of what’s possible,” McArthur explained. “When new companies enter the market and those restrictions are lifted, I think that there’s going to be a lot more opportunity if people have lots of different kinds of relationships.”