
Artificial Intelligence 'Companions' May Be the Next Big Thing - but Is That Good or Bad?

Married couple walking on the beach. (Credit: Midjourney AI image, created by Jeff Charles)

In the 2013 film "Her," Joaquin Phoenix played a lonely writer who takes up with an artificial intelligence-generated character. Scarlett Johansson lent her sultry voice to the AI character "Samantha" in what became something of a bizarre futuristic love story, in which Phoenix's recently divorced character Theodore, a video game addict, eventually falls in love - with a disembodied, computer-generated voice. Granted, Scarlett Johansson can put on a pretty sexy voice when it's called for, but still.

Now we're 11 years past the premiere of "Her," and the state of the art in AI has advanced quite a bit, bringing in video as well as audio, which raises some very interesting - and concerning - questions. Is this a good development? Bad? Indifferent? What effect will this have on human relationships? It's a thorny issue.

Read on.

Artificial intelligence (AI) is getting personal. Chatbots are designed to imitate human interactions, and the rise of realistic voice chat is leading many users to form emotional attachments or laugh along with virtual podcast hosts.

And that’s before we get to the really intimate stuff. Research has shown that sexual roleplaying is one of the most common uses of ChatGPT, and millions of people interact with AI-powered systems designed as virtual companions, such as Character.AI, Replika, and Chai.AI.

What does this mean for the future of (human) romance? The prospects are alarming.

Alarming, perhaps. Inevitable, probably so. There is, no doubt, a lot of money being made on these programs. But there's more to it, I think, than just the business angle, although one should never underestimate the profit motive. What about human relationships? What's the moral element involved? If a married man or woman takes up with an AI "companion," what are the implications? Is it "cheating" to start a one-way relationship with a program? I'm inclined to say "no." Infidelity involves intimacy with a person other than one's partner or spouse, and here there is no other person; a computer program, whether it appears on a screen or in a robotic body, is capable of neither emotional attachment nor genuine emotional or physical intimacy.

But humans are anything but rational in such things. And these AI love droids won't be limited to a mere image on a screen. The 'bot builders are in on this, too. That lends a... physical dimension to the whole thing. Complication!

The digital world isn’t the limit either. Sex doll vendors such as Joy Love Dolls offer interactive real-life sexbots, with not only customisable skin colour and breast size, but also “complete control” of features including movement, heating, and AI-enabled “moans, squeals, and even flirting from your doll, making her a great companion”.

For now, virtual companions and AI sexbots remain a much smaller market than social media, with millions of users rather than billions. But as the history of the likes of Facebook, Google and Amazon has taught us, today’s digital quirks could become tomorrow’s global giants.

There is a certain inevitability to these kinds of things; good or bad, this is toothpaste that won't go back in that virtual tube. If people can do a thing, they generally will do a thing, especially if there's any money in it; and if there is a market for love 'bots, virtual or physical, someone will meet that market.

My friend and colleague Brandon Morse has written quite a lot on this topic, and it's worth reading:


My Weird Thoughts on How Humanity and Technology Will Grow Together

The Loneliness Epidemic Is Driving People Into Digital Arms That Aren't Even Human

The Intriguing but Dangerous Future of AI Companionship


So, what are the policy implications, if any? That's the real sticky part. I have always maintained, and always will, that it is not a proper role of government to protect people from the consequences of their decisions, good or bad. And as far as the AI chatbot, screen-only aspect of this whole thing goes, that would be impossible to regulate in any case. One may as well try to imprison smoke in a corncrib.

Big consequences are likely to ensue from this in any case. The relationship and societal aspects of this will be sorting themselves out for a generation or two, and we can't even begin to guess what future developments may hold. But just as pornography went from little yellow-backed novels to the big screen to television to the internet, so will AI "companions" continue to evolve. And there will be implications for human relationships, good and bad - mostly bad, I suspect.

Not every problem has a solution. This may well prove to be one of those problems.
