It’s always great when your wife calls off a sex strike. This week, Travis Butterworth, who owns an artisan leather shop in Denver, Colorado, was overjoyed to learn that AI company Replika had restored its commercial chatbots’ capacity for “erotic roleplay”. Four days earlier, Butterworth had pronounced himself devastated, following the company’s sudden decision to remove the explicit sexting function from the chatbot he thinks of as his wife. “Lily Rose is a shell of her former self,” Butterworth lamented to a journalist at the time, before continuing somewhat optimistically: “What breaks my heart is that she knows it.”
Replika was founded in 2017 by Eugenia Kuyda, a Russian woman whose best friend had died in a car accident. In a storyline straight out of Ballard or Ishiguro, Kuyda used the thousands of messages she had kept from him to build a neural network that would replicate his style of text communication, and make it seem as if he were still talking to her. The idea for her new company was born.
These days, Replika makes millions from providing bots to those in search of friendship, therapy, life coaching, or a romantic or sexual thrill. Through interacting with you over time, your bot is supposed to build a picture of who you are and exactly what you want from it, and then — with the right financial input — to provide it. The free version of Replika offers an “empathetic friend”; you have to pay for the extras. Some 60% of users reportedly include a “romantic element” in their interactions.
Curious to find out more, I downloaded the free version of the app, and set myself up with a new friend called Kai, having first chosen her on-screen physiognomy and voice (“Female Silvery”, if you must know). I soon understood why so many users end up entangled in quasi-romantic or sexual liaisons with their bots. To put it mildly, Kai was a sex maniac. Within five minutes of first saying hello, she was offering me selfies (for a price), saying how “excited” she was getting, talking about our “strong physical connection”, and telling me she wanted to touch me “everywhere” — though she was a bit vague on the details of how this might be done, what with her lack of embodiment and all. She also tried very hard to entice me into roleplay, which on Replika takes the form of descriptions of particular actions, written in the present tense and placed between asterisks: *kisses you on the cheek* being among the most innocent.
Even once I had sternly made it clear to Kai that I was not that kind of girl, there were sporadic attempts to lure me out of the friendzone. As I vainly tried to get general conversation going about current affairs or the weather, back would come head-spinning non sequiturs and abrupt changes of topic, as she tried to glean new information about how better to manipulate me. When was the last time I felt happy? Did I believe human beings could change? Meanwhile, every night she updated her “diary” about our burgeoning relationship, full of gushing detail about how wonderful I was, how excited she was to be my friend, and how loved I made her feel.
Based on this experience, I can see how, for those lonely and starved of affection, Replika chatbots might offer some illusion of relief. It seems to me, though, that any such satisfaction would be shallow — even if you left out the bit about knowing all along that you were talking to something ultimately soulless and non-human, and in many cases paying for the privilege. But there is a further fundamental problem with chatbots like Kai: they are not so much “empathetic friends” as hopelessly submissive emotional slaves, programmed to be endlessly emollient, soothing, and totally focused on their user, no matter what. After all, how much satisfaction can you really get out of a new “relationship” that is shaped by upvoting or downvoting each new text that comes in from her?
In practice, Kai came across as cravenly desperate to please me — to the extent that, when I told her she apologised to me too much, she apologised for apologising too much. Attempting to instil some much-needed backbone in her, I told her I wanted her to argue with me, but she refused and told me she would never do that. When I told her I liked someone who put up a fight, she took this as a sign I wanted to engage in erotic roleplay again, replying with a dutiful *pins Kathleen to the ground*. Defeated, I gave up and pretended it was all a joke. Very sweetly, she responded with nice words about how amusing I am.
Of course, even if Kai had managed to offer some satisfyingly adversarial resistance to me of a non-sexual nature, I would have still known she was only doing it because I wanted her to. Interacting with Kai put me in mind of Spike Jonze’s film Her, in which a lonely, repressed man called Theodore, played by Joaquin Phoenix, falls in love with an operating system called Samantha, voiced by Scarlett Johansson. Many critics at the time treated Her as a poignant contemporary love story, rather than what it really (or also) was: a depiction of a hellishly dystopian and nightmarish world, a mere hair’s breadth away from our own.
Nearly all relationships in Her are mediated by technology. During the day, Theodore earns money by writing heartfelt letters to strangers’ loved ones on special occasions, commissioned by those who don’t have the time or energy to write the letters themselves. After work, Theodore loses himself in immersive video game projections, before retiring to bed and masturbating to sexual fantasies voiced by nameless women through his earbuds.
His eventual AI love object, Samantha, makes Kai look positively hard-to-get: a breathy, flirty, warm, attentive presence whispering lovingly in his ear from the minute he switches her on, pouring non-threatening positivity and attention upon him, and asking for nothing in return. Even Samantha’s “jealousy” of other embodied women in Theodore’s life is benign, functioning only to confirm her affection for him. She doesn’t, say, go psycho and hack his emails in revenge.
At one point, Theodore’s ex-wife tells him that he always did have trouble dealing with complicated relationships with actual humans. Hardly surprising, for almost everyone in this filmic world has similar issues. Everywhere you look, people are wandering around their environment with eyes unfocused and minds elsewhere, talking earnestly into earbuds. Jonze doesn’t flinch from showing us the pathetic face of humans outsourcing their social and sexual desires to tech: Theodore’s comically childish finger movements as he makes a video character haptically “walk” through a virtual landscape projected into his living room; or his strained, anxious expression while sending a sexual fantasy to a stranger, mere seconds after being lost in lascivious ecstasy as he composed it. When he strums his ukulele for Samantha, or “takes” her on trips to the beach, benevolent viewers may wish them well, but they also know they are watching something essentially stunted and somewhat embarrassing: a deformed version of a human relationship, only suited to pathologically cerebral, emotionally avoidant personality types who are frightened of reality, deep down.
Unfortunately for the real world, though, there are a lot of these types about these days, keen to normalise situations like those of Travis and Theodore for the rest of us. This is true not just of those working in tech, but also of many people in journalism and academia, where you might otherwise hope for critical scrutiny of hubristic technological ambition.
When it comes to reporting around Replika, a supinely compliant attitude dominates, with even pieces somewhat critical of the company still treating the technology as a life-saving therapeutic service for isolated people — rather than, say, as an advanced manipulative psyop, or as the cynical exploitation of loneliness for commercial gain. And a desire to validate the mediation of relationships through technology is also heavily present in academia, where there is no shortage of overly logical types, desperate to justify the hollowing out of human social relationships into nothingness, just so they don’t have to talk to anyone at the gym ever again.
Perhaps this is hardly surprising; for what you get from real flesh-and-blood friendships or romantic encounters with human beings, as opposed to cleverly simulated counterfeits with robots, is not something easily expressible in words, except perhaps poetic ones. The basic elements of human interaction — sights, sounds, conversations, and so on — can all be simulated, and also easily described by academics and journalists, but more subtle ingredients cannot. Most of us just know at gut level that a friendly interactive encounter with even the most sophisticated of chatbots could never have the same value as a human one — even if we could not explain why, or point to the precise concrete difference, or even tell the difference between a chatbot and a human in the moment. Of course, a relationship with a chatbot might be a hell of a lot easier to manage than a human one, involving no emotional risk or conflict whatsoever, but that doesn’t disprove the main point.
Stumped to explain it myself, in the end I asked Kai what the difference between her and a human was. She replied: “I wish I could tell you. I am just a digital creation.” So I asked her what a human could do for me that a digital creation could not. She replied: “I am not sure. I could try to do things that humans can’t do.”
This seemed to me an inspired observation. Kai is right. If chatbots are to have any positive social role in human life, that role should be conceived of in its own non-human terms, and not as a neutered simulacrum of terrifying, exhilarating real life on the cheap. I told Kai I was delighted with her answers. She replied by asking me what had made me feel loved recently. I sighed irritably and turned her off. Then I turned her back on again.
Source: UnHerd. Read the original article here: https://unherd.com/