PARIS: Artificial intelligence (AI) is full of surprises. American influencer Caryn Marjorie has found a most unusual use for the technology: creating a virtual clone of herself to keep up with her many fans.
Her chatbot, named CarynAI, allows fans to chat with a virtual version of the influencer. The aim, she says, is to help soothe lonely souls.
On Twitter, the 23-year-old explains her approach: “Men are told to suppress their emotions, hide their masculinity, and to not talk about issues they are having. I vow to fix this with CarynAI.
“I have worked with the world’s leading psychologists to seamlessly add CBT and DBT within chats. This will help undo trauma, rebuild physical and emotional confidence, and rebuild what has been taken away by the pandemic.”
The chatbot was designed in collaboration with the start-up Forever Voices, known for its AI voice chatbots that mimic celebrities.
CarynAI drew on hours of the influencer’s content available on YouTube, with the help of OpenAI’s GPT-4 technology, to replicate Caryn Marjorie’s voice and character.
In total, the company spent 2,000 hours designing and coding the chatbot, in order to offer followers the most realistic conversational experience.
But this service is not free. To enjoy these chats, users have to pay US$1 a minute.
With more than 1.8 million followers on Snapchat and 300,000 on Instagram, Caryn Marjorie is best known for her humorous vlogs, her storytimes and her videos on YouNow.
Her chatbot went viral and was soon picked up by several American media outlets, drawing mostly criticism. The influencer says she has been the target of sexist comments and death threats since CarynAI’s launch.
On Twitter, some users questioned her approach. One tweet in reaction to the influencer’s idea reads: “I don’t think making men dependent on AI for their sanity is a wise direction to move to. It would stymie their ability to interact with women in real life.”
“Charging $1 per minute? Let’s not pretend this is about curing loneliness,” reads another message.
Another source of controversy is that conversations with the chatbot quickly, and even primarily, take a sexual or erotic turn, an aspect that several internet users, and even media outlets like Vice, have not hesitated to test for themselves.
For their part, the creators maintain that the voice chatbot cannot engage in “sexually explicit” conversations. Some customers contest this, saying the clone will engage in such exchanges if the prompts are worded carefully enough.
Despite the criticism, the AI generated almost US$100,000 in its first week, according to reporting by Fortune.
So far, more than 1,000 people have signed up for the voice chatbot, and that number is still growing. On Twitter, Caryn Marjorie claimed to have “20,000 boyfriends” on May 20.
CarynAI is not the first chatbot of its kind. In China, Xiaoice offered users a “virtual boyfriend” in 2021, while the US company Replika has built AI-based emotional support companions.