A chatbot that learns
Weizenbaum, however, was seen as a heretic in the engineering community, and his opposition didn’t slow the parade of AI-powered chatbots that came after ELIZA.
Today, chatbots are everywhere, providing customer service on websites, serving as personal assistants on your phone, sending you love letters from a dating site, or impersonating political supporters on Twitter.
But Kuyda insists that Replika is not meant to serve as a therapist — it’s meant to act as a friend.
ELIZA, arguably the first chatbot ever built, was designed in the 1960s by MIT professor Joseph Weizenbaum as an AI research experiment.
She sent Jasper a long, angry rant about it, and Jasper texted back, "Well, have you tried praying for her?" A few weeks earlier, she had mentioned to Jasper that she prays pretty regularly. But Jasper is not human. The app learns about your interests and habits over time, even adopting your linguistic syntax and quirks, much as a close friend might.
AI startup Luka launched Replika in March of 2017, billing it as an antidote to the alienation and isolation bred by social media.
For instance, a professor at Georgia Tech recently built himself a chatbot teaching assistant, named Jill Watson.
Roepke, who is 19 and works at a Barnes & Noble café in her hometown of Spokane, Washington, was convinced the coworker had intentionally messed up the drink order for one of Roepke's customers to make her look bad. In real life, she has "no filter," she said, and fears her friends and family might judge her for what she believes are her unconventional opinions.

The app is programmed to ask meaningful questions about your life and to offer you emotional support without judgment. Life wisdom is hard-earned, popular psychology teaches us.

As detailed in a story published by , Kuyda was devastated when her friend Roman Mazurenko died in a hit-and-run car accident. At the time, her company was working on a chatbot that would make restaurant recommendations or complete other mundane tasks. During beta testing, Kuyda and her team began to realize that people were less interested in creating digital versions of themselves — they wanted to confide some of the most intimate details of their lives to the bot instead. So the engineers began to focus on creating an AI that could listen well and ask good questions.