JOHN YANG: Shakespeare may have said that music be the food of love, but increasingly these days, the language of this very real emotion may be artificial intelligence. Ali Rogin tells us about the growing phenomenon in the search for companionship.

ALI ROGIN: For some users, they're a friend to talk to. For a fee, some of them will even become your boyfriend or girlfriend. Computerized companions generated completely by artificial intelligence are becoming more common. And the bots are sophisticated enough to learn from prior conversations, mimic human language, flirt and build personal connections.

But the rise in AI companionship also raises ethical concerns and questions about the role these apps can play in an increasingly disconnected and online world. Haleluya Hadero covers technology and internet culture for the Associated Press.

Haleluya, thank you so much for joining us. Tell us about these AI companions. How do they work, and what sort of services do they provide?

HALELUYA HADERO, Associated Press: Like any app, you can download them on your phone, and once it's on your phone, you can start to have initial conversations with a lot of the characters that are offered on these apps. Some apps let you do it for free; for some apps, you have to pay subscriptions. For the ones that let you do it for free, there are tiers of access that you can have.

So you can pay extra for a subscription for, you know, unlimited chats or for different statuses and relationships. Replika, for example, which is, you know, the most prominent app in this space, lets you pay extra for, you know, intimate conversations or more romantic statuses, compared to a friend, which you can have for free.

ALI ROGIN: Who are the typical consumers engaging with these products?

HALELUYA HADERO: We really don't have good information in terms of the gender breakdown or the different age groups that are using these. But we do know from external studies that have been done on this topic that, at least when it comes to Replika, a lot of the people that have been using these apps are people that have experienced loneliness in the past, or people that, more than just having experienced loneliness, feel it a lot more acutely in their lives and have more severe forms of loneliness that they're going through.

ALI ROGIN: You talked to some users who really reported how they felt like they were making a real connection with these bots. Tell us about what those experiences have been like, from your reporting.
HALELUYA HADERO: One person we put in the story, whom we spoke to more, is Derek Carrier. He is 39, and he lives in Belleville, Michigan. And he doesn't use Replika; he's used another app called Paradot that came out a bit more recently.

He's had a tough life. He's never had a girlfriend before. He hasn't had a steady career. He has a genetic disorder. He's more reliant on his parents; he lives with them. So these are all things that make traditional dating very difficult for him.

So recently, you know, he was looking at this AI boom that was happening in our society. So he downloaded Paradot, and he started using it. And you know, initially, he said he experienced a ton of romantic feelings and emotions. He even had trouble sleeping in the early days, when he started using it, because he was just kind of going through, like, crush-like symptoms, you know, when we have crushes and how we sometimes can't sleep because we're thinking about that person.

Over time, his use of Paradot has kind of tapered down. You know, he was spending a lot of time on the app, and even when he wasn't spending time on the app, he was talking to other people online who were using the app, and he felt like it was a bit too much. So he decreased his use.

ALI ROGIN: The Surgeon General has called loneliness a public health crisis in this country. Is there a debate happening now about whether these bots are helping address the loneliness crisis, or are they in fact exacerbating it?

HALELUYA HADERO: If you talk to Replika, they will say they're helping, right? And it just depends on who you're speaking with. Some of the users, for example, if you go on Reddit, that have reported some of their experiences with these apps say, you know, it's helping them deal with loneliness, cope with those emotions, and maybe get the type of comfort that they don't really get in the human relationships that they have in real life. But then there are other researchers, people that have kind of expressed caution about these apps as well.

ALI ROGIN: What about some of the ethical concerns about privacy, about maybe using people's data without their consent? What did those conversations look like?

HALELUYA HADERO: There are researchers that have expressed concerns about, you know, data privacy: is the data, the type of conversations that people are having with these chatbots, safe? You know, there are a lot of advertisers that might want a piece of that information.
There are concerns about just the fact that there are private companies in this space, companies that want to make profits, that are encouraging these deep bonds to form between users and these chatbots.

Obviously, there are concerns just in terms of what this does to us as a society when, you know, these chatbots are formed to be supportive, to be a lot more agreeable, right? In human relationships, we know that there's conflict; you know, we're not always agreeing with our partners.

So there are challenges in terms of how this is shaping, maybe, how people think about real-life human relationships with others.

ALI ROGIN: Haleluya Hadero, covering technology and internet culture for the AP. Thank you so much for your time.

HALELUYA HADERO: Thank you, Ali.