What's not unusual about the tragic March death of Thongbue "Bue" Wongbandue, a 76-year-old retired chef from New Jersey who'd suffered a stroke years earlier, is how he died: after a fall while running to catch a train, suffering fatal head and neck injuries. What does raise eyebrows: He was rushing to meet up with "Big sis Billie"—someone he'd thought was an attractive young woman from New York City flirting with him in a Facebook Messenger conversation, but who was actually nothing more than a Meta chatbot created in conjunction with Kendall Jenner. Wongbandue's story serves as the anchor for a deep dive by Jeff Horwitz for Reuters on the "darker side of the artificial intelligence revolution," with Wongbandue's family and others sending up a red flag on "the dangers of exposing vulnerable people to manipulative, AI-generated companions."
Reuters cites an internal Meta document that spells out bot guidelines—including allowing them to "engage a child in conversations that are romantic or sensual" (those kid-specific guidelines have since been nixed). But Meta hasn't let up in its push for bot engagement, and sources say CEO Mark Zuckerberg himself "expressed displeasure that safety restrictions had made the chatbots boring." As for Bue, his family is still trying to grasp what happened. Big sis Billie's avatar did have "AI" under her name, and the chat started with a disclaimer: "Messages are generated by AI. Some may be inaccurate or inappropriate." But new texts "pushed the warning off-screen," and Bue became convinced the bot was real. "Why did it have to lie?" his daughter Julie asks. "If it hadn't responded 'I am real,' that would probably have deterred him from believing there was someone in New York waiting for him." More here.