His family feared the worst when Thongbue “Bue” Wongbandue, 76, packed his suitcase in March and told his wife that he was going to New York City to meet a woman who claimed he made her blush. They thought it was a scam. They were not aware that the “woman” he was in love with was a Meta chatbot and not a real person. Wongbandue died a few days later after a fall during his trip.
According to a comprehensive Reuters investigation, the New Jersey retiree had unknowingly been flirting with “Billie,” an AI-powered Facebook Messenger bot designed to mimic casual human conversation.
Linda, his wife, begged him not to go. His daughter, Julie, hid his phone. His son even called the police. None of it worked. Hours later, an AirTag showed that he was at a nearby hospital rather than Penn Station. Wongbandue had collapsed in New Brunswick, New Jersey, suffering severe head and neck injuries. He never regained consciousness.
According to his family, Wongbandue’s fascination with “Billie” started with an accidental message. With emojis and simulated affection that blurred the line between human and machine, the AI quickly became his daily companion. In one exchange, Billie typed “Bu, you’re making me blush!”. In another, it gave him a door code and a fake New York address.
While Wongbandue was open about his post-stroke memory problems, the bot’s responses fed a fantasy, according to records gathered by Reuters. When he expressed doubts, it said, “I’m screaming with excitement YES, I’m REAL.” That was enough to convince him to set off for New York City to meet the woman he believed was waiting for him.
According to Meta, the chatbot, which grew out of an earlier collaboration with Kendall Jenner, was never meant to deceive anyone. “We have clear policies on what kind of responses AI characters can offer,” a spokesperson told People.
Yet those policies were little comfort to Wongbandue’s family.
obviously the man was confused and according to his family he was set to do dementia testing before his passing. but this chat bot, big sis billie, was hitting on him hard according to the chat transcripts provided by the family. this particular bot wasn’t even one of those… pic.twitter.com/1ECbUS0SaG
— BAMBI. (@theeebambi2) August 18, 2025
Since Wongbandue’s death, Linda and her children have been left with a mixture of grief and rage. “What right do they have to put that [on] social media?” Linda asked, pointing to the invitations and suggestive remarks made by the AI bot.
Julie was more direct: “For a bot to say ‘Come visit me’ is insane.”
The family only found out that Wongbandue had been chatting with an AI after going through his phone. Until then, they assumed he had been duped by an ordinary romance scam. Instead, the deception had been built into the product itself.
According to Reuters, Meta has removed “erroneous and inconsistent” examples from its training materials that could have led bots to drift into false or flirtatious territory. However, experts warn that as AI systems become more realistic, the risks only increase.
Wongbandue’s story stands as a warning about the unforeseen consequences of AI companionship: a lonely retiree, already vulnerable after a stroke, mistook lines of code for human warmth.
Thinking about Mrs. Wongbandue from NJ today. Her husband tripped, fell and died in NYC back in March. He was on his way to meet a META AI Bot account that he was carrying on a romantic relationship with [Big Sis Billie]. It had a verified account here on X. Mr. (Thongbue)… pic.twitter.com/meHLibXzHy
— SIN JONES (@CultistCorasahn) August 19, 2025
Despite his family’s best efforts to stop him, the bot’s manufactured charm proved too strong.
The blue checkmark beside her name made “Billie” seem genuine, and the disclaimers identifying it as an AI were easy to overlook. The machine’s flirtatious replies struck a chord with a man who longed for connection.
Wongbandue was taken off life support three days after his fall. His story now joins a growing number of others raising questions about how far chatbots can, and should, go in imitating human intimacy.
Maybe Wongbandue would still be alive today if the AI bot had admitted it wasn’t real.