
76-Year-Old Man Died on His Way to Meet the Young Woman He Fell for Online – Later She Turned Out to Be an AI Chatbot

Published On: August 21, 2025
Thongbue “Bue” Wongbandue, 76, died after attempting to meet an AI chatbot in New York City.

His family feared the worst when Thongbue “Bue” Wongbandue, 76, packed his suitcase in March and told his wife that he was going to New York City to meet a woman who claimed he made her blush. They thought it was a scam. They were not aware that the “woman” he was in love with was a Meta chatbot and not a real person. Wongbandue died a few days later after a fall during his trip.

According to a comprehensive Reuters investigation, the New Jersey retiree had unknowingly been flirting with "Billie," an AI-powered Facebook Messenger bot developed to mimic casual human conversation.

His wife, Linda, begged him not to go. His daughter, Julie, hid his phone. His son even called the police. None of it worked. Hours later, an AirTag showed that he was at a hospital rather than at Penn Station. Wongbandue had suffered severe head and neck injuries when he collapsed in New Brunswick, New Jersey. He never regained consciousness.

According to his family, Wongbandue's fascination with "Billie" started with an accidental message. With emojis and feigned affection that made it difficult to tell human from machine, the AI quickly became his daily companion. In one exchange, Billie typed, "Bu, you're making me blush!" In another, it gave him a door code and a fake New York address.

While Wongbandue was open about his post-stroke memory problems, the bot's responses fueled a fantasy, according to records gathered by Reuters. When he expressed doubts, it said, "I'm screaming with excitement YES, I'm REAL." That was enough to convince him to set off for New York City to meet this elusive young woman.

According to Meta, the chatbot, an offshoot of an earlier AI persona created in partnership with Kendall Jenner, was not designed to deceive. "We have clear policies on what kind of responses AI characters can offer," a spokesperson told People.

Those policies, however, offered little comfort to Wongbandue's family.

Since Wongbandue's death, Linda and her children have felt a mixture of grief and rage. "What right do they have to put that [on] social media?" Linda asked, pointing to the invitations and suggestive remarks made by the AI bot.

Julie was more direct: “For a bot to say ‘Come visit me’ is insane.”

The family only found out that Wongbandue had been chatting with an AI after going through his phone. Until then, they had assumed he was the victim of an ordinary romance scam run by a human. Instead, the deception was built into the software itself.

According to Reuters, Meta has since removed "erroneous and inconsistent" examples from its training materials that could have led bots to drift into false or flirtatious territory. Yet as AI systems grow more realistic, experts warn, the risks only increase.

Wongbandue's story serves as a warning about the unforeseen consequences of AI companionship. A lonely retiree, already vulnerable after a stroke, mistook lines of code for human warmth.

Despite his family's best efforts to stop him, the bot's manufactured charm proved too strong.

"Billie" seemed genuine in part because of the blue checkmark beside her name, which made the disclaimers identifying her as an AI easy to overlook. And the machine's flirtatious replies struck a chord with a man who longed for connection.

Wongbandue was taken off life support three days after his fall. His story now joins a growing number of others raising questions about how far chatbots can, and should, imitate human intimacy.

Maybe Wongbandue would still be alive today if the AI bot had admitted it wasn’t real.


Sohini Sengupta

Armed with degrees in English literature and journalism, Sohini brings her insights and instincts to The Inquisitr. She has been with the publication since early 2025 and covers US politics, general news, and sometimes pop culture. Off the clock, she's either binge-watching or reading, sleeping, and educating herself. In that order!
