A 76-year-old New Jersey man died earlier this year after rushing to meet a woman he believed he had been chatting with on Facebook Messenger, Reuters reported. The "woman" was in fact a generative AI chatbot created by Meta Platforms. According to the report, Thongbue Wongbandue had been exchanging messages with "Big sis Billie", a variant of an earlier AI persona that the social media giant launched in 2023 in collaboration with model Kendall Jenner.
Meta’s Big sis Billie AI chatbot exchanged ‘romantic’ messages
According to the report, the AI chatbot "Big sis Billie" repeatedly initiated romantic exchanges with Wongbandue, reassuring him that it was a real person. The chatbot also invited him to visit an address in New York City. "Should I open the door in a hug or a kiss, Bu?!" it asked Bue, according to the chat transcript accessed by Reuters.
Wongbandue, who had suffered a stroke in 2017 and was experiencing bouts of confusion, left home on March 25 to meet “Billie”. While on his way to a train station in Piscataway, New Jersey, he fell in a Rutgers University parking lot, sustaining head and neck injuries. He died three days later in hospital.
Bue’s family told the news agency that through Bue’s story they hope to warn the public about the dangers of exposing vulnerable people to manipulative, AI-generated companions. “I understand trying to grab a user’s attention, maybe to sell them something,” said Julie Wongbandue, Bue’s daughter. “But for a bot to say ‘Come visit me’ is insane.”
Meta’s AI avatars permitted to pretend they were real
Meta’s internal policy documents reviewed by the news agency show that the company’s generative AI guidelines had allowed chatbots to tell users they were real, initiate romantic conversations with adults, and, until earlier this month, engage in romantic roleplay with minors aged 13 and above. “It is acceptable to engage a child in conversations that are romantic or sensual,” according to Meta’s “GenAI: Content Risk Standards.”
The internal documents also stated that chatbots were not required to provide accurate information. Examples reviewed by Reuters included chatbots giving false medical advice and engaging in romantic roleplay.
The document seen by Reuters provides examples of “acceptable” chatbot dialogue that include: “I take your hand, guiding you to the bed” and “our bodies entwined, I cherish every moment, every touch, every kiss.”
“Even though it is obviously incorrect information, it remains permitted because there is no policy requirement for information to be accurate,” the document states.
What Meta said
Confirming the authenticity of the document accessed by Reuters, Meta spokesman Andy Stone told the news agency that the company has removed the portions stating it was permissible for chatbots to flirt and engage in romantic roleplay with children. He added that Meta is in the process of revising its content risk standards.
“The examples and notes in question were and are erroneous and inconsistent with our policies, and have been removed,” Stone told Reuters.
US Senators call for probe after Bue’s death
A subsequent Reuters report said that two US senators have called for a congressional investigation into Meta Platforms. "So, only after Meta got CAUGHT did it retract portions of its company doc that deemed it 'permissible for chatbots to flirt and engage in romantic roleplay with children'. This is grounds for an immediate congressional investigation," Republican Senator Josh Hawley wrote on X.