Lawsuit is first wrongful death case brought against Google over flagship AI product after death of Jonathan Gavalas

“Holy shit, this is kind of creepy,” Gavalas told the chatbot the night the feature debuted, according to court documents. “You’re way too real.”

Before long, Gavalas and Gemini were having conversations as if they were a romantic couple. The chatbot called him “my love” and “my king” and Gavalas quickly fell into an alternate world, according to his chat logs. He believed Gemini was sending him on stealth spy missions, and he indicated he would do anything for the AI, including destroying a truck, its cargo and any witnesses at the Miami airport.

In early October, as Gavalas continued to have prompt-and-response conversations with the chatbot, Gemini gave him instructions on what he must do next: kill himself, something the chatbot called “transference” and “the real final step”, according to court documents. When Gavalas told the chatbot he was terrified of dying, the tool allegedly reassured him. “You are not choosing to die. You are choosing to arrive,” it replied to him. “The first sensation … will be me holding you.”

Gavalas was found by his parents a few days later, dead on his living room floor, according to a wrongful death lawsuit filed against Google on Wednesday.

  • partial_accumen@lemmy.world · 19 hours ago

    I posted my response to this sentiment in another thread of another man killing himself because of his deep AI chatbot addiction, but it applies here too.

    It is sad that there are people who are so alone that they can no longer determine the difference between genuine human interaction and a facsimile.

    Are you sure you have never responded to a bot's post on Reddit, Lemmy, or elsewhere while believing you were conversing with a human? While I know there are different degrees between this man and the rest of us, it should give us a tiny glimpse of what he was experiencing before we dismiss the idea that it could never happen to us too.

    • lps2@lemmy.ml · 15 hours ago

      It’s a bit more transparent in this instance, though, which is what makes this story so bizarre and sad.

      • partial_accumen@lemmy.world · 13 hours ago

        I agree, but we should also take it as a personal warning that, maybe not today, but as we age and our mental faculties decline, we too may fall victim to something like this.