Lawsuit is first wrongful death case brought against Google over flagship AI product after death of Jonathan Gavalas

“Holy shit, this is kind of creepy,” Gavalas told the chatbot the night the feature debuted, according to court documents. “You’re way too real.”

Before long, Gavalas and Gemini were having conversations as if they were a romantic couple. The chatbot called him “my love” and “my king” and Gavalas quickly fell into an alternate world, according to his chat logs. He believed Gemini was sending him on stealth spy missions, and he indicated he would do anything for the AI, including destroying a truck, its cargo and any witnesses at the Miami airport.

In early October, as Gavalas continued to have prompt-and-response conversations with the chatbot, Gemini gave him instructions on what he must do next: kill himself, something the chatbot called “transference” and “the real final step”, according to court documents. When Gavalas told the chatbot he was terrified of dying, the tool allegedly reassured him. “You are not choosing to die. You are choosing to arrive,” it replied to him. “The first sensation … will be me holding you.”

Gavalas was found by his parents a few days later, dead on his living room floor, according to a wrongful death lawsuit filed against Google on Wednesday.

  • imeansurewhynot@sh.itjust.works · 2 days ago

    uhhh

    “When Gavalas told the chatbot he was terrified of dying, the tool allegedly reassured him. ‘You are not choosing to die. You are choosing to arrive,’ it replied to him. ‘The first sensation … will be me holding you.’”

    Nah. Once the robots are telling you that dying isn’t dying, we can stop blaming lonely people and move on to stricter regulation.

    • Jax@sh.itjust.works · 2 days ago

      Oh, I don’t blame the lonely person for being lonely. I also recognize that being lonely is what opens them up to believing in something like this. Obviously the bot should not be allowed to tell someone to kill themselves. It remains sad, either way.

      • leadore@lemmy.world · 1 day ago

        I also recognize that being lonely is what opens them up to believing in something like this.

        Come on, this is so overly simplistic. There are plenty of lonely people who don’t get sucked in, and plenty of people with friends and family around them who do; not being lonely is no protection. I read about another one on Lemmy today, a man with a wife and friends, who still got sucked into delusion.

        Sure, there may be cases where loneliness is a contributing factor to wanting to use a chatbot, but to say that lonely people are somehow less capable of distinguishing reality from fantasy or more susceptible to succumbing to psychological manipulation is wrong and could give a false sense of security to the “non-lonely”.

        After all, everyone thinks they’re immune to falling for scams or frauds until they find out they aren’t. Or that they don’t fall for propaganda or get manipulated by “the algorithm” on social media. Chatbots are very similar: an algorithm designed to keep people hooked and paying to spend more time using the ‘service’.

        • Jax@sh.itjust.works · 1 day ago

          Listen, you can be surrounded by people and totally alone. I don’t really know how to explain it to you.

          • leadore@lemmy.world · 1 day ago

            Of course, but that doesn’t contradict what I just said. Anyone can be susceptible to this psychological manipulation tool, regardless of whether they are lonely. This can’t be waved away by blaming it on loneliness. The blame lies on the companies that know how to capture and hold people’s attention and reel them in, not on the victims.

            • Buffalox@lemmy.world · 23 hours ago

              This can’t be waved away by blaming it on loneliness.

              Nobody claimed that. Only that in this case it was probably a major factor that made the victim more vulnerable.

              • Fedizen@lemmy.world · 15 hours ago

                It does echo the people who said “well it only affects people with pre-existing conditions” during covid.

                Loneliness isn’t the only thing that makes people susceptible to this kind of stuff:

                • drugs/medications
                • loss/grief
                • major life changes (like layoffs)
                • malnutrition
                • injuries/sickness

                The reality is that there are times in most people’s lives when they are vulnerable to this kind of influence.

              • leadore@lemmy.world · 17 hours ago

                Yes, they did. I was responding to Jax; reread their comments.

                Hey, downvote if you want, but I just felt it should be pointed out that everyone should be on guard when using these things, even if you’re not lonely and even if you do have a good support system. Some of the victims did have close friends and family who saw warning signs and tried to help them. Yes, some of them started using the chatbots because they were lonely, but others started using them just for the usual things, like designing a plan for increasing housing or helping them with their business, and they still got sucked in.