• Kairos@lemmy.today · 3 days ago

    Not really. Something that can’t intentionally do anything can’t really lie.

      • The_Decryptor@aussie.zone · 3 days ago

        To me, lying implies an intent to deceive. LLMs can’t do that, as they have no intentions and no understanding of the output they produce.

        It’s not lying, but it’s not telling the truth either; it’s just statistically weighted noise.