• 1 Post
  • 75 Comments
Joined 1 year ago
Cake day: January 20th, 2023

  • I totally agree that both seem to imply intent, but IMHO hallucinating implies not only more agency than an LLM has, but also less culpability. Like, “Aw, it’s sick and hallucinating, otherwise it would tell us the truth.”

    Whereas calling it a bullshit machine still implies more intentionality than an LLM is capable of, but at least skews the perception of that intention more in the direction of “It’s making stuff up” which seems closer to the mechanisms behind an LLM to me.

    I also love that the researchers actually took the time to not only provide the technical definition of bullshit, but also sub-categorized it too, lol.

  • heavyboots@lemmy.ml to linuxmemes@lemmy.world · Talking to normies about privacy:
    4 months ago

    Everything you’ve said, aside from the CSAM-scan doctor thing, has absolutely nothing to back it up so far. (And for the record, I absolutely agree CSAM scanners can be wrong; a human needs to be involved at some level, which they were in the system Apple devised.) At any rate, I guess this convo is over, as we obviously inhabit very different worlds.