• 8 Posts
  • 120 Comments
Joined 1 year ago
Cake day: June 13th, 2023

  • Limeey@lemmy.world to Science Memes@mander.xyz · “Huh”
    79 up · 4 down · 8 months ago

    It all comes down to the fact that LLMs are not AGI: they have no clue what they’re saying, why, or to whom. They have no real concept of “context,” and as a result no way to “know” whether they’re giving correct info or just hallucinating.
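
    The point about hallucination can be sketched in a toy way: generation is just repeated sampling from next-token probabilities, and nothing in the loop checks truth. The probability table below is invented for illustration — a real LLM learns billions of such statistics from text, but the structure of the loop is the same.

    ```python
    import random

    # Hypothetical toy "model": context -> next-token probabilities.
    # These numbers are made up; they stand in for learned text statistics.
    TOY_MODEL = {
        ("the", "capital", "of", "france", "is"): {
            "paris": 0.7,   # likely because it co-occurs often in training text
            "lyon": 0.2,    # plausible-looking but wrong
            "mars": 0.1,    # a "hallucination" is just a low-probability draw
        },
    }

    def next_token(context):
        """Sample one next token. Note what is absent: nothing here
        verifies whether a candidate token is true, only how probable it is."""
        probs = TOY_MODEL.get(tuple(context), {"<unk>": 1.0})
        return random.choices(list(probs), weights=list(probs.values()), k=1)[0]

    context = ["the", "capital", "of", "france", "is"]
    print(next_token(context))  # usually "paris", sometimes "lyon" or "mars"
    ```

    The wrong answers come out of the exact same mechanism as the right one — only the sampled probability differs, which is the sense in which the model can’t “know” it is hallucinating.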