• BigFig@lemmy.world · 1 year ago

      What do you do when ChatGPT just makes shit up or answers a yes-or-no question incorrectly? You’d have no way of knowing it was wrong.

      • gridleaf@lemmy.world · 1 year ago

        ChatGPT is most useful when you may not know the right answer, but you know a wrong answer when you see one. It’s very useful for technical issues. Much quicker for troubleshooting than searching page after page for a solution.

      • IronKrill@lemmy.ca · 1 year ago

        While this is an important thing to understand about AI, it’s an overstated issue once understood. For most questions I ask AI, it doesn’t matter if it’s correct as long as it pulls up some half-useful info to get me on track (e.g. programming). For other questions, I only ask it when I need to figure out where to look next, which it will usually do just fine.

        The first page of my search results is all AI-generated garbage articles anyway; at least I know what I’m getting with GPT and can take it as such.

        • Womble@lemmy.world · 1 year ago

          Yup, as long as you are aware that it could be wrong and look at it critically, LLMs at GPT scale are very useful tools. The best way I’ve heard it described is as having a lightning-fast intern who often gets things wrong but will always give it a go.

          So long as you’re calibrated to “how might this be wrong?” when looking at the results, it is exceptionally useful.

      • Otter@lemmy.ca · 1 year ago

        Not the other commenter:

        I usually have an idea about the thing I’m asking, and if not, I’ll look up the topics it mentions after some guided brainstorming.

        I’ve also found that asking the same question again, after resetting the chat, can give you an idea of what is happening.

    • thorbot@lemmy.world · 1 year ago

      I’m curious what you use it for, because I try to use it daily for IT-related queries and it gets less than half of what I ask correct. I basically have to fact-check almost everything it tells me, which kind of defeats the purpose. It does shine when I need really obscure instructions, though: the other day I asked it how to get into a PERC controller on some old server and Google had nothing helpful, but ChatGPT laid out the instructions to get in there and rebuild a disk perfectly. So while it has some usefulness, I generally can’t trust it fully.

      • Womble@lemmy.world · edited · 1 year ago

        The point you have to remember is that it is trained on bulk data out there in a very inefficient manner: it needs to see thousands of examples before it starts to form any sort of understanding of something. If you ask it “how do I do {common task} in {popular language}?” you will generally get excellent results, but the further you stray from that, the more error-prone it is.

        Still, it is often good at getting you on the right track when you’re unsure where to start, and it’s fantastic for learning a new language. I’ve been using it extensively while learning C#, where I know what I want to code but not exactly how to use existing language features to do it.
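
        For example (a made-up sketch, not something from this thread, just the kind of “use the existing feature” answer I mean): I knew I wanted to count how often each word appears in a list, and the useful part of the reply was pointing me at LINQ’s GroupBy instead of a hand-rolled loop:

        ```csharp
        // Made-up illustration: counting word occurrences with LINQ
        // rather than writing the grouping logic by hand.
        using System;
        using System.Linq;

        class WordCounts
        {
            static void Main()
            {
                var words = new[] { "cat", "dog", "cat", "bird", "dog", "cat" };

                // Group identical words, then count each group.
                var counts = words
                    .GroupBy(w => w)
                    .ToDictionary(g => g.Key, g => g.Count());

                foreach (var pair in counts)
                    Console.WriteLine($"{pair.Key}: {pair.Value}");
            }
        }
        ```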

      • cybersandwich@lemmy.world · edited · 1 year ago

        But generally you can’t (shouldn’t) trust web search results fully either. At the end of the day, the onus is on you as the user to do your due diligence.

        I’ve seen ChatGPT give me wrong information, and sometimes it would be bad to execute the code or command it generated, but I know enough to ask “are you sure that’s correct?”. Hell, you can just challenge it each time, or open a new session and ask it “what does this code do: [insert the code it generated here]”.

        You shouldn’t just paste a command from a Stack Overflow search result into your terminal either. And at least with ChatGPT you can ask it to explain the command or code in detail and it will walk you through what each step does.

        Also, pasting that command from Stack Overflow into ChatGPT and adding your specific context around it is HUGE. That’s why I say they are different products with different use cases, but they work well in concert. They just don’t work well combined the way Bing and Google have been doing it.

        edit: I guess lemmy escapes certain characters and it ate my post.

    • jacktherippah@lemmy.world · 1 year ago

      ChatGPT is not a search engine. It takes random shit from the Internet and stitches it together. In my experience it often gets things wrong. It’s best to always fact-check.

    • IronKrill@lemmy.ca · edited · 1 year ago

      Keyword searches worked fine and pulled up exactly what I wanted for years, I swear to god. Somewhere in the last decade, though, websites have gamed the system, and now I can’t find anything no matter how I word my search. It’s depressing.

    • eran_morad@lemmy.world · 1 year ago

      I prefer that Stack Overflow looks the same as it did way back when. And Stack is usually where I find my answers.

    • Supervisor194@lemmy.world · 1 year ago

      I use ChatGPT every day too. Because Google is being such a shit about YouTube, I am in the process of moving away from Google altogether. I use DuckDuckGo for search, which indirectly uses Bing. It’s mostly OK. Sometimes I’m forced to try Google; it usually doesn’t help. But for programming, yeah, Stack Overflow feels downright regressive now.

      I’m honestly kind of surprised about this news, considering how horrible Google’s results are now.

      • BaroqueInMind@kbin.social · 1 year ago

        I thought ChatGPT can’t search the internet and is using an LLM snapshot from 2021?

        And I thought Bing’s ChatGPT model is allowed to search the internet live?

        Doesn’t that make Bing’s version of ChatGPT superior?

        • Chreutz@lemmy.world · 1 year ago

          GPT-4 on ChatGPT was recently (last week-ish) updated to include data up to April 2023.

        • Supervisor194@lemmy.world · edited · 1 year ago

          I’ve found this to be kind of subjective. Bing/Bard is more current than ChatGPT, and yet I just find ChatGPT to be better. It’s snappier and more conversant with context. It seems to understand you when you chide it for not quite doing what you asked, and it responds in kind. I mostly use it for programming, to be fair, but even for other stuff, ChatGPT just somehow feels more… real? I can’t quite put my finger on it.

          There was a short time when Bing Chat was kind of frighteningly real. Took them five seconds to nerf that shit, and it’s never been anywhere near the same since.

          Edit: I expect this answer to be out of date within 3 months. Things keep moving.