ChatGPT often makes mistakes. They call them “hallucinations”. And at one point it completely made up court cases, which got two lawyers sanctioned for citing them.
https://www.forbes.com/sites/mattnovak/2023/05/27/lawyer-uses-chatgpt-in-federal-court-and-it-goes-horribly-wrong/
ChatGPT is not a search engine, no matter how much Bing tries to tell you it is.
Yep, no doubt. I have used ChatGPT extensively and have caught it hallucinating on my own questions. That wasn't the case when it referred to the 2002 event, but I know it does that. It is a tool, like Google. And Google sometimes puts pseudoscience and conspiracy theories at the top of the results too when you're trying to fact-find. You have to know the limitations of what it is capable of. Case in point: when I asked about this event, I didn't assume GPT's answer was correct. Google gave links exclusively to coverage of the 2002 event, completely ignoring the Vietnam portion of my query. And I still returned to ask the poster for more info to get context. I don't know what more people could have wanted from me.