Earlier this week I discussed an example of ChatGPT giving 'placeholder' answers in lieu of real ones. Below is an example of what that looks like. I could swear this didn't used to happen, but it basically just 'doesn't' answer your question. I'm interested in how often other people see this behavior.