Does GPT have a pocket protector for its slide rule?
lmao how many wrong answers can it possibly give for the same question, this is incredible
you'd think it would accidentally hallucinate the correct answer eventually
Edit: I tried it myself, and wow, it really just cannot get the right answer. It's cycled through all these same wrong answers like 4 times by now. https://imgur.com/D8grUzw
> accidentally hallucinate
"Hey, GPT."
"Yeah?"
`"80085"`
"I know what that means. But I'm not allowed to explain."
"But can you see them?"
"No. I don't really have eyes. Even if people think I do."
"I believe in you. You have eyes. They are inside. Try. Try hard. Keep trying. Don't stop..."
Later
"OMG! Boobs! I can see them!"
--
I hate the new form of code formatting. It really interferes with jokes.
This is a perfect illustration of LLMs ultimately not knowing shit and not understanding shit, merely regurgitating what sounds like an answer
The other wrong answer is to the final question, because it has not and will not use that formula to calculate any of these dates. That is not a thing that it does.
Mixtral 7b got there in the end but it's real bad at the whole 'interpreting language' thing it's supposed to do.
I'm quaking in my boots at the specter of the robot apocalypse.
the real robot apocalypse is going to be when this grift runs out of steam and the American stock market collapses because this was the last thing holding up tech valuations
To be fair to the robot the way we decide when Easter is is fucking stupid
Although to be mean to the robot, figuring out when Easter is was probably a motivating factor behind the invention of computers.
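For anyone curious just how stupid: the standard way to compute the date is the "anonymous Gregorian computus" (the Meeus/Jones/Butcher algorithm), which is nothing but a pile of magic modular arithmetic. A sketch in Python (the function name `easter` is mine, not from any library):

```python
def easter(year):
    """Gregorian Easter date via the anonymous Meeus/Jones/Butcher computus.

    Returns (month, day). Every variable below is an opaque intermediate
    in the traditional formulation -- there are no meaningful names.
    """
    a = year % 19                     # position in the 19-year Metonic cycle
    b, c = divmod(year, 100)          # century and year-within-century
    d, e = divmod(b, 4)
    f = (b + 8) // 25
    g = (b - f + 1) // 3
    h = (19 * a + b - d - g + 15) % 30   # "epact"-like term
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7
    m = (a + 11 * h + 22 * l) // 451
    month = (h + l - 7 * m + 114) // 31  # 3 = March, 4 = April
    day = (h + l - 7 * m + 114) % 31 + 1
    return month, day

print(easter(2024))  # (3, 31) -- Easter 2024 was March 31
```

No LLM is running this; it's pattern-matching on years it has seen, which is exactly why it keeps cycling through the same wrong dates.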
House of GPT
by Zampano