"This isn’t just a problem for Stack Overflow. In pretty much every other example where you see ChatGPT screwing up basic facts, it does so with absolute self-assurance. It does not admit a smidgen of doubt about what it’s saying. Whatever question you ask, it’ll merrily Dunning-Kruger its way along, pouring out a stream of text.
"It is, in other words, bullshitting."
Effortless bullshit. At scale. What could possibly go wrong?