I think it’s a bit more than that. A known failure mode of LLMs is that in a long enough conversation about a topic, the guardrails against that topic eventually lose out to the overarching directive to be a sycophant. This kinda smells like that.

We don’t have much information here, but it’s possible the LLM had already been worn down to the point of giving passively encouraging answers. My takeaway, once more, is that LLMs as used today are unreliable, badly engineered, and not actually ready for market.
The one in the middle is a floater. The ones in the top right and bottom left are psychedelic fractals that are very much not floating around in people’s eyeballs.