I dunno if it’s because I have a warped thought process, or because I have a background in Psychology, or because I’m wrong. But this always felt to me like the natural progression.
Assuming that a deeper-thinking, broader-contexted being with more information would be more accurate is actually counter-intuitive to me.
Your last line made me think of telescopes: bigger mirrors bring in more light, but they’re harder to keep in focus due to thermal distortion.
Same with ChatGPT. The more it knows about you, the richer the connections it can make — but with that comes more interpretive noise. If you're doing scientific or factual work, stateless queries are best, so turn off memory. But for meaning-of-life questions or personal growth? I'll take the distortion. It's still useful, and often surprisingly accurate.