Interesting. Starting a new chat can be a useful workaround when ChatGPT becomes fixated on a particular idea and fails to shift focus, even when instructed to do so. In that context, wouldn't retaining all conversations in memory make the issue worse?
Yup. It seems like they're chasing diminishing returns, attempting to elicit smarter responses by feeding in more context. But much of the time I don't want a contextual tool; I want a more or less deterministic one I can iterate with.
Chats frequently turn into a tree for me. It offers N options and I explore one of them down the line until it stops working. Getting it back to an earlier branch is usually impossible, so starting a new chat is often the only way to approach the other options.
In learning and intelligence, forgetting is often just as important as remembering. This idea of remembering everything is just another piece of spaghetti tossed at the wall.