Context window problem
Hello hello,
I'm using a Gemini 3.1 pro model to write some stories, and I've run into a problem. When I get to a certain number of messages, the whole memory of the chat just collapses. And I'm not talking about "oh, it forgot the first message" — no, I'm talking about it forgetting the full chat history. I'm using a project, because that way I'm sure it won't forget the character info, but the custom instructions would be too long if I also put a summary of the chat there (my character descriptions are pretty detailed). Here are some more details, because maybe I'm not understanding how this works. The custom instructions are 9,503 tokens long. The chat history after which it totally forgets where we were is 95,309 tokens. I know that's a lot, and as I said, I'm okay with losing some of the first messages from memory, but not the entire chat.
Juan R. Arroyo Yap
Hello, the custom instructions are not meant to hold such a long set of instructions. What you need to do is write a core set of instructions and then put the details in a text content file.
In the custom instructions, set a rule that directs the LLM to read the file named [insert your file name] to get the extra details.
That's the way I use it and it works perfectly.
As for the LLM forgetting context, that's normal for all LLMs: after 10 to 20 exchanges in the same chat, the memory starts to fail. The project should help you maintain context longer, but you will still need to use multiple chats for a given topic once the memory starts failing.
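If you script against the API instead of using the chat UI, you can apply the same idea programmatically: pin the character file to every request and trim the oldest turns to fit a token budget. Here's a minimal sketch of that sliding-window approach — the ~4-characters-per-token estimate and all the names (`trim_history`, `estimate_tokens`, the budget value) are my own illustration, not anything from the Gemini docs; a real setup would count tokens with the model's own tokenizer.

```python
# Sketch: keep the pinned character/system info in every request while
# trimming the oldest chat turns to fit a token budget.
# Token counts here are rough estimates (~4 characters per token).

def estimate_tokens(text: str) -> int:
    """Very rough token estimate: about 4 characters per token."""
    return max(1, len(text) // 4)

def trim_history(system_prompt: str, messages: list[str], budget: int) -> list[str]:
    """Return the most recent messages that fit within `budget` tokens
    after reserving room for the always-included system prompt."""
    remaining = budget - estimate_tokens(system_prompt)
    kept = []
    for msg in reversed(messages):      # walk newest-first
        cost = estimate_tokens(msg)
        if cost > remaining:
            break                        # oldest turns fall off here
        kept.append(msg)
        remaining -= cost
    return list(reversed(kept))          # restore chronological order

history = ["msg one " * 10, "msg two " * 10, "msg three " * 10]
window = trim_history("You are a storyteller.", history, budget=50)
# Only the newest turns that fit the budget survive; "msg one" is dropped.
```

This mirrors what the chat UI does implicitly — the difference is that here you control exactly what gets dropped, so the character file is never the thing that falls out of the window.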