Allow long contexts
planned
midia
I have the same problem. Claude 3.7 Sonnet forgets messages after four rounds of dialogue, so I can hardly communicate on a continuous basis.
endu
Hi Digital Dexter, thanks for sharing your feedback! Could you please share specific instances where you felt it was forgetting context, the model you were using, and the use case you're trying to solve? This would help us understand and improve. Appreciate your input! 😊
alex m.
endu I was going to ask for this as well. It would be nice to use a Merlin chat as a personal diary/journal and keep adding context for months or years on end. We know models have token limits, but we also know that with a vector database and embeddings you can add "artificial memory" to each Merlin chat. It would be nice if you could add this function, and it would also likely make people stay for the long run, because they know they can store all their conversations well beyond the limits of any LLM. So whenever a new competitor pops up, you'll have the advantage of a database whose memories keep growing.
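For illustration only, here is a minimal sketch of that embeddings-plus-vector-store idea. Everything in it is hypothetical, not Merlin's actual implementation: ChatMemory, embed_text, and the character-histogram stand-in embedding are placeholders so the snippet runs without an API key; in practice you would call a real embedding model and a real vector database.

```python
import numpy as np

def embed_text(text: str) -> np.ndarray:
    """Stand-in embedding (character histogram) so the sketch runs offline.
    In practice, call a real embedding model here and return its vector."""
    vec = np.zeros(256)
    for ch in text.lower():
        vec[ord(ch) % 256] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

class ChatMemory:
    """Stores every message as (text, embedding) and retrieves the most relevant ones."""

    def __init__(self):
        self.texts: list[str] = []
        self.vectors: list[np.ndarray] = []

    def add(self, message: str) -> None:
        # Embed each message once and keep it indefinitely,
        # independent of the model's context window.
        self.texts.append(message)
        self.vectors.append(embed_text(message))

    def recall(self, query: str, k: int = 5) -> list[str]:
        # Rank stored messages by cosine similarity to the new query.
        if not self.texts:
            return []
        q = embed_text(query)
        sims = [
            float(np.dot(q, v) / (np.linalg.norm(q) * np.linalg.norm(v)))
            for v in self.vectors
        ]
        top = np.argsort(sims)[-k:][::-1]
        return [self.texts[i] for i in top]

memory = ChatMemory()
memory.add("My journal entry from January: started learning piano.")
memory.add("February: switched jobs, now working remotely.")
print(memory.recall("What did I say about piano?", k=1))
```

The idea is that before each LLM call, the relevant old messages returned by recall() are prepended to the prompt, so years of conversation stay retrievable without resending the whole history on every turn.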
endu
Hi Alex! Totally agree with you – that’s exactly what we realized after using Merlin extensively. It makes perfect sense, which is why we’re already working on building a personalization memory feature into Merlin. Your idea is absolutely spot-on, and if feasible, it could definitely evolve into a more advanced feature down the line. Thanks for sharing this – it’s awesome to see the same vision! 😊
X
alex m. That would be super expensive token/credit-wise, but some power users would be willing to pay for it, and there are some clever memory experiments going on that will bring the cost down. I suggested one called Charlie Mnemonic a while back that looks promising.
endu
marked this post as
planned