BYOK for models apart from OpenAI?
closed
endu
DL
endu Hi, sorry, can I ask whether this is closed because the integration with other models is complete?
Philip Ulrich
endu same question.
Philip Ulrich
The best way to do this would be to integrate something like OpenRouter so you don't have to support a bunch of individual services. Definitely worth adding. I'd also like to see BYOK support added to the web UI.
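To illustrate why one aggregator beats many individual integrations: OpenRouter exposes an OpenAI-compatible chat-completions API, so a single code path can route a user-supplied key to many providers just by changing the model string. A minimal sketch, assuming the standard OpenRouter base URL; the model IDs and helper name are illustrative:

```python
# Build a chat-completions request for OpenRouter's OpenAI-compatible API.
# With BYOK, the user supplies the key; only the base URL and model string
# differ from a direct OpenAI call.

OPENROUTER_BASE = "https://openrouter.ai/api/v1"

def build_chat_request(api_key: str, model: str, prompt: str) -> dict:
    """Return the URL, headers, and JSON body for one chat completion."""
    return {
        "url": f"{OPENROUTER_BASE}/chat/completions",
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "json": {
            # One integration, many providers: the model string selects
            # the backend, e.g. "google/gemini-flash-1.5" or
            # "anthropic/claude-3.5-sonnet".
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    }

req = build_chat_request("sk-or-<your-key>", "anthropic/claude-3.5-sonnet", "Hello")
# Send with e.g. requests.post(req["url"], headers=req["headers"], json=req["json"])
```

Because the request shape matches OpenAI's, existing OpenAI client code can usually be pointed at OpenRouter by swapping the base URL and key.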
endu
Merged in a post:
OpenRouter Integration
Balu Farook
Please add OpenRouter integration to Merlin. It currently only supports OpenAI, which is useless for me since I need BYOK for other LLMs like Google Gemini. Please consider it.
endu
Merged in a post:
BYOK (Bring Your Own Key) when Monthly Query Runs Out
DL
I am trying to be a heavy user of Merlin by using it for everything daily. I also want to use it more for code interpretation, since it has multi-LLM support and an integrated code interpreter. However, code interpretation eats up queries really fast.
To keep the cost realistic, could you kindly allow BYOK when our monthly credits run out? Since you mentioned you are not profiting from the credits, this would be a win/win for you and the users.
Thank you!
endu
Merged in a post:
Ollama Key Integration
Bob Gatchel
Some of us are using local Ollama and would love to have it covered by the Bring Your Own Key option as well!
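For context on how this could work: a local Ollama server also speaks an OpenAI-compatible API on its default port, so "BYOK" here is really "bring your own endpoint," since no key is required. A minimal sketch, assuming Ollama's default port and a locally pulled model; the helper name is illustrative:

```python
# Local Ollama exposes an OpenAI-compatible endpoint at localhost:11434.
# No API key is needed; the request only targets a different base URL.

OLLAMA_BASE = "http://localhost:11434/v1"  # Ollama's default port

def build_local_request(model: str, prompt: str) -> dict:
    """Return the URL and JSON body for one local chat completion."""
    return {
        "url": f"{OLLAMA_BASE}/chat/completions",
        "json": {
            # A model pulled locally, e.g. via `ollama pull llama3`.
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    }

req = build_local_request("llama3", "Hello")
# Send with e.g. requests.post(req["url"], json=req["json"])
```

The same client code used for a hosted provider can therefore target a local model by swapping the base URL and dropping the Authorization header.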
endu
Hi Bob Gatchel, we're currently considering offering this as part of a Pro feature, where you would pay a flat fee for the Merlin-built features. I'd love to hear your thoughts on that.
endu
under review
endu
in progress
Banumurti Danardono
When we do chat with a PDF, the output is cut off. I think it is caused by the maximum output capacity of the LLM.
If we could use OpenRouter, we could select an LLM with a long output limit, e.g., Sonnet 3.5 beta, which has an 8K output limit.
C Y
OpenRouter support would be amazing!!!