OpenAI has just released the GPT-4.1 family (GPT-4.1, GPT-4.1 Mini, and GPT-4.1 Nano), featuring significant improvements in coding, instruction following, and long-context comprehension. These models support context windows of up to 1 million tokens and outperform GPT-4o across a range of benchmarks, all while being more cost-effective.
Could Merlin integrate these new models? Their enhanced capabilities would be invaluable for complex tasks and workflows, and it would be fantastic to have access to them directly within the platform!
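For reference, the models are already exposed through OpenAI's public API under the identifiers `gpt-4.1`, `gpt-4.1-mini`, and `gpt-4.1-nano`. Here is a minimal sketch of a call using the official `openai` Python SDK (the prompt and setup are placeholders, not a suggestion of how Merlin would wire the integration):

```python
# Minimal sketch: calling GPT-4.1 via OpenAI's Chat Completions API.
# Assumes the official openai Python SDK is installed and OPENAI_API_KEY
# is set in the environment.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY automatically

# The new family is addressed by these model names:
# "gpt-4.1", "gpt-4.1-mini", "gpt-4.1-nano"
response = client.chat.completions.create(
    model="gpt-4.1",
    messages=[
        {"role": "user", "content": "Summarize the key changes in this release in three bullets."},
    ],
)

print(response.choices[0].message.content)
```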