2 min read

OpenAI’s "Garlic" Outshines Gemini 3 as Mistral 3 Unleashes 675B-Parameter Open Models

New AI Models & Releases

OpenAI’s Upcoming Model ("Garlic") Outperforms Gemini 3 and Opus 4.5
OpenAI is set to release a new AI model codenamed "Garlic" next week; internal benchmarks show it outperforming Google’s Gemini 3 and Anthropic’s Opus 4.5. Expected to launch as GPT-5.2 or GPT-5.5 early next year, the model underscores OpenAI’s push to maintain its leadership in AI.


Mistral AI Launches Mistral 3: Open-Weight Models from 3B to 675B Parameters
Mistral AI released Mistral 3, a family of open-weight models ranging from 3B to 675B parameters and topped by Mistral Large 3, which uses a sparse Mixture-of-Experts (MoE) architecture. Released under the Apache 2.0 license, the family targets multimodal tasks, on-device reasoning, and enterprise-scale applications.
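For teams that want to try one of the smaller open-weight checkpoints locally, a minimal inference sketch with Hugging Face transformers might look like the following. The repository name used here is a placeholder, not a confirmed Hub identifier, and should be swapped for whichever Mistral 3 checkpoint is actually published.

```python
# Minimal local-inference sketch for an open-weight Mistral 3 checkpoint.
# The model ID below is a placeholder; substitute the actual Hub repository name.
from transformers import pipeline

MODEL_ID = "mistralai/Mistral-3-8B-Instruct"  # placeholder, not a confirmed repo name

generator = pipeline(
    "text-generation",
    model=MODEL_ID,
    device_map="auto",   # spread layers across available GPUs/CPU
    torch_dtype="auto",  # use the dtype stored in the checkpoint
)

messages = [{"role": "user", "content": "Summarize the Mistral 3 release in one sentence."}]
result = generator(messages, max_new_tokens=128)
print(result[0]["generated_text"][-1]["content"])  # last message is the model's reply
```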


Mistral AI Releases Ministral-3: Pruned, Lightweight Variants of Mistral Small 3.1
Mistral AI introduced Ministral-3, smaller models pruned from Mistral Small 3.1, available in Base, Instruct, and Reasoning variants (3B–14B parameters). These models support local fine-tuning and are optimized for efficiency, with technical docs expected by August 2027 per EU guidelines.
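As a rough illustration of what local fine-tuning of one of these small variants could look like, here is a hedged LoRA sketch using transformers, peft, and trl. The model ID is a placeholder, the dataset is only a tiny sample chosen for illustration, and the hyperparameters are not tuned recommendations.

```python
# Hedged sketch: LoRA fine-tuning a small Ministral-3 variant locally.
# Model name is a placeholder; dataset and hyperparameters are illustrative only.
from datasets import load_dataset
from peft import LoraConfig
from trl import SFTConfig, SFTTrainer

MODEL_ID = "mistralai/Ministral-3-3B-Instruct"  # placeholder repo name
dataset = load_dataset("trl-lib/Capybara", split="train[:1%]")  # small slice for a quick run

peft_config = LoraConfig(
    r=16, lora_alpha=32, lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # attention projections only
    task_type="CAUSAL_LM",
)

trainer = SFTTrainer(
    model=MODEL_ID,
    train_dataset=dataset,
    peft_config=peft_config,
    args=SFTConfig(
        output_dir="ministral3-lora",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=8,
        num_train_epochs=1,
    ),
)
trainer.train()
trainer.save_model("ministral3-lora")  # saves only the LoRA adapter weights
```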


DeepSeek V3.2 Speciale Excels in Math Benchmarks at 1/15th the Cost of GPT-5.1 High
DeepSeek V3.2 Speciale outperforms GPT-5.1 High in math benchmarks while being ~15× cheaper, highlighting cost-efficient alternatives in the AI model landscape.


AI Infrastructure & Tools

Mistral Large 3 Now Available on AWS Bedrock
Mistral Large 3, a 675B-parameter MoE model (39B active), is now deployable on AWS Bedrock, targeting enterprise use cases like RAG systems, scientific workloads, and production assistants.
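A minimal invocation sketch using boto3’s Bedrock Runtime Converse API is shown below; the modelId string is an assumption and should be taken from the Bedrock model catalog entry for Mistral Large 3.

```python
# Hedged sketch: calling a Mistral Large 3 deployment through Amazon Bedrock.
# The modelId is a placeholder; use the exact ID listed in the Bedrock catalog.
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.converse(
    modelId="mistral.mistral-large-3-v1:0",  # placeholder model ID
    messages=[
        {"role": "user", "content": [{"text": "Draft a one-paragraph overview of a RAG system."}]}
    ],
    inferenceConfig={"maxTokens": 512, "temperature": 0.2},
)

print(response["output"]["message"]["content"][0]["text"])
```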


Bifrost: Open-Source LLM Gateway with Adaptive Load Balancing
Bifrost is a new open-source LLM gateway that dynamically routes requests across providers based on real-time conditions, improving stability without manual configuration.
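Gateways like this typically sit in front of an OpenAI-compatible endpoint, so client code usually only needs its base URL changed. The port, path, and provider-prefixed model name in the sketch below are assumptions, not confirmed Bifrost defaults.

```python
# Hedged sketch: sending requests through a locally running LLM gateway
# such as Bifrost via an OpenAI-compatible API. Base URL and model naming
# convention are assumptions, not confirmed defaults.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",              # assumed local gateway address
    api_key="placeholder",                            # provider keys assumed to live in the gateway
)

reply = client.chat.completions.create(
    model="openai/gpt-4o",                            # assumed provider/model naming scheme
    messages=[{"role": "user", "content": "Which provider served this request?"}],
)
print(reply.choices[0].message.content)
```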