OpenAI’s "Garlic" Outshines Gemini 3 as Mistral 3 Unleashes 675B-Parameter Open Models
New AI Models & Releases
OpenAI’s Upcoming Model ("Garlic") Outperforms Gemini 3 and Opus 4.5
OpenAI is set to release a new AI model, codenamed "Garlic," as soon as next week, and internal benchmarks reportedly show it outperforming Google’s Gemini 3 and Anthropic’s Opus 4.5. Expected to ship publicly as GPT-5.2 or GPT-5.5 early next year, the model underscores OpenAI’s push to maintain its leadership in AI.
- Altman memo: new OpenAI model coming next week, outperforming Gemini 3
- OpenAI's new model is codenamed "Garlic". Internal benchmarks show it beating Gemini 3 and Opus 4.5.
Mistral AI Launches Mistral 3: Open-Weight Models from 3B to 675B Parameters
Mistral AI released Mistral 3, a full family of open-weight models ranging from 3B to 675B parameters and topped by Mistral Large 3, a sparse Mixture-of-Experts (MoE) model. The family covers multimodal tasks, on-device reasoning, and enterprise-scale applications, and is released under the Apache 2.0 license (see the loading sketch after the headlines below).
- Mistral just released Mistral 3 — a full open-weight model family from 3B all the way up to 675B parameters
- Introducing Mistral 3
- Mistral 3 Release: New Open-Source Multimodal AI Models from Mistral AI
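For readers who want to try the open weights, below is a minimal loading sketch using Hugging Face transformers. The repository ID is a placeholder (Mistral’s published model-card names may differ), and the precision and device settings are illustrative rather than a recommended configuration.

```python
# Minimal sketch: loading an open-weight Mistral 3 checkpoint with Hugging Face
# transformers. The repo id below is a placeholder -- substitute the model card
# name actually published by Mistral AI.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-3-Small-Instruct"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision to fit on smaller GPUs
    device_map="auto",           # spread layers across available devices (needs accelerate)
)

messages = [{"role": "user", "content": "Summarize the Apache 2.0 license in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```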
Mistral AI Releases Ministral-3: Pruned, Lightweight Variants of Mistral Small 3.1
Mistral AI introduced Ministral-3, a set of smaller models pruned from Mistral Small 3.1 and available in Base, Instruct, and Reasoning variants (3B–14B parameters). The models are optimized for efficiency and support local fine-tuning (a brief fine-tuning sketch follows the headlines below), with technical documentation expected by August 2027 in line with EU guidelines.
- Ministral-3 has been released
- Ministral 3 models were pruned from Mistral Small 3.1
- You can now Run & Fine-tune Ministral 3 locally!
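Below is a minimal local fine-tuning sketch using LoRA adapters via the peft library, which trains a small set of adapter weights instead of the full model. The repository ID, dataset file, and hyperparameters are placeholders for illustration, not Mistral’s official recipe.

```python
# Minimal LoRA fine-tuning sketch for a Ministral 3 sized model. Repo id,
# dataset, and hyperparameters are placeholders (assumptions).
import torch
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

model_id = "mistralai/Ministral-3-8B-Instruct"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

# Attach small trainable LoRA adapters rather than updating all base weights.
lora = LoraConfig(r=16, lora_alpha=32, target_modules=["q_proj", "v_proj"],
                  task_type="CAUSAL_LM")
model = get_peft_model(model, lora)

# Expect a local train.jsonl with {"text": "..."} records.
dataset = load_dataset("json", data_files="train.jsonl", split="train")
dataset = dataset.map(
    lambda ex: tokenizer(ex["text"], truncation=True, max_length=1024),
    remove_columns=dataset.column_names,
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="ministral3-lora",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=8,
        num_train_epochs=1,
        bf16=True,
        logging_steps=10,
    ),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained("ministral3-lora")  # writes only the adapter weights
```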
DeepSeek V3.2 Speciale Excels in Math Benchmarks at 1/15th the Cost of GPT-5.1 High
DeepSeek V3.2 Speciale outperforms GPT-5.1 High in math benchmarks while being ~15× cheaper, highlighting cost-efficient alternatives in the AI model landscape.
AI Infrastructure & Tools
Mistral Large 3 Now Available on AWS Bedrock
Mistral Large 3, a 675B-parameter sparse MoE model (39B active), is now deployable on AWS Bedrock, targeting enterprise use cases such as retrieval-augmented generation (RAG) systems, scientific workloads, and production assistants.
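Below is a minimal sketch of calling the model through the Bedrock Converse API with boto3; the modelId string is a placeholder and should be replaced with the identifier listed in the Bedrock console for your region.

```python
# Minimal sketch: invoking Mistral Large 3 via the Amazon Bedrock Converse API.
# The modelId is a placeholder (assumption) -- look up the exact identifier in
# the Bedrock console or with `aws bedrock list-foundation-models`.
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.converse(
    modelId="mistral.mistral-large-3-v1:0",  # placeholder model identifier
    messages=[{"role": "user", "content": [{"text": "Give a one-line summary of RAG."}]}],
    inferenceConfig={"maxTokens": 256, "temperature": 0.2},
)

print(response["output"]["message"]["content"][0]["text"])
```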
Bifrost: Open-Source LLM Gateway with Adaptive Load Balancing
Bifrost is a new open-source LLM gateway that dynamically routes requests across providers based on real-time conditions, improving stability without manual configuration.
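Below is a minimal client-side sketch, assuming the gateway exposes an OpenAI-compatible endpoint (common for LLM gateways, but confirm the exact URL, port, and model naming in Bifrost’s documentation); provider credentials and routing policy live in the gateway configuration, not the client.

```python
# Minimal sketch of sending traffic through an LLM gateway, assuming it exposes
# an OpenAI-compatible endpoint (verify against Bifrost's docs). The gateway,
# not the client, decides which upstream provider serves each request.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # placeholder gateway address
    api_key="gateway-key-or-unused",      # real provider keys are configured in the gateway
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder; the gateway maps and falls back across providers
    messages=[{"role": "user", "content": "Health check: reply with 'ok'."}],
)
print(response.choices[0].message.content)
```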