MiniMax-M2 & LLaDA2.0 Redefine Open-Source AI as Apple’s M5 Chip Smashes Speed Records
New AI Models
MiniMax AI Releases MiniMax-M2: A Cost-Effective, High-Performance Open-Source LLM
MiniMax AI has open-sourced MiniMax-M2, a 230B-parameter mixture-of-experts model with 10B active parameters (230B-A10B), optimized for coding and agentic workflows. Priced at roughly 8% of Claude Sonnet's cost while running about twice as fast, it excels at multi-file editing, tool orchestration, and low-latency engineering tasks, with benchmark results that position it as the leading open-source LLM currently available.
- Open-sourcing MiniMax-M2: agent- and code-native, at 8% of Claude Sonnet's price and ~2x faster
- New model from the MiniMax team: MiniMax-M2, a 230B-A10B LLM
- MiniMaxAI/MiniMax-M2 · Hugging Face
- Early impressions: the performance of MiniMax-M2 is genuinely impressive
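For readers who want to poke at the weights, here is a minimal sketch of loading the MiniMaxAI/MiniMax-M2 checkpoint linked above with Hugging Face transformers. It is an illustration rather than a vetted recipe: it assumes the repo works with AutoModelForCausalLM (possibly requiring trust_remote_code) and that enough GPU memory is available for a 230B-parameter MoE; for real workloads a serving stack such as vLLM or SGLang is the more realistic path.

```python
# Minimal sketch: loading MiniMax-M2 from Hugging Face with transformers.
# Assumes the MiniMaxAI/MiniMax-M2 repo is compatible with AutoModelForCausalLM
# (trust_remote_code may be required) and that enough GPU memory is available
# for a 230B-parameter MoE checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MiniMaxAI/MiniMax-M2"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the checkpoint's native precision
    device_map="auto",    # shard across available GPUs
    trust_remote_code=True,
)

# Agent/coding-style prompt via the chat template.
messages = [{"role": "user", "content": "Refactor this function to remove the global state: ..."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```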
InclusionAI Launches LLaDA2.0-flash-preview: A 100B-Parameter Text Diffusion Model
InclusionAI released LLaDA2.0-flash-preview, a mixture-of-experts (MoE) text diffusion model with 100B total parameters (6B active). Designed for long-context understanding, it is one of the largest open-source text diffusion models available.
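For intuition on how diffusion LLMs differ from autoregressive ones, the sketch below walks through a generic masked-diffusion decoding loop: the answer region starts fully masked and is filled in over a fixed number of denoising steps, unmasking the highest-confidence positions at each step. This illustrates the general LLaDA-style recipe only; the model callable, mask id, and shapes are hypothetical stand-ins, not LLaDA2.0-flash-preview's actual API.

```python
# Illustrative sketch of masked text-diffusion decoding (the general LLaDA-style
# recipe), NOT LLaDA2.0-flash-preview's actual API. `model` is a hypothetical
# callable returning per-position token probabilities; MASK_ID is a stand-in.
import torch

MASK_ID = 0      # hypothetical mask-token id
SEQ_LEN = 32     # length of the block to generate
NUM_STEPS = 8    # number of denoising iterations

def diffusion_decode(model, prompt_ids: torch.Tensor) -> torch.Tensor:
    # Start with the answer region fully masked.
    answer = torch.full((SEQ_LEN,), MASK_ID, dtype=torch.long)
    tokens_per_step = SEQ_LEN // NUM_STEPS

    for _ in range(NUM_STEPS):
        seq = torch.cat([prompt_ids, answer])
        probs = model(seq)[-SEQ_LEN:]      # (SEQ_LEN, vocab) predictions for the answer
        conf, pred = probs.max(dim=-1)     # best token and its confidence per position

        still_masked = answer == MASK_ID
        conf[~still_masked] = -1.0         # only consider positions not yet filled

        # Unmask the highest-confidence positions this step.
        k = min(tokens_per_step, int(still_masked.sum()))
        if k == 0:
            break
        idx = conf.topk(k).indices
        answer[idx] = pred[idx]

    return answer
```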
AI Hardware & Optimization
llama.cpp Benchmarks Show M5 Neural Accelerator Delivers ~2.4x Faster Prompt Processing
New llama.cpp benchmarks of Apple’s M5 Neural Accelerator show a ~2.4x speedup in prompt processing, partially validating Apple’s claim of 6x faster time-to-first-token. The results point to significant gains for local LLM inference.
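For a quick sanity check of prompt-processing (prefill) throughput on your own hardware, the sketch below times a long prompt with a single generated token via the llama-cpp-python bindings. This is an assumption-laden stand-in, not the setup behind the reported M5 numbers (those came from llama.cpp's own benchmarking); the model path and prompt length are placeholders.

```python
# Rough sketch for measuring prompt-processing (prefill) throughput locally,
# using the llama-cpp-python bindings rather than llama.cpp's own llama-bench.
# The GGUF path and prompt length are placeholders.
import time
from llama_cpp import Llama

llm = Llama(
    model_path="model.gguf",   # placeholder: any local GGUF checkpoint
    n_ctx=4096,
    n_gpu_layers=-1,           # offload all layers to the GPU / accelerator
    verbose=False,
)

prompt = "word " * 2000        # long prompt so prefill dominates the timing
n_prompt_tokens = len(llm.tokenize(prompt.encode("utf-8")))

start = time.perf_counter()
llm(prompt, max_tokens=1)      # generate a single token: time is mostly prefill
elapsed = time.perf_counter() - start

print(f"{n_prompt_tokens} prompt tokens in {elapsed:.2f}s "
      f"~ {n_prompt_tokens / elapsed:.1f} tokens/s prompt processing")
```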
AI Tools & Developer Workflows
OpenSkills CLI: Sync Claude Code Skills with Any Coding Agent
OpenSkills CLI is a new tool that lets users bring Claude Code skills into any coding agent, extending those agents’ capabilities for tasks such as code generation, debugging, and automation.
Dao Studio: Natural Language Programming for Automation
Dao Studio is an open-source project that treats natural-language instructions as executable scripts, aiming to simplify automation for both domain experts and programmers. The project is in early development and is seeking community feedback.