Poetiq Sets ARC-AGI-2 State of the Art as Prime Intellect Drops Open-Source 100B+ MoE Model
AI Benchmarks & Breakthroughs
Poetiq Sets State of the Art on ARC-AGI-2: Poetiq achieved state-of-the-art results on the ARC-AGI-2 benchmark by combining open-source models with newly released proprietary models such as Gemini 3 and GPT-5.1. The solution was developed without prior exposure to ARC-AGI-2 problems, and its performance gains come at increased computational cost.
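Poetiq's exact pipeline isn't detailed here, but as an illustration of how combining models trades extra inference compute for accuracy, the sketch below samples several candidate answers from each of several LLMs and majority-votes across them. `query_model`, the fake answers, and the model names are hypothetical placeholders, not Poetiq's actual setup.

```python
import collections
import random

def query_model(model: str, task_prompt: str) -> str:
    """Hypothetical stand-in for a real LLM API call; returns one
    candidate answer. Here it just guesses among toy candidates."""
    return random.choice(["grid-A", "grid-A", "grid-B"])

def solve_with_ensemble(task_prompt: str, models: list[str],
                        samples_per_model: int = 4) -> str:
    """Sample candidates from every model and return the majority vote.

    Each extra sample or model raises the chance some candidate is
    correct, at linearly growing inference cost -- the accuracy-for-
    compute trade-off the benchmark result highlights.
    """
    votes = collections.Counter(
        query_model(m, task_prompt)
        for m in models
        for _ in range(samples_per_model)
    )
    return votes.most_common(1)[0][0]

# Model names are illustrative, not Poetiq's actual roster.
print(solve_with_ensemble("ARC task...", ["gemini-3", "gpt-5.1", "open-model"]))
```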
New Model Releases
Prime Intellect Releases INTELLECT-3 (100B+ MoE): A 100B+ parameter Mixture-of-Experts (MoE) model trained with large-scale reinforcement learning, achieving SOTA performance in math, code, science, and reasoning. The full training framework, including weights, datasets, and RL environments, has been open-sourced.
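For readers unfamiliar with the architecture, the sketch below shows what a Mixture-of-Experts layer does at its core: a router sends each token through only its top-k experts, which is why a 100B+ parameter MoE activates a small fraction of its weights per token. This is a generic PyTorch illustration with made-up dimensions, not INTELLECT-3's configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoEFeedForward(nn.Module):
    """Minimal top-k routed MoE block (illustrative sizes only)."""

    def __init__(self, d_model=512, d_ff=2048, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts)  # token -> expert logits
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(),
                          nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        ])

    def forward(self, x):                            # x: (tokens, d_model)
        gate_logits = self.router(x)
        weights, idx = torch.topk(gate_logits, self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)         # renormalize over chosen experts
        out = torch.zeros_like(x)
        # Each token is processed only by its top-k experts; the other
        # experts' parameters stay untouched for that token.
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e
                if mask.any():
                    out[mask] += weights[mask, k, None] * expert(x[mask])
        return out

tokens = torch.randn(16, 512)
print(MoEFeedForward()(tokens).shape)                # torch.Size([16, 512])
```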
Abliterated Gemma 3-27B-It Released by YanLabs: A norm-preserving abliterated version of Google’s Gemma 3 27B Instruct model that removes refusal behavior while preserving reasoning capabilities (a sketch of the technique follows the next item). The model is available on Hugging Face, and community work on quantized versions is underway.
ArliAI Derestricts OpenAI’s gpt-oss-20b: Using a norm-preserving biprojected abliteration technique, ArliAI removed refusal mechanisms from gpt-oss-20b while maintaining its reasoning abilities. This follows prior work on GLM-4.5-Air and broadens access to uncensored models.
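As background on the technique both items reference: community "abliteration" estimates a refusal direction in the residual stream (the difference in mean activations between refused and answered prompts) and projects it out of the model's weight matrices; norm-preserving variants then rescale the result so weight magnitudes are unchanged. The NumPy sketch below shows one simple version of that linear algebra on a single matrix. It is a simplification, not either lab's exact recipe (ArliAI's biprojection in particular differs), and the data is a random stand-in.

```python
import numpy as np

def refusal_direction(h_harmful: np.ndarray, h_harmless: np.ndarray) -> np.ndarray:
    """Unit 'refusal direction': difference of mean hidden states over
    prompts the model refuses vs. prompts it answers."""
    d = h_harmful.mean(axis=0) - h_harmless.mean(axis=0)
    return d / np.linalg.norm(d)

def abliterate(W: np.ndarray, r: np.ndarray) -> np.ndarray:
    """Project r out of every column of W (a matrix writing into the
    residual stream, shape d_model x d_in), then rescale each column
    back to its original L2 norm -- a simple norm-preserving variant."""
    old = np.linalg.norm(W, axis=0, keepdims=True)   # per-column norms
    W_abl = W - np.outer(r, r) @ W                   # (I - r r^T) W
    new = np.linalg.norm(W_abl, axis=0, keepdims=True)
    return W_abl * (old / np.maximum(new, 1e-12))    # norms restored

# Toy demonstration with random stand-ins for activations and weights.
rng = np.random.default_rng(0)
r = refusal_direction(rng.normal(1.0, 1.0, (64, 32)),
                      rng.normal(0.0, 1.0, (64, 32)))
W = rng.normal(size=(32, 128))
W_new = abliterate(W, r)
print(np.abs(r @ W_new).max())  # ~0: refusal component gone, column norms intact
```

Rescaling columns (rather than rows) keeps each output orthogonal to the refusal direction while restoring per-column magnitudes, which is the intuition behind preserving reasoning quality after ablation.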
Hardware & Infrastructure
Asus & Nvidia Unveil 20-PFLOPS AI Desktop PC: The ExpertCenter Pro ET900N G3, co-developed with Nvidia, offers 784 GB of coherent memory (496 GB of CPU-attached LPDDR5X plus 288 GB of GPU-attached HBM3e) and 20 PFLOPS of AI performance, and is based on the DGX Station architecture. It targets high-end AI workloads with large gains in memory capacity and bandwidth.
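For a sense of scale, a quick back-of-the-envelope calculation (raw weight footprint only, ignoring KV cache, activations, and runtime overhead, and the precision list is illustrative) shows what model sizes fit in 784 GB of coherent memory:

```python
# Rough bytes-per-parameter for common weight formats.
BYTES_PER_PARAM = {"fp16/bf16": 2.0, "fp8": 1.0, "int4": 0.5}
COHERENT_GB = 784  # 496 GB CPU LPDDR5X + 288 GB GPU HBM3e

for fmt, bpp in BYTES_PER_PARAM.items():
    # GB divided by bytes/param gives billions of parameters directly.
    max_b_params = COHERENT_GB / bpp
    print(f"{fmt:>9}: ~{max_b_params:.0f}B parameters of raw weights")
```

At bf16 that is roughly 392B parameters of raw weights, so a 100B+ MoE like INTELLECT-3 fits with room to spare; quantized formats stretch the ceiling further.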