GPT-5.4 Solves FrontierMath Open Problem as Anthropic Launches Mobile Claude Control
Major AI Milestones
GPT-5.4 Pro Solves FrontierMath Open Problem: Epoch AI and the problem's original author have confirmed that GPT-5.4 Pro is the first model to solve one of the FrontierMath open problems. The result highlights the advanced reasoning and complex problem-solving capabilities of the latest generation of large language models.
Product Announcements & Research
Anthropic Launches Dispatch for Mobile: Anthropic has announced Dispatch, a new mobile feature that lets users control their Claude cowork sessions and computer-use tasks directly from their phones. The update makes AI assistants significantly more accessible, allowing users to manage running tasks while on the move.
Anthropic Explores AI Scientific Capabilities in "Vibe Physics" Research: A new study from Anthropic titled "Vibe Physics" examines how current AI models can function as research assistants in scientific work. While the models show significant progress in assisting with research, the study notes that they still lack the intuition needed to identify which research paths are most likely to succeed.
Local LLM Models & Fine-tunes
New Qwen3.5 Model Variants and Fine-tunes Released: Several new Qwen3.5-based models have been released, including the RYS II series, which uses repeated layers for improved performance, and two "Neo" fine-tunes focused on fast, efficient reasoning. The releases reflect ongoing advances in local model architecture and specialization.
- RYS II: repeated layers on Qwen3.5 27B, with some hints at a "Universal Language"
- Two new Qwen3.5 "Neo" fine-tunes focused on fast, efficient reasoning
Technical Optimizations & Performance
Breakthroughs in AI Inference Speed and Local Deployment: The release of FlashAttention-4 brings large speed gains, outperforming Triton by 2.7x on Blackwell GPUs. Meanwhile, the new FOMOE method enables users to run flagship-class 397B-parameter models on modest consumer hardware, making high-end AI more accessible to local users.