SynapseWire

AI News

Concept Art of Baidu Ernie 5.0 Native Multimodal Architecture

Baidu Ernie 5.0 Officially Released: A Quantum Leap with Native Multimodal Architecture and 2.4 Trillion Parameters

Baidu has officially launched Ernie 5.0 (Wenxin 5.0), a native multimodal large language model with a staggering 2.4 trillion parameters. This article provides a deep dive into its 'Native Multimodal + MoE' architecture, explores its breakthroughs in cross-modal understanding, code generation, and creative writing, and benchmarks it against top global models such as Gemini-2.5 and GPT-5, showing the new heights Chinese AI technology is reaching in 2026.
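
For context on the 'MoE' half of that design, the short sketch below shows what top-2 expert routing looks like in PyTorch. It is a minimal, generic illustration: the expert count, layer sizes, and routing scheme are assumptions made for this sketch, not Ernie 5.0's actual configuration.

# Minimal, generic sketch of a Mixture-of-Experts (MoE) layer with top-2 gating.
# All sizes below are illustrative assumptions, not Ernie 5.0's real configuration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    def __init__(self, d_model=512, d_ff=2048, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(d_model, num_experts)  # router: one logit per expert
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        ])

    def forward(self, x):                               # x: (tokens, d_model)
        scores = self.gate(x)                           # (tokens, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)  # each token picks its top-k experts
        weights = F.softmax(weights, dim=-1)            # normalize over the chosen experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e in range(len(self.experts)):
                mask = idx[:, k] == e                   # tokens routed to expert e in slot k
                if mask.any():
                    out[mask] += weights[mask, k:k+1] * self.experts[e](x[mask])
        return out

print(TopKMoE()(torch.randn(16, 512)).shape)            # torch.Size([16, 512])

The key property this illustrates is sparsity: only the experts a token is routed to are evaluated, which is how MoE models keep per-token compute far below what their total parameter count suggests.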

TranslateGemma Architecture and Performance Overview

Google Unveils TranslateGemma: A New Era of Open Translation Models Built on Gemma 3 Architecture

Google has officially released TranslateGemma, a groundbreaking series of open translation models based on the Gemma 3 architecture. Available in 4B, 12B, and 27B parameter sizes, the models use a two-stage fine-tuning process to deliver state-of-the-art performance across 55 languages. This article provides a comprehensive technical deep dive into the architecture, the training methodology, and the transformative potential of running high-fidelity translation on edge devices.
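
As a rough illustration of how open checkpoints like these are typically run, the snippet below loads a translation model through the Hugging Face transformers library. The model ID, prompt format, and language pair are placeholders assumed for this sketch; the official release documents the actual checkpoint names and prompting conventions.

# Hedged sketch of running an open translation model with Hugging Face transformers.
# "google/translategemma-4b" is an assumed placeholder ID, not a confirmed checkpoint name.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "google/translategemma-4b"              # placeholder; see the official model card
tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Assumed instruction-style prompt; the released models may expect a different format.
prompt = "Translate from English to German: The model runs on edge devices."
inputs = tok(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tok.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))

Of the three sizes, the 4B variant is the most plausible fit for on-device translation, particularly once quantized.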