Google Releases Gemma 4: Four Variants, 256K Context, Apache 2.0

Google ships four Gemma 4 variants simultaneously: the edge-optimized E2B and E4B, a 26B Mixture-of-Experts, and a 31B Dense flagship. The 31B Dense ranks #3 and the 26B MoE ranks #6 on the Arena AI open-source leaderboard. The lineup adds native function calling, agent workflows, vision and audio input, and a 256K context window, all under Apache 2.0. Available immediately on Hugging Face, Ollama, vLLM, llama.cpp, and NVIDIA NIM.

Published: 2026-05-05

Deep dive

Google ships four Gemma 4 variants simultaneously, covering everything from edge to cloud:

  • E2B (Effective 2B): edge-optimized, designed to run on phones and Raspberry Pi-class hardware
  • E4B (Effective 4B): same edge focus, one tier up in capability
  • 26B MoE (Mixture of Experts): balanced speed/quality; active parameter count is well under 26B
  • 31B Dense: flagship, highest quality

The 31B Dense ranks #3 on Arena AI's open-source leaderboard; the 26B MoE ranks #6. Google's framing: "outcompetes models 20x its size."

Key technical highlights:

  1. 256K context: matches mainstream commercial closed models, freeing RAG and long-document workflows from context limits
  2. Native multimodal input: vision + audio without external encoders
  3. Native agent workflow support: function calling, multi-step reasoning, and tool calls are baked in during training, not bolted on via fine-tuning (see the sketch after this list)
  4. Apache 2.0: same permissive license as Gemma 3, safe for commercial deployment
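
To make the agent-workflow point concrete, here is a minimal sketch of what a tool-calling request could look like against a locally served Gemma 4 model exposed through an OpenAI-compatible endpoint (for example, vLLM's built-in server). The model identifier, port, and the get_weather tool are illustrative placeholders, not confirmed names from the release.

```python
# Minimal tool-calling sketch against an OpenAI-compatible endpoint,
# e.g. a local vLLM server. Model name, port, and tool are placeholders.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="unused")

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical tool, for illustration only
        "description": "Return the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

resp = client.chat.completions.create(
    model="google/gemma-4-31b-it",  # placeholder identifier
    messages=[{"role": "user", "content": "What's the weather in Berlin right now?"}],
    tools=tools,
    tool_choice="auto",
)

# If the model decides to use the tool, the structured call appears here;
# the application executes it and returns the result in a follow-up message.
print(resp.choices[0].message.tool_calls)
```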

For developers, especially in the Chinese-language sphere where Gemma has historically been a go-to fine-tuning starting point: the fourth generation now spans 2B–31B, keeps the permissive license, and covers edge to cloud. For self-hosted inference teams (vLLM, llama.cpp, and Ollama all ship day-one support), the range of options is larger than the previous three generations combined; a minimal local-run sketch follows below. Combined with last week's DeepSeek V4 and Zhipu GLM-5 open-source releases, the 2026 open-source model landscape is approaching the density of the commercial closed-model field.
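
For a quick self-hosted smoke test, a sketch using the Ollama Python client is below. The model tag is a placeholder, since the official Ollama names for the Gemma 4 variants are not listed here; check the Ollama library for the actual tags.

```python
# Quick local chat with an edge-sized Gemma 4 variant via the Ollama Python client.
# The tag "gemma4:e2b" is a placeholder, not a confirmed Ollama model name.
import ollama

response = ollama.chat(
    model="gemma4:e2b",
    messages=[{"role": "user", "content": "Summarize the Gemma 4 lineup in one sentence."}],
)
print(response["message"]["content"])
```

The same checkpoints should also be servable through vLLM's OpenAI-compatible server for GPU-backed deployments, as in the tool-calling sketch above.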

Tags

model-release, google, gemma, open-source, moe
