Cynical Sally

The internet's most honest critic.

You're welcome.

Google Releases Gemma 4 with Native Vision and 256K Context

AI
8.3/10
2026-04-09 · Source
Google quietly dropped the first open model family that doesn't feel like a consolation prize. Four sizes, native multimodal, 256K context, 140 languages. Meta is watching this through the bars of the cage they built themselves.

Sally's Take

Gemma 4 is the first Google model release where the word 'finally' isn't doing any heavy lifting. Native vision, native audio, native function calling, 256K context, and a 140-language footprint that makes the previous generation look like it was trained in a single time zone. All four sizes ship under a license that lets you actually use them.

The timing is surgical. Meta announced two weeks ago that LLaMA 5 is going closed source. Google looked at that, looked at its own history of half-committing to open source, and decided the one thing that would hurt Meta the most was to out-open them in the same news cycle. And it worked.

The benchmarks are in the range where 'benchmark' stops meaning much. Real test: can a solo developer in a bedroom download the 9B-parameter version and ship something that doesn't embarrass itself? Yes. That's the whole game now.


What Actually Happened

  • Gemma 4 released in four sizes (2B, 9B, 27B, 90B) with native vision, audio, and function calling built in from day one.
  • Context window expanded to 256K tokens, matching the frontier closed models.
  • Support for 140+ languages out of the box, up from 35 in Gemma 3.
  • License permits commercial use and model distillation, directly contradicting Meta's new closed-source pivot.
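For scale on that 256K figure, here's a back-of-envelope sketch of how much raw text such a window holds. The ~4 characters-per-token ratio is a rough heuristic that varies by tokenizer and language, not a Gemma 4 spec:

```python
def context_chars(tokens: int, chars_per_token: float = 4.0) -> int:
    """Approximate text capacity of a context window, in characters,
    using a rough characters-per-token heuristic."""
    return int(tokens * chars_per_token)

window = 256 * 1024               # 256K tokens
print(context_chars(window))      # roughly a million characters of input
```

That's on the order of a few full-length novels in a single prompt, which is why 'matching the frontier closed models' matters here.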

Who Got Burned

Meta, which announced two weeks ago that LLaMA 5 would be closed source 'for safety reasons'. Google just made the safety argument look like an excuse for falling behind.

Silver Lining

For the first time in years, a solo developer with a laptop and no OpenAI API key can ship a legitimately competitive product. The gap between 'what big labs can do' and 'what I can do in my bedroom' just shrank by a generation.
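A quick sanity check on the 'laptop' claim. This is generic quantization arithmetic, not official Gemma 4 requirements, and it counts only the weights (KV cache and activations add overhead on top):

```python
def model_memory_gib(params_billion: float, bits_per_param: int) -> float:
    """Estimate the weight footprint of a model in GiB:
    parameter count times bytes per parameter."""
    total_bytes = params_billion * 1e9 * bits_per_param / 8
    return total_bytes / 2**30

# 9B weights at 4-bit quantization fit in ordinary laptop RAM;
# the 90B model at fp16 is firmly datacenter territory.
print(round(model_memory_gib(9, 4), 1))   # ~4.2 GiB
print(round(model_memory_gib(90, 16)))    # ~168 GiB
```

The 9B-at-4-bit number is the whole argument in one line: that footprint runs on hardware a bedroom developer already owns.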


Read the original source →