TranslateGemma

Open translation models built on Google’s Gemma 3, supporting 55 languages

TranslateGemma is a new suite of open AI translation models built on Google’s Gemma 3. It enables high-quality communication across 55 languages, combining strong accuracy with exceptional efficiency, and is designed to run on mobile devices, local hardware, and cloud environments without compromising performance.
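For anyone who wants to try it locally, here is a minimal inference sketch using the Hugging Face transformers library. The checkpoint id and the prompt format below are assumptions for illustration only; substitute whatever the official TranslateGemma model card specifies.

```python
# Minimal local-inference sketch with Hugging Face transformers.
# NOTE: "google/translategemma-4b-it" is a hypothetical checkpoint id used for
# illustration; replace it with the actual TranslateGemma model id you pull.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "google/translategemma-4b-it"  # hypothetical id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision keeps the 4B model small enough for local hardware
    device_map="auto",           # let accelerate place layers on GPU/CPU as available
)

# The prompt format is also an assumption; check the model card for the
# convention the model was actually trained with.
prompt = "Translate from English to Japanese: The build passed all tests."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```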

Aleksandar Blazhev
Hey everyone 👋 Excited to share TranslateGemma, a new open translation model suite built on Google’s Gemma 3. It enables clear communication across 55 languages, with model sizes for mobile, local, and cloud use, so builders can avoid closed APIs and high costs.
Zeiki Yu

Congrats on the launch! Open, efficient translation across 55 languages is huge.

Jesse Craig

The 4B model for mobile inference is interesting... definitely want to try it on a Pi. Would love to have local translation that doesn't phone home.

The ensemble reward model approach is interesting: using multiple quality signals (MetricX, AutoMQM, ChrF, naturalness) rather than optimizing against a single metric. We've seen big reliability gains from multi-model consensus at inference time.
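For context, here is a rough sketch of what multi-signal quality scoring can look like in practice. Only the ChrF signal is real (via the sacrebleu library); the MetricX-, AutoMQM-, and naturalness-style scorers are left as pluggable callables because they are model-based, and nothing here should be read as TranslateGemma's actual reward pipeline.

```python
# Sketch of an ensemble quality score that averages several translation-quality
# signals instead of trusting a single metric.
from typing import Callable, Dict

from sacrebleu.metrics import CHRF

chrf = CHRF()

def chrf_signal(source: str, hypothesis: str, reference: str) -> float:
    # sacrebleu's ChrF returns a 0-100 score; rescale to 0-1 so signals are comparable.
    return chrf.sentence_score(hypothesis, [reference]).score / 100.0

def ensemble_quality(
    source: str,
    hypothesis: str,
    reference: str,
    signals: Dict[str, Callable[[str, str, str], float]],
    weights: Dict[str, float],
) -> float:
    # Weighted average of whichever signals are plugged in (ChrF, a learned
    # metric, an MQM-style scorer, a naturalness classifier, ...), so no single
    # metric can be gamed in isolation.
    total = sum(weights[name] for name in signals)
    return sum(
        weights[name] * fn(source, hypothesis, reference)
        for name, fn in signals.items()
    ) / total

# Example with only the ChrF signal wired up; a fuller ensemble would add
# model-based scorers under their own keys and weights.
score = ensemble_quality(
    source="The cat sat on the mat.",
    hypothesis="Le chat était assis sur le tapis.",
    reference="Le chat s'est assis sur le tapis.",
    signals={"chrf": chrf_signal},
    weights={"chrf": 1.0},
)
print(f"ensemble quality: {score:.3f}")
```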

Eugene Chernyak

Open-source translation that runs on mobile is a huge W for privacy! It's great to see more power moving away from the cloud and into our pockets. Quick question: does it work fully offline on mid-range phones, or do you need a flagship for smooth performance?

Jay Dev

Wow, TranslateGemma looks amazing! The fact it runs efficiently on mobile is a total game changer. How well does it handle translating slang and idioms across those 55 languages? So cool!

yama

Really excited about the open-source approach for translation. I'm building a tech news aggregator that summarizes articles from Japanese engineering blogs for global readers, so local translation models like this could be game-changing for keeping latency low. Curious about the quality benchmarks for technical/specialized content vs general text — does the ensemble reward model help with domain-specific terminology?

Abhijith Pingali

Been waiting for something like this. Most translation APIs either cost a fortune or need an internet connection. Having a local model built on Gemma 3 that actually handles 55 languages without phoning home? That's huge for anyone building offline-first apps or dealing with privacy-sensitive translation work.

Quick question though: how's the accuracy compared to Google Translate's API for less common language pairs? Like, does it handle nuanced stuff well, or is it better suited to straightforward translation?

Either way, open source translation models are a game changer. Nice work 🔥