Adopt · Automate

Gemma

Google's open-weight model. Best local reasoning at its size.

Gemma 3 is Google's open-weight model family that runs efficiently on consumer hardware via Ollama. The 4B parameter variant delivers surprisingly capable reasoning for its size — good enough for summarisation, Q&A, and conversational RAG without cloud API costs.

For local AI applications, Gemma fills the gap between tiny models that struggle with nuance and frontier models that require GPU clusters. We use it for document Q&A where the data cannot leave the network and the query volume makes API pricing impractical.
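A minimal sketch of that document Q&A pattern, assuming an Ollama server on its default endpoint (`http://localhost:11434`) and the `gemma3:4b` model tag; the chunk contents, question, and prompt wording are illustrative placeholders, not the production setup.

```python
import json
import urllib.request

# Ollama's chat endpoint and the 4B Gemma tag discussed above
# (both are assumptions about the local install, not fixed values).
OLLAMA_URL = "http://localhost:11434/api/chat"
MODEL = "gemma3:4b"

def build_messages(chunks, question):
    """Pack retrieved document chunks and the user's question into a
    chat payload, instructing the model to stay grounded in the context."""
    context = "\n\n".join(f"[{i + 1}] {c}" for i, c in enumerate(chunks))
    system = (
        "Answer using only the numbered context below. "
        "If the answer is not in the context, say so.\n\n" + context
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": question},
    ]

def ask(chunks, question):
    """Send a non-streaming chat request to the local Ollama server
    and return the model's reply text."""
    payload = {
        "model": MODEL,
        "messages": build_messages(chunks, question),
        "stream": False,
    }
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]
```

Because everything runs on localhost, the documents never leave the network, and each extra query costs only local compute rather than per-token API fees.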

ai · llm · local · open-source