Ollama

The easiest way to run large language models locally

5.0 • 25 reviews

1.1K followers

Run Llama 2 and other models on macOS, with Windows and Linux coming soon. Customize and create your own.
This is the 3rd launch from Ollama.
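For anyone who prefers to script against Ollama rather than use the app, here is a minimal sketch of calling the local REST API from Python. It assumes the Ollama server is running on its default port (11434) and that you have already pulled a model with "ollama pull llama2"; the prompt is purely illustrative.

import json
import urllib.request

# Ollama exposes a local REST API (default port 11434) while the server runs.
# The /api/generate endpoint returns a completion for a single prompt.
payload = {
    "model": "llama2",                 # any model pulled locally
    "prompt": "Why is the sky blue?",  # illustrative prompt
    "stream": False,                   # return one JSON object, not a stream
}
req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp)["response"])

Customization works on top of the same models: a Modelfile can set a system prompt and parameters over a base model, and "ollama create" registers the result under a new name.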
Ollama Desktop App

The easiest way to chat with local AI
Ollama's new official desktop app for macOS and Windows makes it easy to run open-source models locally. Chat with LLMs, use multimodal models with images, or reason about files, all from a simple, private interface.
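The desktop app talks to the same local server, so the multimodal flow it describes can also be reproduced programmatically. Here is a sketch of sending an image to a vision-capable model over the local chat endpoint; the model name "llava" and the file path are placeholder assumptions.

import base64
import json
import urllib.request

# Images are passed in the "images" field of a chat message as base64 strings.
with open("photo.png", "rb") as f:  # placeholder path
    image_b64 = base64.b64encode(f.read()).decode("ascii")

payload = {
    "model": "llava",  # any vision-capable model pulled locally
    "messages": [{
        "role": "user",
        "content": "Describe this image.",
        "images": [image_b64],
    }],
    "stream": False,
}
req = urllib.request.Request(
    "http://localhost:11434/api/chat",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp)["message"]["content"])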
Free


Alexandre Droual
Finally!! Congrats to the whole team on this huge achievement. Can’t wait for the next iteration, maybe a vibe coding extension?
Shane Mhlanga
Absolutely brilliant update! I first started with Ollama and Open WebUI, until I found other native apps, but Ollama has always been the core. It was annoying having to do extra steps just to run a local model quickly; now this is it! Well done!
vivek sharma

Ollama v0.7 quietly rewrites the rules for running multimodal AI on local machines. Llama 4 and Gemma 3 in vision mode? Huge. Improved memory management, reliability, and accuracy make this more than just a version bump; it’s a fresh foundation for the next wave of local-first LLMs.

Mu Joe

Love that you can just drag and drop images and chat with vision models now, with no more command-line headaches. This is super smart, tbh. Makers really nailed it!

장연주
Pretty design
Ajay Sahoo

The convenience of new tech tools for personal and professional work has grown on me with repeated use. Whenever I have a query, whether for myself, for someone else, or about something I’m using, I now turn to Ollama instead of the task-based chatbots I used before. Wonderful usability, and all the LLMs I prefer in one place.

Quinn

Ollama v0.7 bringing Llama 4 & Gemma 3 to your desktop is impressive, but does anyone know if it's smooth on mid-range rigs yet? And uh, Linux friends: did you encounter that odd bug where the v0.7 download still runs v0.6.2? Would love to hear how that played out!