Ollama

The easiest way to run large language models locally

5.0 (25 reviews) · 1.1K followers

Run Llama 2 and other models on macOS, with Windows and Linux coming soon. Customize and create your own.
This is the 3rd launch from Ollama.
Ollama Desktop App

The easiest way to chat with local AI
Ollama's new official desktop app for macOS and Windows makes it easy to run open-source models locally. Chat with LLMs, use multimodal models with images, or reason about files, all from a simple, private interface.
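Under the hood, the desktop app talks to the same local server the Ollama CLI runs (by default on localhost:11434), so anything the app does can also be scripted. Below is a minimal sketch of building a request to the `/api/generate` endpoint from Python; the model name `llama2` is just an example, and the default host and port are assumed:

```python
import json
import urllib.request


def build_generate_request(model: str, prompt: str,
                           host: str = "http://localhost:11434"):
    """Build a POST request for Ollama's /api/generate endpoint.

    `stream: False` asks the server to return one JSON object
    instead of a stream of partial responses.
    """
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )


# To actually send the request (requires a running Ollama server):
#   with urllib.request.urlopen(build_generate_request("llama2", "Hi")) as r:
#       print(json.loads(r.read())["response"])
```

The same server also exposes `/api/chat` for multi-turn conversations; the desktop app is essentially a friendlier front end over these endpoints.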
Free

Zac Zuo

Hi everyone!

When Ollama walks out of your command line and starts interacting with you as a native desktop app, don't be surprised :)

This new app dramatically lowers the barrier to running top open-source models locally. You can now chat with LLMs, or drag and drop files and images to interact with multimodal models, all from a simple desktop interface. And most importantly, it's Ollama, one of the most trusted products among users who care about privacy and data security.

Bringing the Ollama experience to people who aren't as comfortable with the command line will undoubtedly accelerate the adoption of on-device AI.

Gabe Moronta

@zaczuo Love it!!!! I've been an Ollama user for a while now and have told many others about it, but they were never as comfortable with the command line, so I just finished sending this out to everyone I know! 💪

André J

Needs MCP and agentic features. Maybe soon? 🙏

Marco Visin
@sentry_co wanted to ask the same
André J

@marco_visin Agentic use is perfect for local models, since you don't need speed. You can queue up some tasks overnight in different branches and let it cook.

Marco Visin
@sentry_co yep. And together with MCP, that would make it a powerful machine
Ivo Dimitrov

That's a significant update! Thank you for your product, really like it 🙌

Will the UI be open source as well, so we can adjust/modify the way it works?

Mcval Osborne

awesome, just downloaded.


I've used tools like LM Studio in the past, but this is super slick.

Question: Is there a place to get an overview of the best use cases for different models? I see the overview of models on the home page, but contextualizing what certain models are best for would be massively helpful to me.

Quinn Comendant

I really like the UI of Ollama, especially the CLI. There's a lot to love there. Unfortunately, on macOS it's not the best option because it doesn't support MLX, which runs models 10% to 20% faster and with lower memory usage. There is an open ticket, with a pull request from 2023 for adding an MLX backend, but it's been stalled for a while. If you use a Mac, try LM Studio, mlx-lm, or swama instead.

QUEEN EMMY

It's great and convenient!

Gin Tse

Running top vision models *locally* is huge: no more waiting on the cloud or worrying about privacy, tbh. This update is really next-level, hats off to the team!
