LlamaChat

Chat with your favourite LLaMA models right on your Mac

LlamaChat allows you to chat with LLaMA, Alpaca and GPT4All models, all running locally right on your Mac. Support for Vicuna and Koala is coming soon. NOTE: LlamaChat requires obtaining model files separately, adhering to each source's terms and conditions.

Alex Rozanski
Maker
šŸ“Œ
Hey everybody šŸ‘‹ I'm Alex, and over the past few weeks I've been building LlamaChat as a native, open-source macOS app that lets you easily chat with LLaMA models locally.

šŸ”® Alongside the ChatGPT boom of the last few months, there has been a flurry of projects building on top of Meta's LLaMA LLM to make it more amenable to conversation. These include Stanford's Alpaca, Nomic AI's GPT4All and Vicuna.

šŸš€ Thanks to the excellent llama.cpp, kickstarted by Georgi Gerganov, it is now easier than ever to run these models locally on your device. LlamaChat wraps this up in a native macOS interface, with a simple chat UI and support for importing and converting your raw PyTorch model checkpoints.

šŸ’¬ Up next, I'd like to add support for other LLaMA-based projects, like Vicuna and Koala, as well as non-English-language models like Chinese LLaMA/Alpaca and Vigogne. For the ML dabblers, I'd also like to build on the existing debugging and tweaking support (currently LlamaChat lets you inspect the model context, including the raw tokens).
yg
Looking for help with mathematics