LlamaChat

Chat with your favourite LLaMA models right on your Mac

LlamaChat allows you to chat with LLaMA, Alpaca and GPT4All models, all running locally right on your Mac. Support for Vicuna and Koala coming soon. NOTE: LlamaChat requires obtaining model files separately, adhering to each source's terms and conditions.
Gallery: LlamaChat screenshots
Free
Launch tags: Mac • Open Source • Artificial Intelligence


Alex Rozanski
Maker
📌
Hey everybody 👋 I'm Alex, and over the past few weeks I've been building LlamaChat as a native, open-source macOS app that lets you easily chat with LLaMA models locally.

🔮 Alongside the ChatGPT boom of the last few months, there has been a flurry of projects building on top of Meta's LLaMA LLM to make it more amenable to conversation, including Stanford's Alpaca, Nomic AI's GPT4All, and Vicuna.

🚀 Thanks to the excellent llama.cpp, kickstarted by Georgi Gerganov, it is now easier than ever to run these models locally on your device. LlamaChat wraps this up in a native macOS interface, with a simple chat UI and support for importing and converting your raw PyTorch model checkpoints.

💬 Up next, I'd like to add support for other LLaMA-based projects, like Vicuna and Koala, as well as non-English language models like Chinese LLaMA/Alpaca and Vigogne. For the ML dabblers, I'd also like to extend the existing debugging and tweaking support (currently LlamaChat lets you inspect the model context, including the raw tokens).
yg
Looking for help with math