Launching today

Localmee
Private AI chat: files, voice, vision. 18 local models.
7 followers
Localmee is a private, offline AI chat workspace for iPhone, iPad, and Mac. Run 18 curated open-source models and counting, including Qwen3.5 and Gemma 4, powered by Apple’s on-device MLX framework. Chat with files, images, and voice. Summarize PDFs, EPUBs, DOCX, and 500+ page books. Use OCR, vision, and ready-to-run prompts for work, study, writing, and coding. Localmee recommends models for your device. On Mac, connect your own Ollama models. No backend, no tracking, no account required.

Congrats on the launch, wishing you all the best with it!
Just downloaded it and have been playing with it for a bit. The fact that everything runs locally on the device is a huge plus for me; response times are surprisingly snappy, and I love that I can use it with no signal. Voice quality is also better than I expected for on-device TTS.
Quick question: any rough timeline for adding new models, or is there a public roadmap I can follow? Curious what's coming next on the model side.
Great work, will keep using it.
@sadegazoz Thank you so much, this really means a lot on launch day!
Glad the on-device experience and voice landed well for you; those were two of the hardest things to get right.
On the roadmap side, @davidcs and I have already started working on the next chapter: pushing Localmee to its full potential as a truly private, fully offline assistant that can handle a lot more than chat. Not ready to share all the details yet, but there is a lot coming.
Thanks again for trying it out and taking the time to write this.
Tuba