
Lapis
Secure, offline AI chat assistant on your device
57 followers
Lapis brings a unique twist by running powerful open-source LLMs locally on your device, giving you a fully offline, private AI chat experience. Unlike cloud-based assistants, Lapis never sends your data to servers or collects it. You can load any Hugging Face model (like Gemma, Phi, LLaMA, GPT-OSS…) by pasting its URL. Manage your model library, tune performance, and chat securely, all for free with no subscription needed.
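To make the "paste its URL" step concrete, here is a minimal sketch of how a pasted Hugging Face model page URL could be reduced to an `owner/model` repo id before downloading. This is an illustrative helper, not Lapis's actual implementation; the function name and behavior are assumptions.

```python
from urllib.parse import urlparse

def parse_hf_model_url(url: str) -> str:
    """Extract the "owner/model" repo id from a pasted Hugging Face URL.

    Hypothetical helper showing the kind of parsing an app like Lapis
    might do when a user pastes a model page URL.
    """
    path = urlparse(url).path.strip("/")
    parts = path.split("/")
    if len(parts) < 2:
        raise ValueError(f"not a model URL: {url!r}")
    # The first two path segments are the repo owner and the model name.
    return "/".join(parts[:2])

print(parse_hf_model_url("https://huggingface.co/google/gemma-2-2b-it"))
# → google/gemma-2-2b-it
```

From a repo id like this, an app can then fetch the model weights (e.g. a GGUF or Core ML file) from the repo's file listing and store them in its local model library.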
Zivy
Congrats on the launch, @lvrpiz! Curious: how smoothly does Lapis handle bigger workloads while running fully offline on-device?
Tweny
@harkirat_singh3777 It will depend on the model you are using and the device that runs the inference. I've tested it on an iPhone 11 and get 75 tokens/s with some models.
Looks promising! Congrats on the launch!
Any plans to support Android users in the future?
Tweny
Thanks, @info_team3! Not in the near future; the inference runs on native Swift solutions.