Locl.chat

The snappiest, local-first Deepseek chat

9 followers

Local-first LLM chat using Groq's DeepSeek-distilled Llama 70B for an instant UI and snappy responses. Built on top of Basic's user-owned data store technology, so all your data lives in your own data store on US servers and stays permanently under your control!
Free


Abhiroop CVK
Hey, we're Abhi and Rashid, makers of locl.chat! We were frustrated by the slowness of the Deepseek app and website, so we built a local-first version on top of our Basic database and sync tech. We chose Groq's Deepseek distill model because we felt it had the best balance of speed and accuracy. This is an experimental open-source project that we threw together as quickly as we could, so please bear with us through any janky UI and bugs you hit (happy to fix them as you point them out). That said, we've personally started using this instead of the other models and chat interfaces just because of how fast everything is. We hope it can be a pleasant addition to your workflow 😋
Ujjwal Singhania

Much faster than Deepseek!