Bodhi Chat - Powered by Bodhi App

A PoC for the Bodhi Platform: build web apps powered by local LLMs

Bodhi Chat is a PoC showing the future of AI apps with local LLMs. It connects to your locally running Bodhi App securely via OAuth 2.1, so web apps hosted anywhere can access your local AI, enabling agentic chat and deep research entirely on your machine. Zero backend infrastructure needed, zero API costs, 100% privacy. Developers: want to build similar AI apps powered by local LLMs? Sign up for the waitlist at developer.getbodhi.app
Free

Amir (Maker)
Hey Product Hunt! 👋 I built Bodhi Chat to prove something that doesn't exist yet: web apps securely accessing YOUR local AI models with OAuth 2.1.

**The Problem:** Local LLM solutions (Ollama, LM Studio) let you run models privately, but you have to download apps, configure them manually, or run them via Docker, and you're left with generic ChatGPT-like interfaces. They also can't safely expose APIs to web apps: no granular permissions, no access control.

**What Bodhi Changes:**
• **Bodhi App** is the first local LLM server with OAuth 2.1 security built in
• Web apps request specific permissions (inference, embeddings, tools)
• You grant granular access based on scopes and roles
• Apps can only access what you authorize
• You can revoke access anytime

This opens up a variety of use cases where web apps rely on AI APIs provided by the user. Apps like OpenWebUI or Open Notebook can be built with zero local install requirements, just like any other web app.

**Bodhi Chat: The Proof of Concept**
A static site on GitHub Pages that does deep research and agentic chat with web search on YOUR local model. No backend infrastructure needed on the developer's side.

**How It Works:**
1. Run Bodhi App locally
2. Load your preferred AI model
3. The web app requests specific permissions via OAuth 2.1
4. You review the scopes (inference? embeddings? download-models? tools?) and grant access
5. Chat runs entirely on your machine with your model

**For Developers:** Using the bodhi-js SDK, you can build web apps that access users' local AI models with proper security (rough sketch below). This pattern doesn't exist anywhere else in the local LLM ecosystem. We're opening our developer platform to selected beta users.

🔗 Try the demo: https://bodhiapps.github.io/chat/
🛠️ Build your own: https://developer.getbodhi.app

**I'd love your feedback:**
- Would you grant web apps access to your local LLMs with OAuth?
- Developers: what apps would you build on this platform?

Thanks for checking it out!
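P.S. For the technically curious, here's a minimal sketch of what the developer flow could look like. Everything in it (the import path, `BodhiClient`, `requestAccess`, `chat`, the local port) is an assumption for illustration, not the SDK's confirmed API; see developer.getbodhi.app for the real thing.

```typescript
// Hypothetical sketch: BodhiClient, requestAccess, and chat are illustrative
// names, not the confirmed bodhi-js API. The flow mirrors the OAuth 2.1
// scope-request steps described above.
import { BodhiClient } from "bodhi-js";

async function run() {
  // Point the client at the user's locally running Bodhi App
  // (URL and port are assumptions for this sketch).
  const client = new BodhiClient({ baseUrl: "http://localhost:1135" });

  // Request only the scopes this app needs via OAuth 2.1.
  // The user reviews each scope in Bodhi App and grants or denies it.
  await client.requestAccess({ scopes: ["inference", "tools"] });

  // Once granted, inference runs entirely on the user's machine,
  // against whatever model they have loaded.
  const reply = await client.chat({
    messages: [{ role: "user", content: "Summarize today's AI news." }],
  });
  console.log(reply);
}

run().catch(console.error);
```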