Open-source, feature-rich Gemini/ChatGPT-like interface for running open-source models (Gemma, Mistral, Llama 3, etc.) locally in the browser using WebGPU.
No server-side processing - your data never leaves your PC!
This project aims to bring the familiarity and functionality of popular AI interfaces such as ChatGPT and Gemini into an in-browser experience.
You're more than welcome to contribute:
https://github.com/addyosmani/ch...
We started this project because we're big fans of fast, privacy-friendly, on-device AI. We think there's a lot of opportunity here for chat interfaces and beyond. Chatty is built using WebLLM, Radix, Xenova Transformers, and Next.js. We're excited for you to try it out and would love to hear any feedback!
This is impressive, guys. I feel like you should be doing better than your current position here on Product Hunt. Is this using WASM under the hood?
Congrats on the launch. Running LLMs in-browser using WebGPU seems efficient for privacy and performance. How do you see ChattyUI impacting developers specifically looking to integrate similar functionality into their projects?