A platform that runs AI models locally in the browser (JavaScript environment), letting developers easily embed Web AI models into their web apps and save up to 95% on AI hosting costs. We're unlocking what's possible in the future of AI. 🚀

As models get smarter and smaller, and edge devices get more powerful, what future use cases do you see for on-device AI?
I went through your demos, and they're really good. However, is it feasible for every user to download a 500MB model locally?
@rajpurohit_vijesh Hi, that's a very good question. There's indeed more work to be done on adoption and education for everyday users, and on finding the exact use cases (product-market fit) for different models. We have another tool, localai.tools, where many power users pay for Premium to use the Whisper large model, which is 1.5GB. So this is a somewhat new business model, similar to games: users with better hardware get the better experience, while users with weaker hardware can still use the product, just with a lesser experience. That said, this is still a very new technology, and we're still finding the right use cases for different models.
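The "games" analogy above could be implemented as capability-based model tiering: detect the device's resources, then serve the largest model it can handle. A minimal sketch of that idea follows; the model names, size thresholds, and the `pickModel` helper are illustrative assumptions, not the product's actual API.

```javascript
// Hypothetical capability-based model tiering: stronger devices get the
// larger, higher-quality model; weaker ones fall back to a smaller tier.
// Tier names and thresholds are made up for illustration.
const MODEL_TIERS = [
  { name: 'whisper-large', sizeMB: 1500, minMemoryGB: 8, needsWebGPU: true },
  { name: 'whisper-small', sizeMB: 460,  minMemoryGB: 4, needsWebGPU: false },
  { name: 'whisper-tiny',  sizeMB: 75,   minMemoryGB: 0, needsWebGPU: false },
];

// In a real browser you might read navigator.deviceMemory and navigator.gpu;
// here the capabilities are passed in so the logic stays self-contained.
function pickModel({ deviceMemoryGB, hasWebGPU }) {
  for (const tier of MODEL_TIERS) {
    if (deviceMemoryGB >= tier.minMemoryGB && (!tier.needsWebGPU || hasWebGPU)) {
      return tier;
    }
  }
  // Every device can at least run the smallest tier.
  return MODEL_TIERS[MODEL_TIERS.length - 1];
}

// A high-end machine gets the 1.5GB model; a low-end phone still works.
console.log(pickModel({ deviceMemoryGB: 16, hasWebGPU: true }).name);  // whisper-large
console.log(pickModel({ deviceMemoryGB: 2,  hasWebGPU: false }).name); // whisper-tiny
```

Like graphics settings in a game, the tier list degrades gracefully instead of excluding users whose devices can't hold a 1.5GB model.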
@peng_zhang15, I can see where this is going. Local AI will be very useful for everyday tasks where users need to call multiple APIs frequently. I would love to connect with you and discuss this in more detail. :)
Sounds interesting. Will check it out.
AI in the browser is really easy.