Text-Generator.io

Self Hostable OpenAI Alternative

Self-hostable, OpenAI-compatible text generator that combines crawling and image analysis into a single API: linked documents and images are downloaded and analysed to generate better text. Also speech to text (over 8x cheaper than Google) and embeddings.
lee
Maker
📌
Hi all, I created https://Text-Generator.io because I saw how useful the new large language models from OpenAI are, but noticed that prompt engineering is still difficult: for example, it's still hard to build something that does text/image/link analysis.

There's an OpenAI-compatible endpoint, so switching is a one-line change. The API also combines crawling and image analysis into a single API where linked documents and images are downloaded and analysed.

It's self-hostable starting at $1000 USD a year. We also have a cloud-hosted option with a free tier, and we don't store any private text sent to the API, so it's anonymous/secure.

The API supports a massive range of use cases, including speech to text 8x cheaper than Google Cloud, generating most human languages and code, analysing linked images/receipts, and embedding many languages/images/code in the same space.

Also check out the blog: https://text-generator.io/blog
Real examples: https://text-generator.io/use-cases
Self Hosting Docs: https://text-generator.io/self-h...

Super excited to see what you all build with it! Let me know how I can help out.
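Since the endpoint is OpenAI-compatible, the "one line change" lee describes is essentially swapping the base URL your client points at. A minimal standard-library sketch; the base URL, path, and `secret` header name below are assumptions for illustration, so check the API docs for the exact values:

```python
import json
import urllib.request

# Assumed base URL — verify against the official API docs.
API_BASE = "https://api.text-generator.io"


def build_completion_request(prompt: str, api_key: str) -> urllib.request.Request:
    """Build an OpenAI-style completion request aimed at the
    Text-Generator.io endpoint instead of api.openai.com.
    Switching providers comes down to changing API_BASE."""
    payload = {"prompt": prompt, "max_tokens": 64}
    return urllib.request.Request(
        API_BASE + "/v1/completions",  # path shape is an assumption
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "secret": api_key,  # auth header name is an assumption
        },
    )
```

Sending the request (e.g. with `urllib.request.urlopen`) would then work the same way against either provider, which is what makes the switch a one-line change.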
Naveed Rehman
so cool! congratulations on the launch
Elisha Terada
Congrats! What LLM is powering it behind the scenes? Or is it your own proprietary trained model? Also, what does it mean to "Self Host" your product?
lee
@elishaterada It's proprietary trained models. There's around 40G of various models that get downloaded, including ones for the prompt engineering/analysing links/images/OCR, then some main models for the code/instruct/multilingual LLM work. They're packed onto the GPU and used selectively per request, which makes the quality better than any standalone model that could run on a single 24G VRAM GPU right now.

With self hosting you get the Docker image and can effectively do what you want with it. There are docs for hosting it on Kubernetes as an example: https://text-generator.io/docs/k... or just running the Docker container: https://text-generator.io/docs/d...
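For the "just running the Docker container" option, the shape of the command would be something like the sketch below. The image name, port, and license-key variable are all hypothetical placeholders; the real values are in the Docker docs linked above.

```shell
# Hypothetical image name, port, and env var — check the official
# Docker docs for the real values before running.
docker run -d \
  --gpus all \
  -p 8080:8080 \
  -e LICENSE_KEY="your-key-here" \
  text-generator-io/text-generator:latest
```

`--gpus all` matters here because, as described above, the ~40G of models are packed onto the GPU and selected per request.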
Elisha Terada
@leeleepenkman Got it. Does the $1,000 just come down to a license fee to use your model, then?
lee
@elishaterada Yeah, that's correct. Let me know if that feels reasonable or too high, etc.