Ssekirya Cosmah

20d ago

FlowSpec AI is going Open-Source! 🔓

Updates have been slow, but the vision has grown. FlowSpec AI is officially moving to a freemium, local-first model, so the React/Next.js community can keep using it even during internet outages.

What's new?

  1. Local LLMs: Transitioned to self-hosted Dolphin3 running on Ollama to keep inference costs at zero for users (a minimal sketch follows this list).

  2. Architecture Rebuild: Refactored the entire system using FastAPI and Qdrant for better scaling.
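Here's roughly what talking to the self-hosted model looks like. This is a minimal sketch, assuming Ollama is serving on its default port with the dolphin3 tag pulled, and it uses the official Ollama Python client; it isn't FlowSpec's production code, and the prompt is just a placeholder.

```python
# Minimal sketch: query a local Dolphin3 model through Ollama.
# Assumes `ollama serve` is running on the default port (11434)
# and the model has been pulled with `ollama pull dolphin3`.
import ollama


def ask_local_model(prompt: str) -> str:
    """Send a prompt to the local model and return its reply."""
    response = ollama.chat(
        model="dolphin3",
        messages=[{"role": "user", "content": prompt}],
    )
    return response["message"]["content"]


if __name__ == "__main__":
    print(ask_local_model("Summarize this React component spec: ..."))
```

Swap in whatever model tag you've pulled locally; the pattern stays the same.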

We are looking for contributors! We are currently optimizing our FastAPI brain to handle self-hosted model inference more efficiently. If you are a Python/FastAPI wizard who loves LLMs, we want you on the team.
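To make that concrete, here's a minimal sketch of the kind of route we're tuning: an async FastAPI endpoint that forwards prompts to the local Ollama server over a shared connection pool. The endpoint path, payload shape, and model tag are assumptions for illustration, not our actual API.

```python
# Minimal sketch: a FastAPI route that proxies prompts to local Ollama.
# A single shared AsyncClient reuses connections across requests, which
# is one cheap throughput win when serving self-hosted inference.
from contextlib import asynccontextmanager

import httpx
from fastapi import FastAPI
from pydantic import BaseModel

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default port


@asynccontextmanager
async def lifespan(app: FastAPI):
    # One HTTP connection pool for the app's whole lifetime.
    app.state.http = httpx.AsyncClient(timeout=120.0)
    yield
    await app.state.http.aclose()


app = FastAPI(lifespan=lifespan)


class GenerateRequest(BaseModel):
    prompt: str


@app.post("/generate")
async def generate(req: GenerateRequest):
    resp = await app.state.http.post(
        OLLAMA_URL,
        json={"model": "dolphin3", "prompt": req.prompt, "stream": False},
    )
    resp.raise_for_status()
    # Ollama puts the completion text in the "response" field.
    return {"completion": resp.json()["response"]}
```

You can run a sketch like this with `uvicorn main:app` once Ollama is serving locally.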