Are we over-engineering AI memory? (Markdown vs. Vector DBs for small datasets)
Hey makers!
Lately, I ve been looking closely at how independent builders and small teams are managing AI knowledge bases. It feels like the default "industry standard" is to immediately reach for a complex RAG pipeline and a heavy, paid Vector Database.
But I'm starting to wonder if we are over-engineering this for 90% of standard use cases.
Vector DBs are incredibly powerful at massive scale, but for smaller datasets they can be expensive, complex to query, and complete black boxes. If a search returns a weird chunk, diagnosing why is often a nightmare.
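To make the "simpler alternative" concrete, here's a minimal sketch of the kind of thing I mean: plain markdown files as memory, chunked at headings, retrieved by keyword overlap. Everything here (the `chunk_markdown` and `search` functions, the sample doc) is illustrative, not any real library's API — but unlike a vector index, every result is trivially explainable: you can see exactly which terms matched.

```python
# Illustrative sketch: markdown files as AI memory, retrieved by
# simple keyword overlap instead of a vector index. Function names
# and the sample doc are made up for this example.

def chunk_markdown(text: str) -> list[str]:
    """Split a markdown document into chunks at headings."""
    chunks, current = [], []
    for line in text.splitlines():
        if line.startswith("#") and current:
            chunks.append("\n".join(current))
            current = []
        current.append(line)
    if current:
        chunks.append("\n".join(current))
    return chunks

def search(chunks: list[str], query: str) -> list[str]:
    """Rank chunks by how many query terms appear in them."""
    terms = set(query.lower().split())
    scored = [(sum(t in c.lower() for t in terms), c) for c in chunks]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    # Keep only chunks that matched at least one term.
    return [c for score, c in scored if score > 0]

doc = "# Billing\nStripe webhooks retry 3 times.\n# Deploys\nUse blue-green."
results = search(chunk_markdown(doc), "stripe retry")
```

For a few hundred notes this is fast, free, debuggable with `grep`, and version-controllable with git. You'd obviously swap in something smarter (BM25, or embeddings) once exact-keyword matching starts missing relevant chunks.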



