Launching today

Tenure
Local AI memory that knows what you chose and why
Most LLM memory systems store facts. Tenure stores what to do with them. Every belief carries a why_it_matters field that converts observations into instructions the model acts on directly, with no additional inference required. Precision retrieval instead of similarity search. Fully local, encrypted at rest, nothing leaves localhost. Every belief is visible, editable, and correctable; the model isn't building a hidden profile. Retrieval claims are documented in a reproducible arXiv paper.

Tenure
We have all been there. You already told the model you use Fastify. That you want clarifying questions before long responses. That you chose the MongoDB raw driver over Mongoose for a specific reason. Next session, it has no idea and meets you as a total stranger.
The other thing that bothered me about existing systems: they store facts but not what to do with them. "Character doesn't like being yelled at" is a fact. "Use reactions to raised voices as the primary window into this character's interior life, and let that discomfort shape every confrontation scene without stating it explicitly" is an instruction the model can act on without additional inference. Every belief in Tenure carries a why_it_matters field that does this conversion at extraction time, when the model has the full conversational context to get it right.
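To make the fact-versus-instruction distinction concrete, here is a minimal sketch of what a belief record with a why_it_matters field could look like. Only the why_it_matters field name comes from the description above; the surrounding structure is an illustrative assumption, not Tenure's actual schema.

```python
# Hypothetical belief record (structure is illustrative; only the
# why_it_matters field name comes from Tenure's description).

fact = "Character doesn't like being yelled at"

belief = {
    "belief": fact,
    # Converted at extraction time, while the model still has the full
    # conversational context needed to get the instruction right:
    "why_it_matters": (
        "Use reactions to raised voices as the primary window into this "
        "character's interior life, and let that discomfort shape every "
        "confrontation scene without stating it explicitly."
    ),
}

# At retrieval time, the model is handed an instruction, not a bare fact:
prompt_fragment = belief["why_it_matters"]
```

The point of doing the conversion at extraction time rather than retrieval time is that the context needed to interpret the fact is available once, at the moment of observation, and would otherwise be lost.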
Most memory systems hand the model a pile of vaguely related facts and hope it figures out what's relevant, because vector search can't tell your MongoDB decision from your Redis decision from your TypeScript preferences; they all live in the same semantic neighborhood. Tenure retrieves the specific belief that applies to the current moment.
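The contrast can be sketched in a few lines. This is my own toy illustration, not Tenure's implementation: a crude word-overlap score stands in for vector similarity, and a direct keyed lookup stands in for precision retrieval once the relevant entity is known.

```python
# Toy contrast between similarity search and keyed retrieval
# (illustrative only; not Tenure's actual retrieval mechanism).

beliefs = {
    "mongodb": "Use the raw MongoDB driver, not Mongoose; schemas live in app code.",
    "redis": "Redis is cache-only; never treat it as a source of truth.",
    "typescript": "Strict mode everywhere; no implicit any.",
}

def similarity(a: str, b: str) -> float:
    """Crude word-overlap score, standing in for vector similarity."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

query = "which database driver did we decide on?"

# Similarity search: every belief gets a noisy score against the query,
# and the scores cluster together because the texts share generic vocabulary.
scores = {topic: similarity(query, text) for topic, text in beliefs.items()}

# Precision retrieval: resolve which entity the current turn is about,
# then look up exactly the one belief that applies.
entity = "mongodb"  # assumed output of an upstream entity-resolution step
answer = beliefs[entity]
```

The keyed lookup returns one belief with certainty; the similarity scores only rank candidates, and nothing in the ranking knows that the MongoDB decision, not the Redis one, is what this turn is about.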
Tenure is a local-first, privacy-focused proxy that sits between your client and any LLM. Next session, it already knows your stack, your preferences, and what you ruled out. It doesn't meet you as a stranger.
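Because Tenure sits between the client and the LLM as a proxy, wiring it up for an OpenAI-compatible client would typically amount to a base-URL change. The variable name, port, and path below are placeholder assumptions for illustration, not Tenure's documented interface.

```shell
# Hypothetical wiring (port, path, and env var are placeholders,
# not taken from Tenure's docs):
#   client -> Tenure proxy on localhost -> upstream LLM provider
export OPENAI_BASE_URL="http://localhost:8080/v1"  # assumed proxy listen address
```

Everything the proxy stores stays on localhost, per the description above; the upstream provider only ever sees the requests the proxy forwards.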