Launching today

OpenMolt
Let your code create and manage AI Agents (OpenSource)
104 followers
OpenMolt lets you build programmatic AI agents in Node.js that think, plan, and act using tools, integrations, and memory — directly from your codebase.

Hi everyone 👋
I'm @ybouane, the creator of OpenMolt, the 4th project I'm launching in 2026! (Feel free to follow my build-in-public journey on X.)
I started building it because most AI agent tools I tried were designed primarily as chat assistants. That works well for personal workflows, but it becomes harder to use them inside real applications.
For example, imagine a SaaS backend receiving a request. Instead of running a fixed pipeline, an agent could decide how to complete the task:
gather data
call APIs
generate outputs
update systems
That’s the idea behind OpenMolt.
It’s an open-source framework for programmatic AI agents in Node.js, where agents are defined directly in code with:
instructions
tools
integrations
memory
When triggered, the agent runs a planning + execution loop, deciding which tools to use to complete the task.
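The planning + execution loop described above can be sketched in plain Node.js. This is a toy illustration, not OpenMolt's actual API: the planner is a hard-coded stub where a real agent would consult the LLM, and the tool names are made up.

```javascript
// Toy planning + execution loop (illustrative, not OpenMolt's API):
// the agent repeatedly asks a planner which tool to run next until "done".
const tools = {
  fetchOrders: () => ["order-1", "order-2"],          // stand-in for an API call
  summarize: (orders) => `Summarized ${orders.length} orders`,
};

// Stub planner; in a real agent this decision would come from the LLM.
function plan(state) {
  if (!state.orders) return { tool: "fetchOrders" };
  if (!state.summary) return { tool: "summarize", args: state.orders };
  return { done: true };
}

function runAgent() {
  const state = {};
  for (let step = 0; step < 10; step++) {             // hard cap on iterations
    const decision = plan(state);
    if (decision.done) break;
    const result = tools[decision.tool](decision.args);
    if (decision.tool === "fetchOrders") state.orders = result;
    if (decision.tool === "summarize") state.summary = result;
  }
  return state;
}
```

The key property is that control flow is decided at runtime by the planner, not fixed in advance by the pipeline author.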
Some current features:
tool and API integrations
short-term and long-term memory
scheduling
CLI runner
capability-based permissions (agents only access the tools you explicitly allow)
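To make the capability-based permissions idea concrete, here is a minimal sketch in plain JavaScript (purely illustrative; the tool names and `makeAgent` helper are invented, not OpenMolt's API): the agent can only invoke tools in its explicit allow-list.

```javascript
// Toy capability-based permission check (not OpenMolt's actual API):
// tools outside the allow-list cannot be invoked at all.
function makeAgent(allowedTools, registry) {
  return {
    invoke(name, ...args) {
      if (!allowedTools.includes(name)) {
        throw new Error(`Tool "${name}" not permitted for this agent`);
      }
      return registry[name](...args);
    },
  };
}

const registry = {
  readFile: (file) => `contents of ${file}`,
  sendEmail: (to) => `emailed ${to}`,
};

// This agent was only granted readFile:
const agent = makeAgent(["readFile"], registry);
agent.invoke("readFile", "notes.txt");   // allowed
// agent.invoke("sendEmail", "a@b.c");   // throws: not permitted
```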
The goal is to make AI agents behave more like software systems than prompt scripts.
OpenMolt is still early, and I’m really interested in hearing from developers:
Would you use agents like this inside a backend or SaaS product?
What integrations or capabilities would you expect?
Happy to answer any questions or dive deeper into the architecture.
Thanks for checking it out 🙏
@ybouane Congratulations on the launch! Just a quick question: what's one real-world SaaS backend use case you've tested with OpenMolt, and how did its planning loop handle edge cases like API failures?
@swati_paliwal The project has just started, but here are a few specific use cases:
Blog content generator: Instead of building a whole workflow for generating blog content, you can write a simple prompt, set up your agent, and give it access to integrations like NanoBanana and the File System (you can specify exactly the scope/permissions you want to grant, e.g. a specific folder where it can create the blog posts and upload the images).
The agent then takes care of everything for you, like writing the blog post and injecting the images inside it.
You might think a single LLM prompt could do this, but not really: an LLM cannot currently generate a blog post with images inside. You'd have to generate them separately and figure out where to inject each one. With OpenMolt this is done much more easily.
Customer support assistant: When a user submits a support ticket, an OpenMolt agent can analyze the request, search internal documentation (Notion, files, knowledge base), and generate a helpful response.
If needed, the agent could also:
- query your database
- check the user's account status
- call internal APIs
The agent can then either automatically respond to the user or suggest a response for a human agent to approve.
AI-powered API endpoints: OpenMolt agents can run directly inside backend endpoints.
For example, an endpoint can trigger an agent that:
- retrieves relevant data
- calls tools
- processes information
- returns structured output
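The endpoint pattern above can be sketched as a plain async handler with an Express-like `(req, res)` signature (an assumed shape; the agent here is a stub standing in for a real planning loop):

```javascript
// Sketch of an AI-powered endpoint: the handler delegates to an agent
// and returns structured JSON. runAgentStub fakes the agent's work.
async function runAgentStub(task) {
  const data = { userId: task.userId, orders: 3 };               // retrieve data
  const summary = `User ${data.userId} has ${data.orders} orders`; // process it
  return { status: "ok", summary };                              // structured output
}

// Handler with an Express-like (req, res) shape (assumed, not required).
async function handleSummary(req, res) {
  const result = await runAgentStub({ userId: req.query.userId });
  res.json(result);
}

// Exercise the handler with a minimal fake req/res, no server needed:
handleSummary(
  { query: { userId: "42" } },
  { json: (body) => console.log(JSON.stringify(body)) },
);
```

The caller only ever sees structured JSON; how the agent got there (which tools, how many steps) stays an internal detail.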
SaaS for generating social content: An agent could automatically generate and publish content.
Example flow:
- generate post ideas
- create images using an image generation integration
- format posts for different platforms
- schedule publishing
The whole point is that OpenMolt shines on multi-step tasks that need retries, where conditions can change mid-run and the agent has to recover, adapt, and still complete the work.
I hope this clarifies the purpose of OpenMolt ;)
Snippets AI
Treating AI agents as backend services triggered by API endpoints rather than chat interfaces is the right abstraction for production use — most real-world automation needs to run headless without a human in the loop. The capability-based permissions model is a smart safety default — does OpenMolt support scoping agent permissions dynamically per request, or are they fixed at agent definition time?
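One way per-request scoping could work, shown as a toy sketch in plain JavaScript (purely illustrative; the source doesn't say whether OpenMolt supports this, and `withScope` is an invented helper): a per-request scope can only narrow the allow-list fixed at definition time, never widen it.

```javascript
// Toy comparison: permissions fixed at definition time vs. narrowed per request.
function makeAgent(baseTools) {
  return {
    tools: baseTools,
    // Per-request scope: intersect with the base set so callers
    // can drop capabilities but never grant new ones.
    withScope(requestTools) {
      const narrowed = requestTools.filter((t) => baseTools.includes(t));
      return { tools: narrowed };
    },
  };
}

const agent = makeAgent(["readDocs", "queryDb", "sendEmail"]);
const scoped = agent.withScope(["readDocs", "deleteDb"]); // "deleteDb" is dropped
```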
@svyat_dvoretski Code-first agent definition is underappreciated. Most frameworks force you into YAML or proprietary DSLs. Using Node.js for instructions and tools means you get proper IDE support and can compose logic naturally. OpenMolt's planning + execution loop handles state through long-term and short-term memory stores with persistence callbacks, which helps agents carry context across multi-step workflows and sessions.