
PromptForge - Update AI prompts without redeploying your app

by Ivo
Stop hardcoding LLM prompts. PromptForge lets you write templates with {{variable}} syntax, version every edit automatically, and fetch prompts via REST API. No SDK needed. Pin to a specific version for production stability or always fetch the latest. Works with any LLM: OpenAI, Anthropic, Gemini, Llama, Mistral, and more. Plans from $9/mo. Every plan includes a 14-day free trial.


Replies

Ivo
Maker
📌
Hey Product Hunt! 👋

I built PromptForge because I kept running into the same problem: every time I wanted to tweak a prompt in one of my AI-powered apps, I had to change a string in my code, push a commit, wait for CI, and deploy. For a single sentence change. It felt absurd.

The turning point came when I was A/B testing prompt variations and realized I'd deployed 11 times in one afternoon, just to adjust tone and add a few context variables. There had to be a better way.

PromptForge is that better way. It's a dead-simple API. You write prompt templates with {{variables}}, and your app fetches them at runtime via a single HTTP call. When you want to change a prompt, you edit it in PromptForge and it's live instantly. No deploy. No PR. No waiting.

Every change is versioned automatically, so you can pin production to a specific version while experimenting with the latest in development. If something goes wrong, rolling back means changing a version number, not reverting a git commit.

What makes it different:

- No SDK required: it's a REST API. If your language can make an HTTP request, you're good. Python, Node, Go, Rust, curl, doesn't matter.
- LLM-agnostic: PromptForge doesn't care which model you use. It manages the prompt text; you send it to whatever provider you want.
- Version pinning: production stability without sacrificing iteration speed.

I'd love to hear what you think, and I'm here all day to answer questions. If you're building with LLMs and tired of redeploying for prompt changes, give PromptForge a try. The 14-day trial is free.

Thank you for the support! 🙏
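To make the fetch-at-runtime idea concrete, here's a minimal sketch in TypeScript. The host, endpoint path, prompt ID, and the `version` query parameter are all placeholders I've assumed for illustration, not the documented PromptForge API.

```typescript
// Hypothetical sketch: the host, endpoint path, prompt ID, and the
// `version` query parameter are placeholders, not the real PromptForge API.
const BASE_URL = "https://api.promptforge.example/v1/prompts";

// Build the request URL; pin `version` in production, omit it in
// development to always fetch the latest edit.
function promptUrl(
  promptId: string,
  variables: Record<string, string>,
  version?: number,
): string {
  const params = new URLSearchParams(variables);
  if (version !== undefined) params.set("version", String(version));
  return `${BASE_URL}/${promptId}?${params.toString()}`;
}

// Fetch the rendered prompt text (assumes a JSON body with a `text` field).
async function fetchPrompt(
  promptId: string,
  variables: Record<string, string>,
  version?: number,
): Promise<string> {
  const res = await fetch(promptUrl(promptId, variables, version));
  const body = (await res.json()) as { text: string };
  return body.text;
}

// Production pins version 7; rolling back means changing only this number.
const pinnedUrl = promptUrl("onboarding-email", { tone: "friendly" }, 7);
```

The pinning pattern is the whole trick: dev code calls `promptUrl` without a version and tracks the latest edit, while production passes a fixed version number and only moves when you change it.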
Daniele Packard

Very cool idea - I struggle with this issue for a lot of OpenAI API uses in my app (translation, judgements, etc.)

Does this work for a mobile app in React Native with a Vercel backend?

Ivo
Maker

@daniele_packard Thanks!

Translation and judgement prompts are a great use case, especially since those tend to need constant fine-tuning once you see real user input.


And yes, it works with both. PromptForge is just a REST API, so anything that can make an HTTP request can fetch prompts. From your React Native app you could call it directly, or fetch prompts server-side in your Vercel backend and pass them to OpenAI from there. The second approach is usually better since it keeps your API key off the client.


The setup is the same either way: a single fetch call with your prompt ID and any variables you want to interpolate. Response comes back as JSON.
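A sketch of the server-side option described above, in TypeScript: a Vercel backend fetches the prompt, then forwards it to OpenAI. The endpoint shape, the `text` response field, and the `translate-ui` prompt ID are assumptions for illustration, not the actual API.

```typescript
// Hypothetical sketch of the server-side approach: the backend fetches the
// prompt from PromptForge and forwards it to OpenAI. Endpoint shape and the
// `text` response field are assumptions.
const PROMPTFORGE_URL = "https://api.promptforge.example/v1/prompts";

// One fetch call: prompt ID plus any variables to interpolate; JSON back.
async function getPrompt(
  promptId: string,
  variables: Record<string, string>,
): Promise<string> {
  const query = new URLSearchParams(variables).toString();
  const res = await fetch(`${PROMPTFORGE_URL}/${promptId}?${query}`);
  const body = (await res.json()) as { text: string };
  return body.text;
}

// In a Vercel route handler, the rendered prompt goes straight to the LLM
// provider; the OpenAI call itself is elided here.
async function translateHandler(userText: string, targetLang: string) {
  const systemPrompt = await getPrompt("translate-ui", { targetLang });
  // ...send systemPrompt + userText to OpenAI from the server, keeping
  // the OpenAI API key off the client and out of the React Native bundle.
  return systemPrompt;
}
```

The React Native app would then call your own Vercel endpoint rather than PromptForge or OpenAI directly, which is the key-safety point made above.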


Would love to hear how it works with your translation prompts if you give it a try.