tambo

An AI orchestration framework for React. Open-source.

283 followers

Tambo is the interface layer between AI and React, giving AI the ability to generate and interact with your components, including charts, forms, dashboards, and workflows. It's open-source, used by developers, startups, and Fortune 1000 companies.

Michael Magan

Hey Product Hunt! Tambo helps you build apps that work the way users think: natural language in, React UI out.

The Problem

  1. Feature discovery kills adoption. Users don’t know what features exist, and companies spend billions on docs, research, and support to help them figure it out.

  2. Natural language input is the answer, but most AI apps still return walls of text. Users end up reading instead of doing.

  3. For developers, it’s worse: wiring AI into React components, APIs, or MCPs is brittle and slow.

Teams spend weeks stitching flows that never feel production-ready. As one developer told us:

“I found Tambo on Friday, built a POC over the weekend, and demoed it to the entire eng org on Monday. It saved me so much time.”

What’s Tambo?

Tambo is your app’s interface layer for connecting AI to React components. Instead of wiring flows by hand, you register components once, and Tambo handles the rest.

  • Register components → Tambo knows how and when to render them.

  • Connect APIs or MCPs → expose actions and data through natural language.

  • Stay in sync → props, arrays, nested objects, and streaming updates just work.
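The register-once idea can be sketched roughly like this. This is an illustrative sketch only, not Tambo's actual API (see docs.tambo.co for the real one): each component is described once with a name, a description, and a props shape, the model is shown only the metadata, and the layer resolves the model's choice back to a component.

```typescript
// Illustrative sketch of "register components once". All names and shapes
// here are hypothetical; Tambo's real API lives at docs.tambo.co.
type RegisteredComponent = {
  name: string;
  description: string;           // tells the model when this component fits
  props: Record<string, string>; // prop name -> type hint for the model
};

const registry: RegisteredComponent[] = [
  {
    name: "RevenueChart",
    description: "Line chart of revenue over time",
    props: { points: "Array<{ month: string; revenue: number }>" },
  },
  {
    name: "ContactForm",
    description: "Form for collecting a name and email",
    props: { fields: "string[]" },
  },
];

// The orchestration layer sends this metadata with each request, so the
// model can answer with a component instead of a wall of text.
function metadataForModel(components: RegisteredComponent[]): string {
  return JSON.stringify(
    components.map(({ name, description }) => ({ name, description }))
  );
}

// When the model answers with a component name plus props, look it up.
function pickComponent(name: string): RegisteredComponent | undefined {
  return registry.find((c) => c.name === name);
}
```

The point of the sketch is the division of labor: the developer writes descriptions once, and component selection becomes a lookup rather than hand-wired flow logic.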

It’s open source and already in use by everyone from indie hackers to Fortune 1000 teams.

It makes going from idea → demo a weekend project.

And from demo → scale without rewriting everything.

About Us

We met at a hackathon and realized we were both convinced AI would change the future of UX.

Michael Magán — ex-PM at Taxbit, Convoy, and Indeed; built a viral AI app in 2024.

Michael Milstead — ex-Microsoft engineer, creator of 14+ apps and open-source packages.

Backed by YC alumni and angels.

Try it Out

Get started at tambo.co, check out the docs at docs.tambo.co, or star us on GitHub: github.com/tambo-ai/tambo

Lluís Rovira

@mrmagan_ This is seriously impressive. Totally get the pain of spending weeks stitching together a fragile AI flow that still doesn’t feel solid. The “register once, let it flow” approach is such a breath of fresh air. At Protofy we prototype AI interfaces constantly, and this kind of orchestration layer would save so much time — especially when jumping from demo to something actually usable. Big congrats on the launch! Can't wait to try it out.

Michael Magan

@lluisrovirale Yes! This is what we aim for: developers can focus on design, function, and prompt engineering, while Tambo ensures everything works nicely together.

Let me know what you think. We'd be happy to help you in any way we can.

vivek shukla

Super fast to build Cursor-style apps with tambo. It has built-in chat, streaming, generative UI in chat, controlling the UI from chat, adding tools, and more out of the box.

Last week I created a fully working AI app builder in 4 hours.

Proof :P :- https://github.com/vivek100/zuuma

Michael Magan

@weebhek I love this demo. Thanks for sharing what you built. I can't wait to see where this goes.

I'm curious what the top things you think we should add to Tambo are?

vivek shukla

@mrmagan_ 

Most of the use cases are already covered by tambo, but here are a few things I would like to see:

  1. A built-in option to directly render a tool call's input and output with custom components in chat, or as an interactable component. This saves tokens and tool calls, since it allows one tool call instead of two: one to get the data and another to render the component.

  2. A built-in option to save tool output, or large output like 10k rows, under a memory store key that the AI can use when calling other tools, instead of asking the AI to remember it all. For example, I could pass 10k rows as a provider key to a table component and it can just use them, or pass them to a Python sandbox to run complex code. It could live in memory on the frontend, in Redis, or in an actual DB.

  3. An option to plug in agents built with Mastra or LangGraph, or to add an agent as a tool.
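Suggestion 2 could be sketched like this. Everything here is hypothetical (no such feature exists in Tambo today); it only illustrates the idea of parking large tool output under a short key so later tool calls pass the key around instead of the raw rows.

```typescript
// Hypothetical sketch of a "memory store key" for large tool outputs.
// Nothing here is real Tambo API; it just illustrates the suggestion.
const memoryStore = new Map<string, unknown>();

let nextId = 0;
function saveToolOutput(rows: unknown): string {
  const key = `mem_${nextId++}`;
  memoryStore.set(key, rows);
  return key; // the model only ever sees this short key, saving tokens
}

// A later tool call (a table component, a sandbox, ...) resolves the key.
function loadToolOutput<T>(key: string): T | undefined {
  return memoryStore.get(key) as T | undefined;
}

// e.g. 10k rows go in once; every tool afterwards just passes the key.
const key = saveToolOutput(
  Array.from({ length: 10_000 }, (_, i) => ({ id: i }))
);
const rows = loadToolOutput<{ id: number }[]>(key);
```

The same shape would work with Redis or a database as the backing store instead of an in-memory `Map`.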

Michael Magan

@weebhek These are all great ideas. I'm adding them to the roadmap.

Arjun Aditya

I’ve been exploring tambo over the past few weeks and it feels like a big thing for the LLM industry when it comes to generating UI components.

waiting for the day when big ai companies adopt this sdk for their chat.

for now, I’ve been loving the dashboard and the octo mascot; it’s just so cute 🐙

Congrats to the whole tambo team for the launch, launch on peerlist as well :3

Michael Magan

@nermalcat69, thanks for your open-source contributions to tambo. We are excited to have you as part of the community.

Alex Cloudstar

Wow, Tambo sounds like a game-changer for making intuitive apps with AI! I love the idea of turning natural language into seamless UI workflows. It's great to see a tool that not only saves developers time but also enhances user experience. Can't wait to see all the creative apps people build with this! 🌟

Akhilesh Rangani

One of the best devtools I've come across recently! Super easy to use, and the best part is that it's intuitive too. Yes, there are a few other tools out there, but I feel this one is the easiest to use: you don't have to worry about state management, LLM configs, or even context management. Kudos to the team for building this!

Michael Magan

@akhileshrangani thanks! We are glad you checked it out.

Anand Tyagi

Tambo is the future. After playing around with it it's clear just how much potential there is here. It's easy to see that how we interact with computers is going to fundamentally change in the near future. But Tambo takes that vague notion of a cool potential future and turns it into reality. Gen UI is here and Tambo is paving the way!

Michael Magan

@anand_tyagi1 Thanks, Anand. I'm excited to see what you build with it. We have been thinking about this problem for a long time now and hope to provide shortcuts to better AI experiences.

Sneh Shah

👏 Awesome launch, Tambo team! I’m really curious — how does Tambo decide which React components to generate vs. reuse (when users issue natural-language prompts)? 🤔 And in cases where the AI’s understanding is fuzzy or ambiguous, what’s your fallback — tell the user, offer choices, or degrade gracefully?

Michael Magan

@sneh_shah these are great questions!

1. Tambo passes the component metadata, and the LLM just chooses which to render. Most of our magic is around how we help you handle streaming, and passing state back to the LLM after user interaction.


2. LLMs are really good at understanding fuzzy needs. For example, if you try our AI assistant on our dashboard, tambo.co/dashboard, you can ask it to update the "prompt" of a project and it will correctly render the custom instructions UI. No magic here, just the LLM understanding intent.

https://youtu.be/NOb3o05856o

2.b In the future, we are planning to add systems that help product teams improve it through prompts, knowledge bases, and memory.

3. We have been thinking a lot about how to support multiple choices, tuning responses over time (it gets it wrong once, then remembers and gets it right on all subsequent requests), and so on.
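The state-sync point in answer 1 (passing state back to the LLM after user interaction) can be illustrated roughly like this. The message shapes are made up for the example and are not Tambo's wire format: after the user edits the rendered component, its latest props are serialized into the conversation so the model reasons from what is actually on screen.

```typescript
// Hypothetical illustration of feeding component state back to the model
// after the user interacts. All shapes here are invented for the example.
type ComponentState = { component: string; props: Record<string, unknown> };

type ModelMessage =
  | { role: "user" | "assistant"; content: string }
  | { role: "tool"; content: string };

// Append the current UI state to the history before the next model call,
// so follow-up requests like "add a phone field" have the real context.
function withComponentState(
  history: ModelMessage[],
  state: ComponentState
): ModelMessage[] {
  return [
    ...history,
    { role: "tool", content: `Current UI state: ${JSON.stringify(state)}` },
  ];
}

const history: ModelMessage[] = [
  { role: "user", content: "Show me a signup form" },
];
const next = withComponentState(history, {
  component: "ContactForm",
  props: { fields: ["name", "email"] },
});
```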

Great questions! Keep them coming!
