Matt Granmoe

Otto Engineer - Autonomous AI software engineer that runs in the browser

Otto Engineer is an autonomous AI sidekick that iterates on and tests its own code until it works, using WebContainers to execute code safely in an isolated environment. Otto can install npm packages and work across multiple files, and requires zero setup!

Matt Granmoe
Absolutely thrilled to introduce Otto Engineer to the Product Hunt community! 🚀 Otto is a truly autonomous AI coding sidekick that can use npm packages, write tests for its code, run commands, and iterate until things WORK. No more hallucinated code that doesn't even run, and no more endless rounds of back and forth with an LLM to get working code. Built on the cutting-edge WebContainers technology, it operates entirely within your browser, offering a safe, isolated environment for code execution without any setup required 🦾 Best of all: it's free for anyone to try! We can't wait for you to try Otto and see how it empowers you to build bigger! Dive in, start a new chat with Otto, and watch the magic happen in the embedded terminal and editor. Partner with AI and help us shape the future of coding! 🚀👋
Aris Nakos
@remango wow. First of all, incredibly polished launch with a slick demo and clear pitch. I also learned about Web Containers, which makes so much sense...I will try your product out. Congratulations.
Matt Granmoe
@aris_nakos Thank you! 🙏🏻
Matt Granmoe
@anandreddyks It is currently stateless in the sense you describe. High on my list is allowing you to import a git repo or folder as a starting point, then download the result once Otto is finished. This ties into the super (not so) secret plan to have Otto orchestrate multiple child Otto agents in order to build entire apps 😎
Joep van den Bogaert
💡 Bright idea
Very cool, congrats on the launch! Love that you can just run this in the browser and get started without any setup, well done 👏🏻 Any plans to make Otto operate inside an existing repository as well?
Matt Granmoe
@jopie Yes! This is high on my list because this is the #1 thing that I myself want to use it for. There will definitely be size limitations, but I think there's still massive value in using it for mini and small (and maybe one day medium) repos 🙂
Joep van den Bogaert
@remango Awesome, I think that would immensely increase the value it can bring. Does the size limit come from the LLM context or something else?
Matt Granmoe
@jopie I agree! And yes, it's primarily a limitation of the LLM context length, but cost also comes into play: if we were to simply feed the entire chat history to the LLM on each request, the number of tokens used would increase drastically. I have some techniques for vastly minimizing the number of tokens used to work on a *new* codebase written by Otto, but this only works if it's an AI-first codebase (did I just make up that term? 😝), not an existing one where we don't control how the code is structured.
Sandra Djajic
Sounds like a very helpful tool, and I'm thrilled for all of the developers to try it out! Congratulations on the launch, and I hope you never stop growing and developing! 🎉💪🏼
Matt Granmoe
@sandradjajic Thank you, Sandra!!
Chris W
Ah wow. It's like Devin but we can actually use it. I think running in the browser is probably better too. How complex a project/task are you able to give it at the moment?
Matt Granmoe
@cwbuilds1 Exactly 😂 And yes, running in the browser via WebContainers means: 1) zero setup, so it's super easy to jump into a chat, and 2) code is executed safely in isolation, so file changes don't affect your real file system.
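For anyone curious what in-browser execution like this can look like, here's a minimal sketch using StackBlitz's @webcontainer/api package. This is just an illustration of the general technique, not necessarily how Otto is wired up internally:

import { WebContainer } from '@webcontainer/api';

// Boot an isolated Node.js environment that lives entirely inside the browser tab.
const container = await WebContainer.boot();

// Mount files written by the agent -- nothing here touches the host file system.
await container.mount({
  'index.js': {
    file: { contents: "console.log('hello from an isolated sandbox');" },
  },
});

// Execute the code and stream the output back so the agent can inspect it and iterate.
const proc = await container.spawn('node', ['index.js']);
proc.output.pipeTo(
  new WritableStream({
    write(chunk) {
      console.log(chunk);
    },
  })
);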
Matt Granmoe
@cwbuilds1 At the moment, what I've found Otto really excels at is a small yet complex/hairy TypeScript problem, which is what inspired me to build it in the first place (so I optimized it for this, intentionally or unintentionally). I would often get hung up trying to build a complex type (say something with type inference and generics and mapped types), and I would try to use tools like ChatGPT to help, but I would get code that doesn't work at all and have to go through many rounds of iteration. The key difference that makes Otto able to give you *actually* working code is that it runs and tests its own code, looks at any errors, and improves the code autonomously. I think my favorite example, though it's kind of a smaller-scope one, is the snake case TS utility type that I have in my Loom demo (and also on the landing page) 🙂 It's not that crazy of an example, but it's super *useful* for me in my day-to-day workflow as an engineer.
Abhilash Chowdhary
Going to give this a try, team Otto Engineer. Looks interesting.
Matt Granmoe
@abhilash_chowdhary Thank you! And by the way, the team is just me!
Andy Haynssen
Super cool! Can't wait to see where this goes next! Love that you can trust but verify, instead of just getting a raw, untested hallucination.
Sabari Nathan
An AI sidekick that iterates and tests its own code, plus zero setup required? That's amazing! Congrats on your launch!
Matt Granmoe
@sabari_nathan5 Thank you!
Shushant Lakhyani
Congratulations on the launch! How is this better than other AI software development tools?
Matt Granmoe
@shushant_lakhyani Otto is the first AI coding agent that 1) runs code safely in isolation without affecting your file system, 2) lets the AI execute and test its own code, and 3) requires zero setup. Usually, you have to sacrifice one of these. For example, if the agent runs in your IDE, then it's making changes to your actual file system and executing your code in an unsafe environment, so it would be way too risky to just let the agent run wild and make multiple rounds of changes. If the agent runs in isolation via Docker or similar, setup is much more laborious, startup is slower, and it adds a lot of complexity. As for tools like plain LLM chats, they don't have the ability to execute their code and iterate based on the results. Otto strikes a great balance!
André J
What's the wildest thing you have seen Otto build on his own?
Matt Granmoe
@sentry_co Honestly I really love some of the elegant implementations of complex TypeScript types! This is obviously smaller scale, but it's the reason I built Otto, and it offers clear value in my day-to-day workflow as an engineer. Here's one of its implementations of that:

type SnakeCase<S extends string> = S extends `${infer First}${infer Rest}`
  ? Rest extends Uncapitalize<Rest>
    ? `${First}${SnakeCase<Rest>}`
    : `${First}_${SnakeCase<Uncapitalize<Rest>>}`
  : S;
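In case it helps to see it in action, here's a quick hypothetical usage example for a type like the one above (the input string is just an illustration, not from the demo):

// Assuming the SnakeCase type above: the compiler walks the string one character
// at a time and inserts an underscore before each capitalized segment.
type Example = SnakeCase<'helloWorldFooBar'>; // resolves to 'hello_world_foo_bar'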
Sadath N
Congratulations, you guys did a great job! 🎉
Matt Granmoe
@sadath_n Thank you so much! And by the way, the team is just me!