Launching today

Built for Devs
See how developers really experience your product
Three tools. One platform. Complete developer adoption intelligence. Time-to-value tracking, screen-recorded evaluations with real ICP-matched devs, and an AI engine that tells you exactly what's broken and how to fix it. The intelligence compounds. You've watched the dashboards. Developers still drop off. Now you'll know why.

Congrats on the launch! 🎉 The insight about user interviews being unreliable is spot on — people describe a smoothed-out version of what actually happened, not the real confusion. Screen recordings of unscripted first experiences are a completely different signal. About to launch OceanMind, an AI-powered breathwork iOS app, and the onboarding drop-off problem is exactly what keeps me up at night. Curious whether the platform works for mobile app onboarding too, or is it primarily focused on web-based dev tools and SDKs? The ICP-matched developer evaluation piece is the part I find most compelling — getting real first impressions from people who match your actual user before you’ve burned your launch day on guesswork.
Built for Devs
@alexeyglukharev the language is very developer-focused for your users, but I don't see why you can't. I'm using it for Built for Devs, and my product itself isn't a dev tool. I would think it should work the same. Let me know if you try it: tessa at builtfor.dev
As for developers providing the opinions vs. traditional users, there isn't a replacement for that, but developers know software better than anyone else, so it might end up being fruitful if I can get developers matched to something that's not a dev tool. I don't see why I couldn't find developers who would be interested.
Built for Devs
Built for Devs is the result of productizing a service that drove the greatest results I've ever seen in my career.
I brought real developers in to screen record themselves naturally trying a client's dev tool—no scripts, no hand-holding. Just honest, unfiltered first experiences. Those recordings shaped findings reports that told founders exactly what was broken and what to fix. Red flags for what needed to be addressed first. Quick wins for low-effort opportunities. The full story of how developers experienced every stage of the product.
The results were unlike anything else I'd produced.
One client fixed a handful of friction points and hit Product Hunt #1 product of the day and week. Another completely pivoted their market using insights from 10 developer segments. The recordings alone kept roadmaps full for months.
The service was such a success that it became a platform—and then some.
Dev tool founders now get continuous journey tracking so they always know where developers slow down and disappear. Real developer evaluations matched to their exact ICP—screen recorded, unscripted, and incredibly revealing. And an AI engine that analyzes every data point including video, drafts findings reports richer than anything I could produce by hand, and recommends exactly what to fix.
Not a one-time audit. A living system that gets smarter over time. The more data it collects, the more precise the recommendations get. Founders stop guessing. They know exactly what's broken and exactly what to do about it.
Built for Devs is developer adoption intelligence for dev tools. It shows founders exactly where developers drop off, why they leave, and what to fix—continuously.
If you're a developer reading this—those screen recordings don't record themselves. Built for Devs pays developers to try dev tools. No meetings, no scripts, no hand-holding. Just your honest first experience with a product.
HOW IT WORKS
Built for Devs leverages a tracking script that captures the user's entire journey—from first visit through interaction—tracking pageviews, clicks, form submissions, time spent, errors, rage clicks, and scroll behavior to measure the dev tool's TTV (time to value). It uses that data to provide a fully detailed developer journey with your touchpoints mapped to the right stage, and it leverages everything it learns to continuously provide recommendations for improvement.
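To make the "rage clicks" signal concrete, here is a minimal sketch of how a client-side tracker might flag them—several rapid clicks near the same spot. The function name and thresholds are my assumptions for illustration, not Built for Devs' actual API:

```javascript
// Flag "rage click" bursts: >= 3 clicks within 1 second, all landing
// within a 30px box. Thresholds are illustrative assumptions.
const RAGE_WINDOW_MS = 1000;
const RAGE_MIN_CLICKS = 3;
const RAGE_RADIUS_PX = 30;

function detectRageClicks(clicks) {
  // clicks: [{ x, y, t }] sorted by timestamp t (ms)
  const bursts = [];
  let start = 0;
  for (let end = 0; end < clicks.length; end++) {
    // Slide the window start forward until it fits inside RAGE_WINDOW_MS
    while (clicks[end].t - clicks[start].t > RAGE_WINDOW_MS) start++;
    const windowClicks = clicks.slice(start, end + 1);
    const near = windowClicks.every(
      (c) =>
        Math.abs(c.x - windowClicks[0].x) <= RAGE_RADIUS_PX &&
        Math.abs(c.y - windowClicks[0].y) <= RAGE_RADIUS_PX
    );
    if (near && windowClicks.length >= RAGE_MIN_CLICKS) {
      bursts.push({ at: windowClicks[0], count: windowClicks.length });
    }
  }
  return bursts;
}
```

A real tracker would attach this to document click listeners and batch the results to a backend; the sliding-window shape above is the core idea.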
When developers use the dev tools during an evaluation, the system processes three layers of data: a full transcript of what they say, video analysis revealing where they struggle or get confused, and interaction data captured by the tracking script that records navigation, clicks, and time spent. AI synthesizes these three layers into a findings report, identifying patterns in how developers approach problems and where their product could improve the experience.
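One way the three layers could be synthesized is by aligning them on timestamps and surfacing "friction moments" where a confusion signal coincides with what the developer said and any nearby errors. This is a hypothetical sketch—the data shapes and the `findFrictionMoments` name are illustrative, not the platform's real internals:

```javascript
// Align transcript, video analysis, and interaction events by timestamp.
// transcript: [{ t, text }], videoFlags: [{ t, label }], events: [{ t, type }]
// All timestamps in ms from session start; shapes are assumptions.
function findFrictionMoments(transcript, videoFlags, events, windowMs = 5000) {
  const confusion = videoFlags.filter((f) => f.label === "confused");
  return confusion.map((flag) => ({
    t: flag.t,
    // What the developer said within +/- windowMs of the confusion flag
    said: transcript
      .filter((s) => Math.abs(s.t - flag.t) <= windowMs)
      .map((s) => s.text),
    // JS errors captured by the tracking script near the same moment
    nearbyErrors: events.filter(
      (e) => e.type === "js_error" && Math.abs(e.t - flag.t) <= windowMs
    ).length,
  }));
}
```

Each resulting moment bundles evidence from all three layers, which is the kind of unit a findings report could be built from.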
bunny.net
@tessak22 this is awesome, congrats on the launch Tessa!
Built for Devs
@marek_nalikowski thank you!
This looks really useful. The screen-recorded evaluations with real devs are a great idea, getting unfiltered first impressions before launch sounds way better than guessing. Curious about the time-to-value tracking too, how do you define 'conversion'? Is it something I configure or does it detect it automatically? I'm building a desktop app with Electron so also wondering if this would work for that or if it's web-only.
Built for Devs
@ray_artlas You configure it by setting your "value points," because there can be multiple points per product. In the configuration you also set which pages/endpoints go in which part of the developer journey. This gives you a clear view of your developer journey, and then the events and human data are layered on top of that to provide rich recommendations on what to improve. It's web-only, unfortunately.
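A sketch of what that kind of configuration and TTV calculation might look like. The schema and function below are my assumptions for illustration, not the actual Built for Devs config format:

```javascript
// Hypothetical config: multiple value points per product, plus
// page patterns mapped to journey stages.
const config = {
  valuePoints: ["signup_completed", "first_api_call", "first_deploy"],
  stages: {
    discover: ["/", "/blog/*"],
    evaluate: ["/docs/*", "/pricing"],
    activate: ["/dashboard/*"],
  },
};

// TTV for one value point: elapsed ms from the first tracked event
// to the first event matching that value point, or null if never hit.
function timeToValue(events, valuePoint) {
  if (events.length === 0) return null;
  const first = events[0];
  const hit = events.find((e) => e.name === valuePoint);
  return hit ? hit.t - first.t : null;
}
```

Tracking several value points per product lets a fast path (first API call) and a deep path (first deploy) be measured independently.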
Trufflow
Do you find a difference in data quality between the developers that are being paid to test a tool versus actual users of the dev tool? For instance, I know that my behaviour is different when I'm filling out a survey for a contest versus one that I genuinely am interested in.
Built for Devs
@lienchueh YES! Massive difference. I have some videos on YouTube of evaluations from when it was a service, so they are pretty lengthy. But yes, developers are told to be candid and to express every emotion—good and bad—and they really do in these evaluations. It makes for the most amazing results. One developer, who never curses publicly, dropped a few F-bombs in an evaluation because the OAuth permissions were too loose.
Do you track where developers get stuck during onboarding or is it more focused on the overall experience? Congrats on the launch!
Built for Devs
@mcarmonas It tracks the entire journey, but you have to include the script on every platform the developers touch. We bring the pieces together and show you the entire developer journey from first visit to when they leave, across your site, docs, blog, product, etc. As long as the script loads.
Thank you so much!!!
Built for Devs
Drop one script tag into your site. That's it.
bfd.js tracks how developers actually move through your docs and product—pageviews, clicks, scroll depth, time on page, rage clicks, copy events, JS errors, and form interactions. No fluff. No PII. Sensitive fields and params are automatically redacted.
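As an illustration of the automatic redaction described above, a tracker can scrub likely-sensitive query params before anything leaves the page. The regex and function name here are my assumptions, not bfd.js internals:

```javascript
// Redact query params whose names suggest credentials or PII
// before a URL is included in any tracked event. Pattern is illustrative.
const SENSITIVE = /pass(word)?|token|secret|api[-_]?key|email|auth/i;

function redactParams(rawUrl) {
  const url = new URL(rawUrl);
  for (const key of [...url.searchParams.keys()]) {
    if (SENSITIVE.test(key)) url.searchParams.set(key, "[redacted]");
  }
  return url.toString();
}
```

The same deny-list idea extends to form fields: match on field name or `type="password"` and drop the value before capture.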
Pair it with screen-recorded evaluations from ICP-matched developers in our 6k+ network, and you stop guessing what's broken. You see it.
Three products. One goal: turn drop-offs into adoption.
→ JS tracking script — measures TTV and the full developer journey
→ Screen-recorded evaluations — real developers, your exact ICP, paid to do a thorough job
→ AI recommendations engine — tells you what's broken and what to fix. Gets smarter over time.
Built for dev tool teams who are tired of shipping docs into a void.
the "how developers actually use it" angle is underserved - most UX tools optimize for non-technical users and then try to bolt on developer modes. what kind of signals do you surface that typical session recording misses? I am thinking things like rage clicks on APIs or copy-pasting error messages.
Built for Devs
@mykola_kondratiuk exactly! Between the different data-surfacing tools, it really doesn't miss much, honestly. The tracking script captures the user's entire journey—from first visit through interaction—tracking pageviews, clicks, form submissions, time spent, errors, rage clicks, and scroll behavior.
When developers use the dev tools during an evaluation, the system processes three layers of data: a full transcript of what they say, video analysis revealing where they struggle or get confused, and interaction data captured by the tracking script that records navigation, clicks, and time spent. AI synthesizes these three layers into a findings report, identifying patterns in how developers approach problems and where their product could improve the experience.
the dev tools session data is a great signal - developers instinctively open them when something feels off, so that alone is basically a frustration indicator. that full journey capture sounds really useful for identifying where the evaluation process breaks down.
Built for Devs
@mykola_kondratiuk I agree. I'm glad you see the value like I do!
good luck with the launch!