
noSwag - AI-powered API testing, where the test scripts write themselves

NoSwag turns your Swagger/OpenAPI JSON or YAML into pytest test suites. Upload or link your spec, get Python tests—no hosted runner, no execution platform. Run the tests locally or in your own CI. Supports Swagger 2.0 and OpenAPI 3.0, positive and negative cases, and optional AI-enhanced generation. Spec in, pytest out.


Param
Maker
noSwag 1.0 was "do everything": spec management, AI test generation, cloud execution, dashboards, and reporting. We wanted to be the full API testing platform. That was exciting and also heavy: every new feature pulled in more infra, more moving parts, and more surface area to maintain.

The shift to 2.0 wasn't just a version number. We had to choose what we actually were. Execution on our infra, real-time runners, result storage: all useful, but not the thing we loved building. The thing that felt unique was turning a Swagger JSON into a pytest file you could run anywhere. So we narrowed: noSwag 2.0 is the generator. Spec in, pytest out. No execution platform; you run the tests. That decision cut scope and clarified the product.

Under the hood, 2.0 is a different codebase. We moved from Python/Flask to Node/Express and TypeScript, and from Firebase to PostgreSQL and Prisma. The front end dropped MUI for shadcn/ui and a clearer design system; we added React Query and tidied up auth and state. Some of that was tech debt, but a lot of it was "if we're rebuilding the story, we may as well fix the foundation." The migration was long and sometimes painful, but the result is a stack that fits the product we're building now.

Looking back, 1.0 was us learning what noSwag should be. 2.0 is us committing to that: a focused tool that generates pytest (and other test formats) from OpenAPI/Swagger, and nothing we don't need. The journey from 1.0 to 2.0 is less about features and more about deciding what we're for, and being okay with doing that one thing well.