Launching today
TraceOps

TraceOps! Understand every decision your LLM makes.

TraceOps brings the VCR.py pattern to LLM agents, but at the SDK level rather than the HTTP level. It intercepts openai.chat.completions.create, anthropic.messages.create, tool calls, and agent decisions, recording the full execution trace as a YAML cassette. On replay, it injects the recorded responses without making any real API calls, giving you zero-cost, millisecond-fast, fully deterministic agent tests. The code lives in the ioteverythin/TraceOps repository on GitHub.
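To make the record/replay idea concrete, here is a minimal sketch of the cassette pattern described above. This is not the TraceOps API; the Cassette class and its methods are hypothetical, and JSON stands in for YAML to keep the sketch dependency-free. The core idea is the same: wrap an SDK call, record request/response pairs on the first run, and serve the recorded responses on replay without touching the real API.

```python
import functools
import json

class Cassette:
    """Hypothetical record/replay cassette, in the spirit of VCR.py.
    TraceOps serializes traces as YAML; this sketch keeps them in memory
    and keys them with JSON for simplicity."""

    def __init__(self):
        self.interactions = []   # recorded {request, response} pairs
        self.mode = "record"     # "record" makes real calls; "replay" does not

    def wrap(self, fn):
        """Return a wrapper that records or replays calls to fn."""
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            # Deterministic key for matching a replayed request to its recording.
            key = json.dumps({"args": args, "kwargs": kwargs},
                             sort_keys=True, default=str)
            if self.mode == "replay":
                for rec in self.interactions:
                    if rec["request"] == key:
                        return rec["response"]  # no real API call made
                raise KeyError("no recorded response for this request")
            response = fn(*args, **kwargs)  # real call, record mode only
            self.interactions.append({"request": key, "response": response})
            return response
        return wrapper

# Demo with a stand-in for an LLM SDK call.
real_calls = []

def fake_llm(prompt):
    real_calls.append(prompt)          # track how often the "API" is hit
    return "echo:" + prompt

cassette = Cassette()
llm = cassette.wrap(fake_llm)

first = llm("hello")                   # record mode: hits the real function
cassette.mode = "replay"
second = llm("hello")                  # replay mode: served from the cassette
```

After the replay call, real_calls still contains only the single recorded request, which is what makes the replayed test deterministic and zero-cost.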
Free