
TraceOps: Understand every decision your LLM makes.

TraceOps brings the VCR.py pattern to LLM agents, but at the SDK level rather than the HTTP level. It intercepts openai.chat.completions.create, anthropic.messages.create, tool calls, and agent decisions, recording the full execution trace as a YAML cassette. On replay, it injects the recorded responses without making any real API calls, giving you zero-cost, millisecond-execution, fully deterministic agent tests.
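The record/replay cassette pattern described above can be sketched in a few lines. This is a hypothetical illustration, not TraceOps's actual API: the Cassette class, its wrap method, and the request-matching logic are all assumptions, and JSON stands in for YAML to keep the example dependency-free.

```python
import json
from pathlib import Path

class Cassette:
    """Minimal sketch of SDK-level record/replay (the VCR.py pattern).

    In "record" mode, real calls pass through and their request/response
    pairs are appended to the cassette. In "replay" mode, recorded
    responses are returned in order and no real call is ever made.
    """

    def __init__(self, path, mode="replay"):
        self.path = Path(path)
        self.mode = mode
        self.interactions = json.loads(self.path.read_text()) if mode == "replay" else []
        self._cursor = 0

    def wrap(self, fn):
        def wrapper(**kwargs):
            if self.mode == "record":
                response = fn(**kwargs)  # real SDK call happens only here
                self.interactions.append({"request": kwargs, "response": response})
                return response
            # replay: verify the request matches what was recorded, then
            # inject the stored response without touching the network
            recorded = self.interactions[self._cursor]
            self._cursor += 1
            assert recorded["request"] == kwargs, "request drifted from cassette"
            return recorded["response"]
        return wrapper

    def save(self):
        self.path.write_text(json.dumps(self.interactions, indent=2))

# --- usage sketch, with a stand-in for openai.chat.completions.create ---

def fake_create(**kwargs):
    return {"choices": [{"message": {"content": "hello"}}]}

rec = Cassette("traceops_cassette.json", mode="record")
create = rec.wrap(fake_create)
create(model="gpt-4o", messages=[{"role": "user", "content": "hi"}])
rec.save()

def never_call(**kwargs):
    raise RuntimeError("network call attempted during replay")

replay = Cassette("traceops_cassette.json", mode="replay").wrap(never_call)
out = replay(model="gpt-4o", messages=[{"role": "user", "content": "hi"}])
```

In a real SDK-level interceptor the wrapper would be monkey-patched over the client method itself, so existing test code runs unchanged while every call is recorded or replayed.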
