Visualizes Harmony JSON/JSONL conversations and Codex CLI session logs as interactive, filterable timelines in the browser. For AI engineers debugging agent workflows and teams building on gpt-oss models.
Harmony conversations and Codex CLI session logs are not readable by default. That is a problem when your agent just did something unexpected.
What it is: Euphony is an open-source browser tool from OpenAI that renders Harmony JSON/JSONL files and Codex session logs as structured, interactive conversation timelines.
Problem: AI agents that communicate in the Harmony format produce deeply nested JSON with role tokens, channel markers, and interleaved tool calls. Codex CLI writes a rollout JSONL file to disk after every session. Neither format is designed for human inspection, so debugging agent behaviour means scrolling through hundreds of lines of raw structured data with no visual hierarchy.
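To see why the raw logs are hard to scan, consider what a single JSONL record might look like. This is an illustrative sketch only: the field names (`role`, `channel`, `recipient`, `content`) follow the description above, and the exact Harmony schema may differ.

```python
import json

# Illustrative sketch of one Harmony-style JSONL record; field names
# approximate the format described above, not the official schema.
raw_line = (
    '{"role": "assistant", "channel": "commentary", '
    '"recipient": "functions.get_weather", '
    '"content": [{"type": "text", "text": "{\\"city\\": \\"Berlin\\"}"}]}'
)

message = json.loads(raw_line)  # one JSONL line = one message
print(message["role"], "->", message.get("recipient"))
```

One line like this is manageable; a session log contains hundreds of them, with tool calls and channel switches interleaved, which is the readability gap Euphony targets.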
Solution: Euphony loads that data from a URL, local file, or clipboard paste and renders it as a readable timeline. Filter by role, recipient, or content type using JMESPath. Inspect message metadata. Edit in-browser. Translate using your own OpenAI API key. All processing stays client-side in the default frontend mode.
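The filtering idea is straightforward to sketch. A JMESPath expression such as `[?role=='tool']` selects all tool messages from a conversation; the plain-Python equivalent below uses a hypothetical conversation list with the same illustrative field names (not Euphony's actual internals).

```python
import json

# Hypothetical conversation log; field names are illustrative.
conversation = [
    {"role": "user", "content": "What's the weather in Berlin?"},
    {"role": "assistant", "recipient": "functions.get_weather",
     "content": '{"city": "Berlin"}'},
    {"role": "tool", "content": '{"temp_c": 7}'},
    {"role": "assistant", "content": "It's 7 °C in Berlin."},
]

# Plain-Python equivalent of the JMESPath filter [?role=='tool']
tool_messages = [m for m in conversation if m["role"] == "tool"]
print(json.dumps(tool_messages, indent=2))
```

In Euphony you would type the JMESPath expression into the filter bar instead of writing code; the point is that role, recipient, and content-type filters are declarative queries over the message list.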
What makes it different: It ships as embeddable Web Components, so you can drop a conversation viewer into React, Vue, Svelte, or plain HTML with a single custom element. No need to build a viewer from scratch. The optional FastAPI backend adds Harmony tokenization rendering, which shows exactly how a conversation will be tokenized before it reaches the model.
Key features:
Load from clipboard, local file, or public HTTPS URL
JMESPath filtering by role, recipient, and content type
Metadata inspection panel for annotated datasets
In-browser editing, focus mode, and grid view
Embeddable Web Components, Apache 2.0 license
Who it's for: AI engineers working with gpt-oss models, Codex CLI, or Harmony-format datasets for training, evaluation, or agent pipelines.
Agent debugging has needed proper tooling for a while. Euphony is a small but real step toward that.