Lore

GPT–LLM Playground for macOS

3.0
1 review

6 followers

A native macOS playground for large language models with time travel, automatic versioning, and full-text search. It is model-cost aware, shows tokens used and available, and lets you explore variants, mark notable generations, and leverage examples.
Free Options

William Dar
Maker
Hey Product Hunt! I ended up making a native Mac playground because I was spending hours on end playing with the API in the OpenAI playground, but I wasn't satisfied with the lack of ChatML in the export, and the interface for browsing and restoring history was too clunky for fast iteration and retrieval. The OpenAI playground also currently limits which parameters are exposed and some of their ranges.

The app is immutable by design, with both immediate time travel and automatic versioning. It is written in Swift, with a bit of Rust for the tokenizer. It is faster than you expect it to be.

· It has support for variants, which is the `n` parameter in the chat completion API and the equivalent of the drafts feature in Bard (sketch below).
· It shows model- and ChatML-aware tokens and cost, and it dynamically adjusts response length based on prompt length and context length (sketch below).
· It supports hiding runs that are not useful, marking the ones that are notable, and going through the ancestry of the current run.
· It also has support for examples, as specified in ChatML, conversation names, and personal, local-only notes.
· The full-text search works across all stored text and supports `all`, `any`, `prefix` and `phrase` matching (sketch below); the results are time-ordered rather than ranked for the time being.
· There are several keyboard shortcuts, with more coming, so the app can be used entirely from the keyboard.
· You can export to JSON, and all data is included in the export.
· The app is sandboxed and notarized, and the API keys are stored securely in the Keychain (sketch below).

Next I plan to implement combinatorial runs (mixing multiple values of the same parameter, and multiple models as well) and full Markdown support. Currently only OpenAI's conversational models are supported (GPT-3.5, GPT-4 8K, GPT-4 32K), but I'll be adding support for local models and custom ChatML endpoints as soon as possible.

Give it a try. Any and all feedback welcome! Thoughts and questions as well!
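A minimal sketch of how variants map onto the chat completions `n` parameter. The endpoint and field names follow OpenAI's public API; the model name, function shape and URLSession plumbing are illustrative, not Lore's actual code.

```swift
import Foundation

// Ask the chat completions endpoint for several variants of the same prompt
// by setting `n`. Error handling and streaming are omitted for brevity.
func requestVariants(prompt: String, apiKey: String, n: Int = 3) async throws -> [String] {
    var request = URLRequest(url: URL(string: "https://api.openai.com/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")

    let body: [String: Any] = [
        "model": "gpt-3.5-turbo",                // illustrative; any chat model works
        "n": n,                                  // number of variants ("drafts") to request
        "messages": [["role": "user", "content": prompt]]
    ]
    request.httpBody = try JSONSerialization.data(withJSONObject: body)

    let (data, _) = try await URLSession.shared.data(for: request)
    let json = try JSONSerialization.jsonObject(with: data) as? [String: Any]
    let choices = json?["choices"] as? [[String: Any]] ?? []
    return choices.compactMap { ($0["message"] as? [String: Any])?["content"] as? String }
}
```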
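The dynamic response length presumably comes down to budgeting whatever the context window has left after the prompt. A rough sketch: the reserve margin is arbitrary, and the context size used in the example is GPT-4 8K's published limit.

```swift
// Give the response whatever the context window has left after the prompt,
// minus a small safety margin for message framing.
func maxResponseTokens(promptTokens: Int, contextLength: Int, reserve: Int = 16) -> Int {
    max(0, contextLength - promptTokens - reserve)
}

let budget = maxResponseTokens(promptTokens: 3_200, contextLength: 8_192)  // GPT-4 8K
// budget == 4_976
```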
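I don't know what backs the full-text search, but the `all`, `any`, `prefix` and `phrase` modes map naturally onto SQLite FTS5 query syntax. A sketch of how such a MATCH expression could be built; the FTS5 assumption is mine.

```swift
enum MatchMode { case all, any, prefix, phrase }

// Build an FTS5 MATCH expression for the given mode. Terms are quoted so they
// can't collide with FTS5 operators; `prefix` appends the `*` wildcard.
func ftsQuery(terms: [String], mode: MatchMode) -> String {
    let quoted = terms.map { "\"\($0)\"" }
    switch mode {
    case .all:    return quoted.joined(separator: " AND ")
    case .any:    return quoted.joined(separator: " OR ")
    case .prefix: return quoted.map { $0 + "*" }.joined(separator: " AND ")
    case .phrase: return "\"" + terms.joined(separator: " ") + "\""
    }
}

// ftsQuery(terms: ["time", "travel"], mode: .all)     ->  "time" AND "travel"
// ftsQuery(terms: ["time", "travel"], mode: .phrase)  ->  "time travel"
```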
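And the Keychain storage is presumably a generic-password item via the Security framework; a minimal sketch, with placeholder service and account names rather than whatever Lore actually uses.

```swift
import Foundation
import Security

// Store an API key as a generic-password Keychain item, replacing any existing one.
func storeAPIKey(_ key: String, account: String = "openai") -> Bool {
    var item: [String: Any] = [
        kSecClass as String: kSecClassGenericPassword,
        kSecAttrService as String: "Lore",         // placeholder service name
        kSecAttrAccount as String: account
    ]
    SecItemDelete(item as CFDictionary)            // drop any existing key first
    item[kSecValueData as String] = Data(key.utf8)
    return SecItemAdd(item as CFDictionary, nil) == errSecSuccess
}
```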
Alex Rozanski
@williamdar this is super cool to see! I particularly love the model settings UI on the right (although the haptic feedback on hover feels a bit too much for me, personally). I just launched LlamaChat today (SwiftUI/llama.cpp/SQLite) which allows you to interact with LLaMA models locally: https://www.producthunt.com/post...
William Dar
@alexrozanski1 Hey Alex! Congrats on the launch! LlamaChat is a good idea—a ChatGPT-like native interface to local models is definitely needed. The conversion workflow looks especially great! Yeah, the haptic feedback—I really wish I could control the intensity. It’s a bit too intense for me as well, but I’d rather keep it and put it behind a toggle than do without.