Launching today

Byoky
Share your AI budget without sharing your keys
28 followers
More than a wallet. Byoky is a network that lets your AI tokens flow between people, apps, and providers. Gift your spare budget to anyone. Translate between Claude, GPT, and Gemini on the fly. Connect from your phone via QR. 13 providers, open source.

Hey Product Hunt,
Every AI app wanted its own API key, or asked me to sign in and hand over a new credit card. My keys ended up pasted into a dozen .env files and SaaS dashboards, and I had no real idea who had what.
So I built Byoky.
A wallet that sits between you and every AI provider. Apps connect through it. Nobody sees your actual key. Want full control? Keys stay AES-256 encrypted on your device. Want convenience? Turn on vault sync and your keys work across all your devices automatically.
Once the wallet worked, we built a network on top.
Token gifting. Share your spare API budget with anyone. They use your tokens, never see your key. Budget caps, expiry, instant revocation. We built a Token Pool where the community shares free tokens with each other.
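A minimal sketch of how a gifted-token grant with a budget cap, expiry, and instant revocation could be enforced. The type and function names (`TokenGrant`, `checkGrant`) are illustrative, not Byoky's actual API:

```typescript
// Hypothetical grant record and check; not Byoky's real types.
interface TokenGrant {
  granteeId: string;
  budgetTokens: number;   // hard cap on total tokens spendable
  spentTokens: number;    // running total already consumed
  expiresAt: number;      // epoch millis
  revoked: boolean;       // flipped instantly by the granter
}

type GrantDecision = { ok: true } | { ok: false; reason: string };

function checkGrant(grant: TokenGrant, requestTokens: number, now: number): GrantDecision {
  if (grant.revoked) return { ok: false, reason: "revoked" };
  if (now >= grant.expiresAt) return { ok: false, reason: "expired" };
  if (grant.spentTokens + requestTokens > grant.budgetTokens)
    return { ok: false, reason: "budget exceeded" };
  return { ok: true };
}
```

Because the check runs inside the wallet, the grantee never handles the underlying key, only the decision.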
Cross-provider translation. Your app calls Claude, the user routes it through GPT, and it just works. Anthropic, OpenAI, Gemini, Cohere. Translated on the fly.
Connect from anywhere. No extension? Scan a QR code from your phone. Your keys proxy through your mobile wallet to any browser on any computer.
For developers. Add a "Connect Wallet" button. Your users bring their own keys. You ship AI features without paying for a single API call. Two lines of code, 13+ providers.
Fully open source, MIT licensed. Chrome, Firefox, iOS, Android.
Two questions:
1. What provider or integration should we add next?
2. Would you gift your spare tokens to strangers?
The cross-provider translation is where the hard part lives, and the page is light on how it works in practice. Tool calling is the piece that breaks most abstractions: Anthropic wants tools with input_schema, OpenAI wants functions with parameters, and Gemini wants functionDeclarations with a different shape. Does Byoky normalize all three into one canonical schema on the way in, or does each app ship provider-specific tool definitions through the wallet?
@myultidev Canonical IR. Apps keep shipping their native dialect; one adapter per family parses into a shared IR and serializes out, so cross-family is just dst.serialize(src.parse(body)) — no N×M matrix.
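The adapter-per-family composition can be sketched as follows; the `IRRequest` and `Adapter` shapes are illustrative stand-ins for the real IR, but the point is the shape of `translate`:

```typescript
// Illustrative IR and adapter types, not Byoky's real ones.
interface IRTool { name: string; description: string; parameters: object }
interface IRRequest { tools: IRTool[] }

interface Adapter {
  parse(body: any): IRRequest;    // provider dialect -> shared IR
  serialize(ir: IRRequest): any;  // shared IR -> provider dialect
}

// Cross-family translation needs only N adapters, never an N x M matrix:
function translate(src: Adapter, dst: Adapter, body: any): any {
  return dst.serialize(src.parse(body));
}
```

Adding a fourteenth provider means writing one `parse`/`serialize` pair, not thirteen new translators.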
Tool shape normalizes to {name, description, parameters: JSON Schema} and reshapes back into input_schema / parameters / functionDeclarations at the destination. The schema envelope is the easy part. The real work is on the messages side: Anthropic allows text+images inside a tool_result while the others want flat strings, so the IR keeps the rich form and serializers flatten only where they must. Tool IDs round-trip through the IR.
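The "easy part" of the envelope reshaping looks roughly like this. The provider field names (`input_schema`, `function.parameters`, `functionDeclarations`) match the public APIs; the canonical type and function names are illustrative:

```typescript
// Canonical tool shape, re-skinned per provider family.
interface CanonicalTool { name: string; description: string; parameters: object }

function toAnthropic(t: CanonicalTool) {
  return { name: t.name, description: t.description, input_schema: t.parameters };
}

function toOpenAI(t: CanonicalTool) {
  return {
    type: "function",
    function: { name: t.name, description: t.description, parameters: t.parameters },
  };
}

function toGemini(tools: CanonicalTool[]) {
  // Gemini groups every declaration under one functionDeclarations array.
  return {
    functionDeclarations: tools.map(t => ({
      name: t.name, description: t.description, parameters: t.parameters,
    })),
  };
}
```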
Gemini is the odd one out — pre-3 the functionCall.id is absent, so we synthesize one from the call name; on Gemini 3 the API supplies its own id and we pass it through. Streaming is the other sharp edge. OpenAI has no block scaffolding, so we synthesize start/stop around tool_calls[i].function.arguments fragments (only the first chunk carries id/name, the rest key off index). Gemini doesn't stream args incrementally at all — the full args arrives in one frame — so when serializing to Gemini we buffer IR deltas and flush at block close.
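Synthesizing block scaffolding from OpenAI-style fragments can be sketched like so, under the rules described above: only the first fragment at a given index carries `id`/`name`, later fragments carry only argument text keyed by `index`, and stop events are emitted when the stream closes. The IR event names are illustrative:

```typescript
// Illustrative IR stream events, not Byoky's real ones.
type IREvent =
  | { type: "tool_start"; index: number; id: string; name: string }
  | { type: "tool_delta"; index: number; argsFragment: string }
  | { type: "tool_stop"; index: number };

// Shape of one streamed tool_calls[i] delta from OpenAI.
interface OpenAIToolDelta { index: number; id?: string; function: { name?: string; arguments: string } }

function streamToIR(chunks: OpenAIToolDelta[]): IREvent[] {
  const events: IREvent[] = [];
  const open = new Set<number>();
  for (const c of chunks) {
    if (!open.has(c.index)) {
      // First fragment for this index: synthesize the start event.
      open.add(c.index);
      events.push({ type: "tool_start", index: c.index, id: c.id ?? "", name: c.function.name ?? "" });
    }
    events.push({ type: "tool_delta", index: c.index, argsFragment: c.function.arguments });
  }
  // Close every open block at end of stream.
  for (const i of open) events.push({ type: "tool_stop", index: i });
  return events;
}
```

The Gemini direction is the mirror image: IR deltas are buffered and flushed as one frame at `tool_stop`.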
Cohere has its own tool-call-start/delta/end trio plus a tool_plan pre-tool reasoning channel we map to thinking blocks. A few combinations we refuse loudly rather than silently mangle: forced-tool-by-name against Cohere v2 (their REQUIRED means "any tool," not a named one), tools + application/json response mode against Gemini (the API itself rejects the combo), and n > 1 against Anthropic. The group-swap UI warns on capability mismatch at drag-time using the request log, so it surfaces before the runtime call.
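The refuse-loudly check reduces to a small capability table consulted before the request leaves the wallet. The entries mirror the three combinations named above; the feature keys and function name are illustrative:

```typescript
// Illustrative capability table; entries mirror the refusals described above.
type Feature = "forced_tool_by_name" | "tools_with_json_mode" | "n_greater_than_1";

const unsupported: Record<string, Feature[]> = {
  cohere: ["forced_tool_by_name"],   // Cohere's REQUIRED means "any tool", not a named one
  gemini: ["tools_with_json_mode"],  // the API rejects tools + application/json response mode
  anthropic: ["n_greater_than_1"],   // no multi-sample n parameter
};

function assertSupported(provider: string, features: Feature[]): void {
  for (const f of features) {
    if (unsupported[provider]?.includes(f)) {
      throw new Error(`${provider} does not support ${f}; refusing rather than silently mangling`);
    }
  }
}
```

Running the same table at drag-time is what lets the group-swap UI warn before the runtime call ever happens.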