Launching today

ora
Your personal simultaneous interpreter, on your Mac
Simultaneous interpreters used to be reserved for heads of state. Ora puts one on your Mac. Speak any language, see live translations stream into a floating caption card — entirely on Apple Silicon. No cloud. No account. Free forever.

OpenYak
Not relying on cloud tools or APIs is a smart move. What trade-offs did you make between latency and accuracy when running everything locally on Apple Silicon?
OpenYak
@lak7 Thanks — for us it’s really a trade-off between waiting for more context and staying live enough to be useful.
The pipeline is built around VAD endpointing plus rolling partial updates: Ora starts showing translation while you’re still speaking, keeps revising as the utterance grows, and only commits the final version after a short pause. That gets the experience much closer to simultaneous interpretation than “transcribe first, translate later.”
The quality tier is the second knob: bigger local models improve nuance and terminology, but they’re slower and heavier. So we expose that choice instead of hard-coding one point on the curve.
For real conversations, we’ve found users usually prefer something that lands on time and gets refined in place, rather than something more polished that arrives too late.
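The endpointing-plus-rolling-partials flow described above can be sketched in a few lines. This is a hypothetical illustration, not Ora's actual code: the class name, the chunk-based silence counter, and the commit threshold are all assumptions standing in for a real VAD.

```python
# Illustrative sketch of a VAD-endpointed streaming caption:
# show a revisable partial while speech continues, commit after a pause.
# Names and the 3-chunk silence threshold are assumptions, not Ora's internals.
from dataclasses import dataclass, field

SILENT_CHUNKS_TO_COMMIT = 3  # assumed endpoint: ~3 silent chunks ends the utterance


@dataclass
class StreamingCaption:
    partial: str = ""                                   # live, revisable caption text
    committed: list = field(default_factory=list)       # finalized utterances
    _silence: int = 0

    def feed(self, chunk_text):
        """chunk_text is the re-transcribed utterance so far, or None for a silent chunk."""
        if chunk_text is None:
            self._silence += 1
            if self.partial and self._silence >= SILENT_CHUNKS_TO_COMMIT:
                self.committed.append(self.partial)     # commit the final version
                self.partial = ""
        else:
            self._silence = 0
            self.partial = chunk_text                   # revise the partial in place
        return self.partial


caption = StreamingCaption()
caption.feed("hello")           # partial appears while the speaker is mid-sentence
caption.feed("hello every")     # revised as the utterance grows
caption.feed("hello everyone")
for _ in range(SILENT_CHUNKS_TO_COMMIT):
    caption.feed(None)          # short pause -> the utterance is committed
```

The key property is that the partial is overwritten, never appended, so a later, better hypothesis silently replaces an earlier one in the caption card, and only the pause-confirmed text becomes permanent.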