Launching today

OpenJet
Agentic TUI for self-hosted LLMs on the Edge
An agentic Terminal User Interface for NVIDIA Jetson devices, running local LLMs without any internet connection. Think Claude Code for edge compute: fully local operation, with no cloud transport, no API keys, and no network latency. All inference and logs stay on-device. It is designed for unified-memory systems, where careful memory management is crucial.

I am building a Terminal User Interface (like Claude Code) for self-hosted AI agents on Jetsons. It works in air-gapped environments. Unlike other solutions, it is optimised for unified-memory machines to avoid OOM errors.
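On a unified-memory Jetson, the CPU and GPU draw from the same pool, so checking available memory before loading a model is one way to avoid OOM kills. A minimal sketch, assuming a standard Linux `/proc/meminfo` (the 3500 MiB threshold is an illustrative figure for a ~3 GiB 4-bit model, not a value from OpenJet):

```python
def available_mib(meminfo_path="/proc/meminfo"):
    """Return available system memory in MiB, or 0 if unknown."""
    # MemAvailable is reported in kB by the Linux kernel; on Jetson's
    # unified memory this is the same pool the GPU allocates from.
    try:
        with open(meminfo_path) as f:
            for line in f:
                if line.startswith("MemAvailable:"):
                    return int(line.split()[1]) // 1024
    except OSError:
        pass
    return 0  # unknown -- caller should treat this as "do not load"

# Hypothetical gate before loading a ~3 GiB quantized model:
if available_mib() < 3500:
    print("Not enough headroom; pick a smaller model or tighter quantization.")
```

Failing closed (returning 0 when the file is unreadable) keeps the loader from guessing on non-Linux hosts.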
The agent can read, edit, and create files, and manage and interpret data locally.
Currently it gets ~17 tok/s on a Jetson Orin Nano 8GB using Qwen3-4B-Instruct-4bit. In the future I plan to add TensorRT .engine support, which will boost inference further. I am also trying to get the memory footprint down, so if anyone has knowledge of KV cache optimisation, that would be great.
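For context on the footprint question: the KV cache grows linearly with context length, layer count, and KV-head count, and it competes with model weights for the same 8 GB unified pool. A rough sizing sketch; the model dimensions below are illustrative assumptions, not confirmed Qwen3-4B values:

```python
def kv_cache_bytes(seq_len, n_layers=36, n_kv_heads=8,
                   head_dim=128, bytes_per_elem=2):
    """Estimate KV-cache size in bytes for a given context length."""
    # Factor of 2 covers both the key and value tensors, stored
    # per layer and per KV head (assumes grouped-query attention).
    return 2 * n_layers * n_kv_heads * head_dim * seq_len * bytes_per_elem

# FP16 cache at an 8K context under these assumed dimensions:
print(f"{kv_cache_bytes(8192) / 1024**2:.0f} MiB")  # prints "1152 MiB"
```

The `bytes_per_elem` knob shows why cache quantization helps: dropping from FP16 to an 8-bit cache halves that figure at the same context length.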
I would love your feedback, and for people to try running it on more capable devices and models. Post your results here.
Run:

```
pip install open-jet
open-jet --setup
```