Skeet

Local-friendly command-line copilot that works with any LLM

Skeet: The command-line AI copilot for power users. Resilient. Model-agnostic. Local-friendly. A Swiss Army Knife for your terminal.
Skeet gallery image
Free

Stephan Fitzpatrick
I've been using GitHub Copilot CLI, and while it's great, I found myself wanting something that could work with any LLM (including running local models through Ollama), so I built Skeet.

The key features that make it different:

- Works with any LLM provider through LiteLLM (OpenAI, Anthropic, local models, etc.)
- Automatically retries and adapts commands when they fail
- Can generate and execute Python scripts with dependencies (powered by uv) without virtual-environment hassles

You can try simple tasks like:

```
skeet show me system information
skeet what is using port 8000
skeet --python "what's the current time on the ISS?"
```

Demo: https://asciinema.org/a/697092
Code: https://github.com/knowsuchagenc...

I built it for myself, and I've been really happy with the results. It's interesting to see how different models fare against one another on everyday tasks. If you're running a local model, I've had decent luck with ollama_chat/phi3:medium, but I'm curious to know what others use. Cheers!
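
For anyone curious about the model-agnostic piece: LiteLLM puts every provider behind one OpenAI-style call, selected by provider-prefixed model names like ollama_chat/phi3:medium. The sketch below is a plain LiteLLM call, not Skeet's internal code, just to show what swapping providers looks like at that layer.

```
# Minimal LiteLLM sketch (not Skeet's source): the provider-prefixed
# model name picks the backend; swap the string to change providers.
from litellm import completion

response = completion(
    model="ollama_chat/phi3:medium",  # or "gpt-4o", "anthropic/claude-3-5-sonnet-20240620", ...
    messages=[{"role": "user", "content": "What process is using port 8000 on Linux?"}],
)
print(response.choices[0].message.content)
```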
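
Similarly, the "scripts with dependencies, no virtual-environment hassles" point leans on uv's ability to run single-file scripts with inline metadata (PEP 723). This is only an illustrative example of that mechanism, not Skeet's actual generated output:

```
# /// script
# dependencies = ["rich"]
# ///
# Illustrative PEP 723 script: `uv run iss_time.py` resolves `rich` on the fly,
# no manually managed virtualenv. (The ISS keeps UTC, so the "time on the ISS"
# prompt from the examples above reduces to printing UTC.)
from datetime import datetime, timezone

from rich import print as rprint

now_utc = datetime.now(timezone.utc)
rprint(f"[bold]Current time aboard the ISS (UTC):[/bold] {now_utc:%Y-%m-%d %H:%M:%S}")
```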