Launching today
Airgap: Local AI Chat

Chat with your local LLM. Your data stays private.


Airgap connects your Android device to self-hosted LLM servers such as LM Studio and LocalAI. Your conversations never leave your network: no cloud services, no data collection, no accounts required.

Features:
• OpenAI-compatible APIs
• Real-time streaming responses
• Chat history with search
• Token usage tracking
• Image and file attachments
• Code syntax highlighting
• Multiple server profiles

Built for developers and privacy-conscious users.
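Since the servers Airgap talks to expose the OpenAI-compatible chat completions API, any client on the same network can reach them with a plain HTTP POST. A minimal sketch below — the LAN address, port, and model name are assumptions (LM Studio defaults to port 1234, Ollama to 11434); adjust them for your own setup:

```python
# Sketch: talking to a local OpenAI-compatible server on your own network.
# BASE_URL and "local-model" are placeholders, not values from the app.
import json
import urllib.request

BASE_URL = "http://192.168.1.50:1234/v1"  # hypothetical LAN address of the server

payload = {
    "model": "local-model",  # whatever identifier your server reports
    "messages": [
        {"role": "user", "content": "Hello from my own network!"}
    ],
    "stream": False,  # set True for token-by-token streaming responses
}

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would return the completion JSON; the
# request goes only to the local address above, never to a cloud service.
```

The same request shape works across the compatible servers, which is why one client can support several backends out of the box.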
Free Options


Rodrigo Lopes Martins
Hey Product Hunt! 👋

I built Airgap because I wanted to chat with my local LLM models without sending data to cloud services. If you're running Ollama, LM Studio, or any OpenAI-compatible server at home, this app connects you directly.

Here's what makes it different:
• Everything stays on your local network
• Works with 6+ server types out of the box
• Beautiful Material Design 3 interface
• Free core features, one-time premium for power users

I'd love to hear from you:
• What local LLM setup are you running?
• What features would make this more useful?
• Any bugs or connection issues?

The app is live on Google Play. Try it and let me know what you think!