Local LLM-Vision — Fully Offline iOS AI

Run LLM & Vision AI fully offline on your iPhone

A fully on-device AI app for iOS that runs both large language models (LLMs) and vision-language models (VLMs) entirely offline. Chat with local LLMs or analyze images in real time, all accelerated with Apple Metal. Unlike cloud-based AI apps, it uploads no data and requires no internet connection. You can switch between multiple models depending on speed, size, and reasoning needs. Private, fast, and designed for modern iPhones.

yysu (Maker)
Hi everyone 👋 I'm excited to share my first iOS app after about two months of solo development. I've been working on on-device inference and model optimization, and I wanted to explore a simple question: how far can AI run directly on an iPhone, without relying on the cloud?

Most AI apps today depend on remote APIs. I wanted something different:
• No servers
• No internet required
• No accounts or tracking
• No data leaving your device

So I built an app that runs both LLMs and vision-language models fully on-device, optimized with Apple Metal for real-time performance. It supports multiple local models, and you can switch depending on speed or reasoning needs. Modern iPhones are surprisingly capable, and this project is my attempt to push private, edge AI a bit further.

I'd love feedback from builders who care about privacy-first AI or edge inference. Happy to answer any technical questions!
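
For anyone curious how the model switching described above might be organized, here is a minimal Swift sketch of a local-model registry that picks a model by size, speed, or reasoning needs. Every name in it (LocalModel, ModelRegistry, pick, the file paths) is a hypothetical illustration under the assumptions noted in the comments, not the app's actual API.

import Foundation

// Hypothetical sketch: each local model trades off size, speed, and
// reasoning quality, and the app selects one based on a user preference.
struct LocalModel {
    let name: String
    let fileURL: URL          // e.g. a quantized model file stored on device
    let sizeGB: Double
    let supportsVision: Bool  // true for vision-language models
}

enum Preference { case fastest, smallest, bestReasoning }

struct ModelRegistry {
    let models: [LocalModel]

    // Pick a model; this sketch assumes smaller models run faster on-device.
    func pick(_ preference: Preference, needsVision: Bool = false) -> LocalModel? {
        let candidates = models.filter { !needsVision || $0.supportsVision }
        switch preference {
        case .fastest, .smallest:
            return candidates.min { $0.sizeGB < $1.sizeGB }
        case .bestReasoning:
            return candidates.max { $0.sizeGB < $1.sizeGB }
        }
    }
}

// Example: choose the most capable vision-capable model in the registry.
let registry = ModelRegistry(models: [
    LocalModel(name: "chat-small-q4", fileURL: URL(fileURLWithPath: "/models/chat-small.gguf"),
               sizeGB: 1.2, supportsVision: false),
    LocalModel(name: "vlm-7b-q4", fileURL: URL(fileURLWithPath: "/models/vlm-7b.gguf"),
               sizeGB: 3.9, supportsVision: true),
])
let chosen = registry.pick(.bestReasoning, needsVision: true)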