Building a weather app in Flutter + Rust
DewLogic is a Flutter app on the outside and a Rust engine on the inside. The UI, state management, networking, and platform plumbing are all Dart / Flutter. The hot-path work (IDW blending, isopleth contour generation, model inference, asset management) is all Rust, called over FFI via flutter_rust_bridge v2.
This post is about why, how, and the footguns.
Why two languages
Flutter is exceptional at one thing: a single UI codebase that actually ships to every target (iOS, Android, Windows, macOS, Linux, Web). For a weather app that wants to be on every device you check weather on (phone in the morning, tablet at the barn, desktop in the home office), that's non-negotiable.
Rust is exceptional at a different thing: tight numeric loops, predictable memory layout, and frictionless FFI. When your app is doing IDW blending across dozens of observations per render, generating thousands of isopleth segments per map update, and optionally running a local LLM, you want that code in a language built for it.
Dart is fast enough for most app work and notably improved with WasmGC and AOT compilation, but "fast enough for app work" is not the same as "fast enough for numerical inner loops at 60 fps." Rust is.
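To make the "numerical inner loop" concrete, here is a minimal IDW sketch: blend scattered observations into a single value at a query point, weighting each station by inverse distance. This is an illustrative toy, not DewLogic's actual engine code; the struct and function names are invented.

```rust
// Minimal inverse-distance-weighting sketch (hypothetical, not the
// real DewLogic engine). Each observation pulls the blended value
// toward itself with weight 1 / distance^power.
struct Obs {
    x: f64,
    y: f64,
    value: f64,
}

fn idw_blend(obs: &[Obs], qx: f64, qy: f64, power: f64) -> f64 {
    let mut num = 0.0;
    let mut den = 0.0;
    for o in obs {
        let d2 = (o.x - qx).powi(2) + (o.y - qy).powi(2);
        if d2 < 1e-12 {
            // Query point sits on an observation: return it exactly.
            return o.value;
        }
        // 1 / d^power, computed from squared distance to skip a sqrt.
        let w = 1.0 / d2.powf(power / 2.0);
        num += w * o.value;
        den += w;
    }
    num / den
}

fn main() {
    let obs = [
        Obs { x: 0.0, y: 0.0, value: 10.0 },
        Obs { x: 1.0, y: 0.0, value: 20.0 },
    ];
    // The midpoint of two equidistant stations blends to their mean.
    println!("{}", idw_blend(&obs, 0.5, 0.0, 2.0)); // prints 15
}
```

The real engine runs loops like this across dozens of observations per render, which is exactly the workload where Rust's predictable layout and lack of GC pauses pay off.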
How Flutter Rust Bridge works
FRB scans rust/src/api/*.rs for exported functions and generates matching Dart files in lib/src/rust/api/. Every exported function gets a sequential funcId in the generated dispatcher. When the Dart side calls the generated binding, it passes that funcId over FFI to a Rust dispatcher, which looks up the correct function and invokes it.
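What an exported function looks like, roughly: in FRB v2, a plain `pub fn` in an api module is enough to get scanned and bound. The function below is an invented illustration, not part of DewLogic's real surface.

```rust
// Sketch of an FRB v2 API module (e.g. rust/src/api/idw_engine.rs).
// A plain `pub fn` with FFI-friendly types is picked up by the
// codegen scan. The name and signature here are illustrative.
pub fn blend_observations(values: Vec<f64>, weights: Vec<f64>) -> f64 {
    let den: f64 = weights.iter().sum();
    values
        .iter()
        .zip(&weights)
        .map(|(v, w)| v * w)
        .sum::<f64>()
        / den
}
```

Codegen would emit a matching Dart-side function (camelCased, async by default in FRB v2) that marshals the vectors across the FFI boundary and calls into the dispatcher by funcId.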
Our current surface:
asset.rs → asset.dart: model catalog, download, integrity check, swap
chat.rs → chat.dart: local LLM chat completion
idw_engine.rs → idw_engine.dart: IDW blending, anomaly detection, Virtual Station math
isopleth_engine.rs → isopleth_engine.dart: contour generation for snow, ice, and future overlays
All calls go through a Dart-side facade (DewlogicBridge) that implements an abstract interface (AgentBridgeFacade), which means the rest of the app never touches the generated bindings directly. This matters for testing: we can swap in a pure-Dart mock bridge in tests without touching a single Rust line.
The footgun we hit hardest
If you add, remove, rename, or change the signature of any Rust API function, existing function IDs can shift. Regenerate the bindings without rebuilding the Rust binary and everything still compiles. Your app will run. Every FFI call can hit the wrong Rust function.
The fix is a rigid ritual:
1. Edit the Rust API.
2. Run flutter_rust_bridge_codegen generate.
3. Immediately rebuild the Rust binary (flutter run -d windows triggers cargo build).
If you skip step 3 and hot-reload the app, chat completion will dispatch to the IDW engine, the model download will try to execute contour math, and nothing will make sense. The symptoms are nonsense values, panics in unrelated code, or silent wrong answers. We learned this the hard way.
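A toy model makes the failure mode obvious. Imagine the dispatcher as a match over sequential IDs (this is a simplification for illustration, not FRB's actual generated code):

```rust
// Toy sequential-id dispatcher, to show why a stale Rust binary
// misroutes calls. Not FRB's actual generated code.
fn dispatch(func_id: u32) -> &'static str {
    match func_id {
        0 => "asset::download_model",
        1 => "chat::complete",
        2 => "idw_engine::blend",
        _ => "unknown",
    }
}

fn main() {
    // Suppose the Dart side was regenerated against a NEWER surface
    // where a function was inserted before chat::complete, shifting
    // its id from 1 to 2. Against this stale binary, the "chat" call
    // lands on the IDW engine instead.
    println!("chat call dispatched to: {}", dispatch(2));
}
```

The ids only have meaning when both sides were generated and built from the same API surface, which is why the ritual above is non-negotiable.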
This rule now lives at the top of our CLAUDE.md for any agent touching the repo.
Feature flags across four build systems
The Rust crate uses Cargo features to gate heavy optional dependencies:
inference pulls in llama_cpp for local LLM
vectordb pulls in LanceDB and fastembed for RAG
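In Cargo.toml terms, the gating looks roughly like this (a sketch: the dependency versions, and the exact optional-dependency names, are illustrative):

```toml
# Sketch of feature gating in rust/Cargo.toml. Versions illustrative.
[features]
default = []
inference = ["dep:llama_cpp"]
vectordb = ["dep:lancedb", "dep:fastembed"]

[dependencies]
llama_cpp = { version = "0.3", optional = true }
lancedb = { version = "0.4", optional = true }
fastembed = { version = "3", optional = true }
```

With `default = []`, a plain `cargo build` produces a lean engine; the heavy dependencies only compile when a feature flag is passed in explicitly.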
Default features are empty. Those features have to be enabled per-platform through each platform's build system:
Android (Gradle): CARGOKIT_EXTRA_CARGO_FLAGS env var
Windows (CMake): set(CARGOKIT_EXTRA_CARGO_FLAGS ...)
Linux (CMake): same as Windows
macOS (CocoaPods): export CARGOKIT_EXTRA_CARGO_FLAGS
If a feature isn't enabled on a platform, the functions that depend on it compile as stubs that return errors like "LLM inference not available on this platform." That stub path is by far the top source of "why doesn't local chat work on Android?" bug reports. The answer is almost always that the Gradle flag got dropped.
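The stub pattern itself is simple: the same symbol exists either way, but only one body compiles depending on the feature flag. A sketch (function name and error text illustrative, modeled on the behavior described above):

```rust
// Sketch of the cfg-gated stub pattern. The same public function
// exists whether or not the feature is enabled, so the FFI surface
// (and the funcId table) stays identical across platforms.
#[cfg(feature = "inference")]
pub fn chat_complete(prompt: &str) -> Result<String, String> {
    // Real path: would call into llama_cpp here.
    Ok(format!("(model output for: {prompt})"))
}

#[cfg(not(feature = "inference"))]
pub fn chat_complete(_prompt: &str) -> Result<String, String> {
    // Stub path: compiled when the feature flag was not passed.
    Err("LLM inference not available on this platform".to_string())
}

fn main() {
    // Built without --features inference, the stub path is taken.
    match chat_complete("hello") {
        Ok(out) => println!("{out}"),
        Err(e) => println!("stub: {e}"),
    }
}
```

Keeping both branches behind one signature is what makes the failure graceful: the app gets a clean error to surface instead of a missing symbol at load time.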
What it feels like in practice
Editing the UI: pure Flutter, hot-reload, instant feedback.
Editing the engine: edit Rust, run codegen, rebuild. Slower loop, but the engine rarely changes once it's settled.
Running in production: a single Flutter app binary with Rust compiled into a platform-native library, shipped through the normal Flutter toolchain. No extra server, no extra runtime, no extra deployment.
It's a pleasant setup. The sharp edges are real but survivable once you know where they are.
Happy to go deeper on codegen, Cargo feature plumbing, or how we handle web (where Rust compiles to WASM).