Object store built for AI workloads

High-performance, AI-native object store with an S3 API included

Anvil is an open-source, AI-native object store designed for modern workloads. We built it after hitting the limits of Git LFS, Hugging Face repos, S3, and others when working with multi-GB model files. It is S3-compatible and gRPC-native, and supports:

* Model-aware indexing, so it understands safetensors, GGUF, and ONNX
* Tensor-level streaming
* Erasure-coded storage
* Open source (Apache-2.0)

If you’re storing large models, versioning fine-tunes, or running local inference, we want your feedback.
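The post doesn't show Anvil's internals, but a minimal Python sketch can illustrate what "model-aware indexing" plus "tensor-level streaming" enable for a format like safetensors: the file's JSON header already records every tensor's byte range, so a store that parses it can serve a single tensor without shipping the whole file. The header layout below follows the public safetensors spec; the functions and the in-memory blob are illustrative stand-ins, not Anvil's API:

```python
import json
import struct

def make_blob(tensors):
    # Minimal safetensors-style blob: 8-byte little-endian header length,
    # a JSON header mapping tensor names to dtype/shape/byte offsets,
    # then the raw tensor data.
    data = b"".join(tensors.values())
    header, offset = {}, 0
    for name, raw in tensors.items():
        header[name] = {
            "dtype": "F32",
            "shape": [len(raw) // 4],
            "data_offsets": [offset, offset + len(raw)],
        }
        offset += len(raw)
    hjson = json.dumps(header).encode()
    return struct.pack("<Q", len(hjson)) + hjson + data

def index(blob):
    # "Model-aware indexing": read only the header to learn where each
    # tensor lives, without touching the tensor bytes themselves.
    (hlen,) = struct.unpack("<Q", blob[:8])
    return json.loads(blob[8 : 8 + hlen]), 8 + hlen

def read_tensor(blob, name):
    # "Tensor-level streaming": fetch just one tensor's byte range; over
    # the network this slice would be a ranged GET instead.
    header, base = index(blob)
    start, end = header[name]["data_offsets"]
    return blob[base + start : base + end]

blob = make_blob({"w1": b"\x00" * 16, "w2": b"\x01" * 8})
assert read_tensor(blob, "w2") == b"\x01" * 8
```

Against a real S3-compatible endpoint, `read_tensor` would translate into one small GET for the header and one ranged GET for the tensor, which is why format awareness matters for inference-time loading.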

Courtney Robinson
Hey all, we’re the team behind Anvil. We built this because every existing option failed when serving large models:

* Git LFS broke on multi-GB safetensors
* HF repos weren’t ideal for private/internal hosting
* S3/MinIO treated model files as “dumb blobs”
* Full-model downloads were too slow for inference
* Replication made storage 3× the cost
* Fine-tunes duplicated 10–20 GB base models repeatedly

So we built Anvil as the object store we wish existed. It’s S3-compatible, self-hosted, open-source, and understands ML model formats natively. We run it in production on an 18-node cluster and are finally releasing it to the world.

Happy to answer every question; technical deep dives welcome! If you like what we built, an upvote means the world to us ❤️
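The last two pain points are both storage accounting. As a hedged illustration of the fine-tune duplication half: a content-addressed chunk store keeps identical chunks exactly once, so two models that share base weights pay for those bytes a single time. Everything below (the `ChunkStore` class, the 4-byte chunk size) is a toy sketch of the general technique, not Anvil's actual on-disk format:

```python
import hashlib

CHUNK = 4  # bytes per chunk; real systems chunk in MBs

class ChunkStore:
    """Toy content-addressed store: chunks are keyed by their hash,
    so identical chunks (e.g. shared base-model weights in two
    fine-tunes) are physically stored once."""

    def __init__(self):
        self.chunks = {}  # sha256 hex digest -> chunk bytes

    def put(self, blob):
        # Split the blob into fixed-size chunks; store each chunk only
        # if its hash is new; return the manifest of chunk hashes.
        manifest = []
        for i in range(0, len(blob), CHUNK):
            chunk = blob[i : i + CHUNK]
            digest = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(digest, chunk)  # dedup happens here
            manifest.append(digest)
        return manifest

    def get(self, manifest):
        # Reassemble a blob from its manifest of chunk hashes.
        return b"".join(self.chunks[d] for d in manifest)

store = ChunkStore()
base = bytes(range(16))        # stand-in for 10-20 GB of base weights
finetune = base + b"LoRA"      # fine-tune = base weights + small delta
m1, m2 = store.put(base), store.put(finetune)
# Physical bytes stored: 20 (the 16 base bytes once, plus the 4-byte
# delta), not the 36 bytes of the two blobs combined.
total = sum(len(c) for c in store.chunks.values())
assert total == 20
```

On the replication half, erasure coding attacks the 3× figure directly: a layout such as 4 data + 2 parity shards tolerates two shard failures at 1.5× raw storage, versus 3× for triple replication with similar fault tolerance.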
Olaoluwa Mercy Deborah

@zcourts this will really help cut cost tbh