👋 Hey Product Hunt!
I built **layercache** because I kept running into the same problem in production: cache stampedes taking down my database during traffic spikes.
You know the scenario — 100 requests hit an empty cache at the same time, and your DB gets hammered 100 times for the exact same query. I wanted a cache that handles this automatically, not something I had to bolt on myself.
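For anyone curious how stampede prevention works under the hood, here's a minimal in-process sketch of the single-flight pattern (this is the general technique, not layercache's actual internals): concurrent callers for the same key share one in-flight promise, so the loader only runs once.

```typescript
// Shared map of in-flight loads, keyed by cache key.
const inflight = new Map<string, Promise<unknown>>()

async function singleFlight<T>(key: string, loader: () => Promise<T>): Promise<T> {
  // If a load for this key is already running, piggyback on it.
  const existing = inflight.get(key)
  if (existing) return existing as Promise<T>

  // Otherwise start the load and clean up once it settles.
  const promise = loader().finally(() => inflight.delete(key))
  inflight.set(key, promise)
  return promise
}

// Demo: 100 concurrent requests, but the loader fires once.
(async () => {
  let calls = 0
  const load = () => { calls++; return Promise.resolve('row') }
  const results = await Promise.all(
    Array.from({ length: 100 }, () => singleFlight('user:123', load)),
  )
  console.log(calls, results.every((r) => r === 'row')) // 1 true
})()
```

The `finally` cleanup matters: without it, a failed load would stay cached as a rejected promise and poison every later caller.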
**layercache** is a multi-layer cache for Node.js (Memory → Redis → Disk) with stampede prevention built in from day one:
- 🛡️ **100 concurrent requests = 1 DB call. Always.** No extra config.
- 🔄 **Auto backfill** — L1 miss → L2 hit → L1 auto-filled. Zero plumbing.
- 🏷️ **Tag invalidation** — `invalidateByTag('user:123')` and you're done.
- ⚡ **Distributed single-flight** — Cross-instance dedup via Redis locks.
- 🔧 **Resilience** — Circuit breaker, graceful degradation, stale-while-revalidate.
- 📦 **Tiny footprint** — Only 2 runtime deps (`async-mutex` + `msgpack`).
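If you haven't used tag invalidation before, the idea is simple (conceptual sketch only, not layercache's implementation): keep a side index from each tag to the cache keys it covers, so one call can evict every related entry.

```typescript
// Cache entries plus a tag → keys index.
const store = new Map<string, unknown>()
const tagIndex = new Map<string, Set<string>>()

function setWithTags(key: string, value: unknown, tags: string[]) {
  store.set(key, value)
  for (const tag of tags) {
    if (!tagIndex.has(tag)) tagIndex.set(tag, new Set())
    tagIndex.get(tag)!.add(key)
  }
}

function invalidateByTag(tag: string) {
  // Evict every key registered under this tag, then drop the tag itself.
  for (const key of tagIndex.get(tag) ?? []) store.delete(key)
  tagIndex.delete(tag)
}

// Two entries tagged with the same user...
setWithTags('profile:123', { name: 'Ada' }, ['user:123'])
setWithTags('orders:123', [1, 2], ['user:123'])
// ...one call clears both.
invalidateByTag('user:123')
console.log(store.size) // 0
```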
It also ships with middleware for **Express, Fastify, Hono, tRPC, GraphQL, and Next.js** — plus OpenTelemetry and Prometheus support out of the box.
```ts
import Redis from 'ioredis'
import { CacheStack, MemoryLayer, RedisLayer } from 'layercache'

const cache = new CacheStack([
  new MemoryLayer({ ttl: 60_000 }),                        // L1: in-process, 60s
  new RedisLayer({ client: new Redis(), ttl: 3_600_000 }), // L2: shared, 1h
])

// On a miss, the loader runs once — concurrent callers share the result
const user = await cache.get('user:123', () => db.findUser(123))
```
I'd love to hear what you think! Happy to answer any questions about caching strategy, migration from node-cache-manager/keyv, or anything else.
⭐ If this saves you some database calls, consider starring us on GitHub! https://github.com/flyingsquirre...