InferiaLLM

The Operating System for LLMs in Production

InferiaLLM is an operating system for running LLM inference in-house at scale. It provides everything required to take a raw LLM and serve it to real users: user management, inference proxying, scheduling, policy enforcement, routing, and compute orchestration, all as one system.
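To make the routing and policy-enforcement roles above concrete, here is a toy sketch in Python. This is not InferiaLLM's actual API; every name (`Backend`, `Router`, the quota policy, the least-loaded routing rule) is hypothetical, chosen only to illustrate what such a layer does between users and compute backends.

```python
from dataclasses import dataclass


@dataclass
class Backend:
    """A hypothetical inference backend serving a set of models."""
    name: str
    models: set
    load: int = 0  # number of requests currently routed here


class Router:
    """Toy request router: enforces a per-user quota, then picks
    the least-loaded backend that can serve the requested model."""

    def __init__(self, backends, max_requests_per_user=3):
        self.backends = backends
        self.max_requests = max_requests_per_user
        self.usage = {}  # user -> request count

    def route(self, user, model):
        # Policy enforcement: reject users over their quota.
        count = self.usage.get(user, 0)
        if count >= self.max_requests:
            raise PermissionError(f"quota exceeded for {user}")
        self.usage[user] = count + 1

        # Routing: among backends that serve this model,
        # choose the one with the least load.
        candidates = [b for b in self.backends if model in b.models]
        if not candidates:
            raise LookupError(f"no backend serves {model}")
        chosen = min(candidates, key=lambda b: b.load)
        chosen.load += 1
        return chosen.name
```

In a real system each of these concerns (quotas, scheduling, placement) would be its own subsystem; the point here is only how they compose into a single request path.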
