Kyle Mistele left a comment
This is super awesome! I really love the Prometheus integration. I also saw that Ollama is supported for LLM inference. Out of curiosity, is there official support, or a recommended approach, for tracking stats from vLLM? I know it exposes some metrics (https://docs.vllm.ai/en/latest/s...), and given that it's one of the most popular LLM inference frameworks, this could be a really cool integration...
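For context, vLLM's OpenAI-compatible server exposes Prometheus metrics on its HTTP port, so a scrape config along these lines might already pick them up (port 8000 is vLLM's default; the job name and target host here are just illustrative):

```yaml
# Hypothetical Prometheus scrape config for a local vLLM server.
scrape_configs:
  - job_name: "vllm"            # illustrative job name
    metrics_path: /metrics      # vLLM's Prometheus endpoint
    static_configs:
      - targets: ["localhost:8000"]  # default vLLM serve port; adjust as needed
```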

Openlit: One click observability & evals for LLMs & GPUs
