Marton Schneider

Building ambitious AI products
Stop hand-managing Helm installs. LiteLLM Operator brings a declarative, reconciliation-based workflow to LiteLLM on Kubernetes: bidirectional sync between CRDs and the Admin UI; support for models, teams, users, and keys; and OperatorHub/OpenShift-friendly distribution.
LiteLLM Operator: Run LiteLLM on Kubernetes the operator way
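To make the declarative workflow concrete, a model managed through a custom resource might look like the sketch below. The API group, kind, and field names here are illustrative assumptions, not the operator's documented schema:

```yaml
# Hypothetical custom resource; apiVersion, kind, and spec fields
# are assumptions for illustration, not the actual CRD schema.
apiVersion: litellm.example.com/v1alpha1
kind: LiteLLMModel
metadata:
  name: gpt-4o-proxy
spec:
  modelName: gpt-4o
  provider: openai
  apiKeySecretRef:        # credentials referenced from a Secret,
    name: openai-credentials   # not stored in the resource itself
    key: api-key
```

Applied with `kubectl apply`, a resource like this would be reconciled into the running LiteLLM instance rather than configured by hand through the Admin UI.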
Langfuse Operator makes self-hosting Langfuse on Kubernetes actually production-ready. Deploy the full stack from one custom resource, automate upgrades and migrations, and avoid the usual self-hosting glue. Built for platform teams that want control, repeatability, and a cleaner path to running LLM observability in production.
Langfuse Operator: Self-host Langfuse on Kubernetes, properly
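A "full stack from one custom resource" deployment could be sketched as follows. The API group, kind, and fields are assumptions made up for illustration; the real CRD may differ:

```yaml
# Hypothetical single custom resource describing a Langfuse stack.
# All names below are illustrative assumptions.
apiVersion: langfuse.example.com/v1alpha1
kind: Langfuse
metadata:
  name: langfuse-prod
spec:
  version: "3.2.0"          # operator automates upgrades and DB migrations
  ingress:
    host: langfuse.example.com
  postgres:
    storageSize: 20Gi       # backing stores provisioned by the operator
```

The point of the pattern is that upgrades, migrations, and the usual self-hosting glue become a reconciliation loop's job instead of a runbook's.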
AI agents can write code fast. Codesteward makes sure they do it safely: audit every action, map codebase structure, detect risky data flows, summarize sessions, and improve prompts, all in one self-hosted, open-source platform. Built for teams that want faster AI-assisted development without giving up security, compliance, or control.
Codesteward: Ship AI-written code without losing control