Wei Chen

Aqueduct
Building exciting AI support agents
Built on 10 years of UC Berkeley research, RunLLM reads logs, code, and docs to resolve complex support issues. It saves 30%+ of engineering time, cuts MTTR by 50%, and deflects up to 99% of tickets. Trusted by Databricks, Sourcegraph, and Corelight. Try it for free on your product.
RunLLM
AI that doesn’t just respond—it resolves
Aqueduct's LLM support makes it easy to run open-source LLMs on any infrastructure you use. With a single API call, you can run an LLM on a single prompt or even on a whole dataset, as sketched below.
Aqueduct
The easiest way to run open source LLMs
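
To make the "single call" idea concrete, here is a minimal, illustrative sketch of that pattern. The run_llm helper, its parameters, and the gpt2 backend are assumptions for demonstration only, not Aqueduct's actual API.

```python
# Illustrative sketch only -- this is NOT Aqueduct's documented API. It shows the
# "one call for a single prompt or a whole dataset" pattern, with a hypothetical
# run_llm() helper backed by a Hugging Face text-generation pipeline.
from typing import Union

import pandas as pd
from transformers import pipeline

# Small model used purely for demonstration; real deployments would swap in an
# open-source LLM served on whatever infrastructure is available.
_generator = pipeline("text-generation", model="gpt2")


def run_llm(prompts: Union[str, pd.Series]) -> Union[str, pd.Series]:
    """Run the model on one prompt (str) or a whole column of prompts (Series)."""
    if isinstance(prompts, str):
        return _generator(prompts, max_new_tokens=32)[0]["generated_text"]
    return prompts.map(lambda p: _generator(p, max_new_tokens=32)[0]["generated_text"])


# Single prompt.
print(run_llm("Summarize: the nightly export job failed with a timeout."))

# Whole dataset: the same call, applied column-wise.
df = pd.DataFrame({"ticket": ["Login crashes on iOS", "CSV export is empty"]})
df["response"] = run_llm(df["ticket"])
print(df)
```

The point of the pattern is that one entry point handles both a single string and a full column of data, so the same code path covers ad-hoc prompts and batch jobs.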
Aqueduct automates the engineering required to take data science to production. By abstracting away low-level cloud infrastructure, Aqueduct enables data teams to run models anywhere, publish predictions where they're needed, and monitor results reliably.
Aqueduct
Taking data science to production