Chinmay Singh

TrueFoundry AI Gateway
Building TrueFoundry

About

Currently working on product and business strategy at TrueFoundry - a Generative AI Ops Startup. Have previously worked at McKinsey & Company as a management consultant and at Apple, IBM, and CERN as a Machine Learning Engineer. Curious about business and technology in AI.

Badges

Top 5 Launch
Tastemaker
Gone streaking 10
Gone streaking

Maker History

  • TrueFoundry AI Gateway
    Connect, observe & control LLMs, MCPs, Guardrails & Prompts
    Dec 2025
  • 🎉 Joined Product Hunt
    November 6th, 2025

Forums

Learnings from shipping MCPs to production. Also, we go live tomorrow 🎉

MCPs are winning the protocol race. As a result, we have seen multiple F500 enterprises go live with them. For every company, this has brought the same rough edges to light. Our repeatable learnings are:

  1. Identity first: Decide who can call which tool, with what scope, and how tokens rotate. OAuth and role-based access stop approval chaos.

  2. Stop config sprawl: As servers grow, clients should not juggle endpoints and secrets. Keep one registry for discovery and access.

  3. Mind the transport gap: Many servers speak stdio, but production wants HTTP with health checks and load balancing. Put a gateway in front to translate and scale (see the sketch after this list).

  4. Secrets belong in a vault: Issue scoped credentials, inject them at the edge, and never ship keys in editors or repos.

  5. Trace everything: Users should see tool calls, latency, and cost on one timeline, not stitch them together from five separate logs.

  6. Compose, do not expose: Different teams need different tool sets. Build curated virtual servers and keep dev, staging, and prod cleanly isolated.
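
To make point 3 concrete, here is a minimal sketch of what a stdio-to-HTTP bridge does in principle: it launches a stdio MCP server as a child process and relays newline-delimited JSON-RPC messages over HTTP. The server command, port, and one-request-at-a-time handling are illustrative assumptions; a real gateway adds auth, concurrency, health checks, and load balancing on top.

```python
# Minimal sketch: front a stdio MCP server with HTTP (illustrative only).
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

# Launch the stdio MCP server as a child process (the command is an assumption).
proc = subprocess.Popen(
    ["python", "my_mcp_server.py"],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)

class MCPBridge(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read one JSON-RPC request from the HTTP client...
        length = int(self.headers["Content-Length"])
        body = self.rfile.read(length).decode()
        # ...forward it to the child over stdin (MCP's stdio transport is
        # newline-delimited JSON-RPC)...
        proc.stdin.write(body.strip() + "\n")
        proc.stdin.flush()
        # ...and relay the child's one-line response back over HTTP.
        reply = proc.stdout.readline()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(reply.encode())

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), MCPBridge).serve_forever()
```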

What we built because of this
TrueFoundry's MCP Gateway provides a single entry point with OAuth, fine-grained permissions, a discovery registry, and deep observability. It runs in your cloud or on premises, so data stays in your domain. It can front stdio servers with clean HTTP and lets you compose virtual servers for each team.

AI Gateways: from “just a proxy” to the GenAI control plane

A year ago, an AI/LLM Gateway felt like a thin layer: auth plus simple routing across a few model providers. That era's over. As teams ship agentic apps with many moving parts (models, tools via MCP, prompts, guardrails), the hard problems are now control, standardization, and observability.

What a modern gateway really does:

  • Unified interface & routing: Swap models/providers without code changes; policy-based routing (latency/cost/quality), failover.

  • Centralized access & governance: One place for keys, RBAC, per-team quotas, audit logs, and data residency.

  • Guardrails at the edge: PII redaction, safety/moderation, jailbreak & prompt-injection checks, tool permissioning.

  • Experimentation & evals: Prompt/version management, playgrounds to connect models + MCPs and build agents

  • Deep observability: Traces for prompts/responses/tools, tokens/cost, latency SLOs, drift signals; caching/rate-limits/batching.
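
As a rough illustration of the first point, here is a minimal sketch of the unified-interface idea using an OpenAI-compatible client pointed at a gateway. The base URL, environment variable, and model name are illustrative assumptions, not TrueFoundry's actual API; the point is that swapping the provider behind the gateway becomes a routing-policy change rather than a code change.

```python
# Minimal sketch: one OpenAI-compatible call through a gateway (illustrative).
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://gateway.example.com/v1",  # hypothetical gateway endpoint
    api_key=os.environ["GATEWAY_API_KEY"],      # one key at the gateway, not per provider
)

# The gateway resolves "model" to a provider behind the scenes; routing,
# failover, and quotas are enforced there, not in application code.
response = client.chat.completions.create(
    model="gpt-4o",  # could be rerouted or failed over by gateway policy
    messages=[{"role": "user", "content": "Summarize our Q3 roadmap."}],
)
print(response.choices[0].message.content)
```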
