
Docsector Documentation System
Turn Markdown into beautiful, AI-ready documentation sites
Docsector Reader is a documentation rendering engine built with Vue 3, Quasar v2, and Vite. It transforms Markdown files into beautiful, navigable documentation sites with AI-friendly features: Copy Page for LLMs, MCP Server auto-generation, Markdown Negotiation, LLM Bot Detection, and one-click Open in ChatGPT/Claude. Includes i18n, dark/light mode, Mermaid diagrams, and more.

Hi Product Hunt!
I built Docsector Documentation System because I was frustrated with documentation tools that weren't designed for the AI era and weren't free, quick, or easy to set up. Most doc engines treat docs as static pages, but today AI assistants, LLMs, and browser agents need to read and interact with your docs too.
So I built Docsector Reader with AI-friendly features at its core:
- **Copy Page**: One-click button copies the current page as raw Markdown, ready to paste into LLMs
- **View as Markdown**: Open any page as plain text by appending `.md` to the URL, with locale support (`?lang=`)
- **Markdown Negotiation**: Requests with `Accept: text/markdown` receive Markdown responses, while browsers keep HTML by default
- **Web Bot Auth Directory**: Optional signed JWKS directory at `/.well-known/http-message-signatures-directory` for bot identity verification
- **Open in ChatGPT / Claude**: One-click links to open the current page directly in ChatGPT or Claude for Q&A
- **LLM Bot Detection**: Automatically serves raw Markdown to known AI crawlers (GPTBot, ClaudeBot, PerplexityBot, GrokBot, and others)
- **Sitemap Generation**: Automatic `sitemap.xml` generation at build time with all page URLs (requires `siteUrl` in config)
- **AI-Friendly robots.txt**: Scaffold includes a `robots.txt` explicitly allowing 23 AI crawlers (GPTBot, ClaudeBot, PerplexityBot, GrokBot, etc.)
- **Content Signals**: Optional `Content-Signal` directive for declaring AI usage policy (`ai-train`, `search`, `ai-input`) in `robots.txt`
- **Agent Skills Discovery Index**: Optional `/.well-known/agent-skills/index.json` with RFC v0.2.0 schema and SHA-256 digests
- **MCP Server Card**: Optional `/.well-known/mcp/server-card.json` for MCP server discovery before connection
- **WebMCP Browser Tools**: Optional registration of in-page tools via `navigator.modelContext` for browser agents
- **Homepage Link Headers**: Auto-generated `Link` response headers for agent discovery (`api-catalog`, `service-doc`, `service-desc`, `describedby`) per RFC 8288 / RFC 9727
- **MCP Server**: Auto-generated [MCP](https://modelcontextprotocol.io) server at `/mcp` for AI assistant integration (Claude Desktop, VS Code, etc.)
- **llms.txt / llms-full.txt**: Auto-generated [llms.txt](https://llmstxt.org) index and full-content file for LLM discovery (requires `siteUrl` in config)
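To give a feel for how the Markdown-serving features above fit together, here is a minimal sketch of the decision logic in TypeScript. The function name, bot list, and structure are illustrative assumptions, not Docsector's actual internals:

```typescript
// Illustrative sketch: decide whether a request should get raw Markdown
// instead of the rendered HTML page. Not Docsector's real implementation.

// A few of the AI crawlers mentioned in the feature list (hypothetical subset).
const AI_BOT_PATTERNS: RegExp[] = [/GPTBot/i, /ClaudeBot/i, /PerplexityBot/i, /GrokBot/i];

function shouldServeMarkdown(path: string, accept: string, userAgent: string): boolean {
  // "View as Markdown": explicit .md suffix on the URL
  if (path.endsWith(".md")) return true;
  // "Markdown Negotiation": client asked for text/markdown via Accept header
  if (/\btext\/markdown\b/.test(accept)) return true;
  // "LLM Bot Detection": known AI crawlers get Markdown automatically
  return AI_BOT_PATTERNS.some((re) => re.test(userAgent));
}
```

A regular browser sending `Accept: text/html` falls through all three checks and keeps the HTML site, so humans never notice the negotiation happening.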
On top of that, it's a full-featured doc engine: Markdown rendering with Prism.js, Mermaid diagrams, GitHub-style alerts, i18n (HJSON), dark/light mode, anchor navigation, search, and much more, all from a single config file.
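To make the Content Signals feature concrete, here is an illustrative `robots.txt` fragment combining an AI-crawler allowance with a `Content-Signal` directive. The policy values shown are just an example, not Docsector's defaults:

```text
# Explicitly allow a couple of the AI crawlers (example subset)
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

# Content Signals: declare how fetched content may be used
# (search and AI input allowed, AI training disallowed -- example policy)
User-agent: *
Content-Signal: search=yes, ai-input=yes, ai-train=no
Allow: /
```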
Would love your feedback! Happy to answer any questions.