
prolog-reasoner - Give your LLM a logic engine — SWI-Prolog as an MCP server

by rikarazome
LLMs are great at language but bad at logic. Ask Claude to solve SEND + MORE = MONEY and it guesses wrong. prolog-reasoner gives LLMs access to SWI-Prolog via MCP: the LLM writes Prolog, Prolog solves it, and the answer comes back correct and debuggable. On 30 logic problems, accuracy went from 73% to 90%, with the gains concentrated in constraint satisfaction and multi-step reasoning, exactly where guessing fails. It's an MCP server for Claude Desktop, Claude Code, and Cursor, with rule bases for reusable domain logic. Open source (MIT).
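To make the workflow concrete, here is the kind of program the LLM would hand to SWI-Prolog for SEND + MORE = MONEY, written as a CLP(FD) constraint problem (a sketch of a standard solution, not necessarily the exact code the tool generates):

```prolog
:- use_module(library(clpfd)).

% SEND + MORE = MONEY: each letter is a distinct digit,
% leading letters S and M are nonzero.
send_more_money(Vars) :-
    Vars = [S,E,N,D,M,O,R,Y],
    Vars ins 0..9,
    all_different(Vars),
    S #\= 0, M #\= 0,
              1000*S + 100*E + 10*N + D
            + 1000*M + 100*O + 10*R + E
    #= 10000*M + 1000*O + 100*N + 10*E + Y,
    label(Vars).

% ?- send_more_money(Vars).
% Vars = [9, 5, 6, 7, 1, 0, 8, 2].
% i.e. 9567 + 1085 = 10652
```

The constraint solver enumerates only consistent assignments, which is exactly the step LLMs get wrong when they try to "guess" the carries.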

rikarazome (Maker):
I watched a video about the history of AI: how engineers originally tried to build intelligence with logic programming (like Prolog), hit a wall, and eventually arrived at LLMs through a completely different path. That got me thinking: what if we just connected them? LLMs are great at language but terrible at formal logic. Prolog is great at logic but can't understand natural language. They have complementary strengths.

So I built prolog-reasoner: the LLM writes Prolog, Prolog solves it, and the answer comes back correct and debuggable. I tested it on 30 logic problems across 5 categories, and accuracy went from 73% to 90%. The biggest gains were in constraint satisfaction and multi-step reasoning, exactly where you'd expect a logic engine to help.

What I like most: when it gets the answer wrong, you can see *why*. The Prolog code is right there. No black box.

It's an MCP server, so it works with Claude Desktop, Claude Code, Cursor, and anything else that supports MCP. Setup is adding 5 lines to your config. The rule base feature lets you save domain-specific logic (expense approval rules, compliance checks, etc.) so the LLM only writes case-specific data each time.

Would love feedback, especially if you have use cases in mind. GitHub: https://github.com/rikarazome/pr...
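For readers unfamiliar with MCP setup: clients like Claude Desktop register servers under an `mcpServers` key in a JSON config file (`claude_desktop_config.json` on Claude Desktop). The entry below is a sketch only; the `command` and `args` values are placeholders, so check the repo README for the actual invocation:

```json
{
  "mcpServers": {
    "prolog-reasoner": {
      "command": "<launcher>",
      "args": ["<path-or-package-for-prolog-reasoner>"]
    }
  }
}
```

The client spawns the server as a subprocess and talks to it over stdio, which is why registration really is just a few lines of config.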