Chris Messina

5h ago

Mesh LLM - Pool compute to run powerful open models

Turn spare capacity into an auto-configured p2p inference cloud. Serve many models, access your private models from anywhere, or share compute with others, and let your agents collaborate peer-to-peer.