Felix Lee left a comment
Excited to try this! I set up OpenClaw on my Mac mini yesterday with Ollama serving API access to my local LLM. However, it seems I didn't configure my settings or local LLM correctly, as OpenClaw takes 5-15 minutes to respond to each of my Telegram messages... I hope this helps me toward a cleaner, proper setup! Will this work with a local LLM served via Ollama?

Entropic
The safe, 1-click way to run OpenClaw on your own machine.
