marcusmartins left a comment
Recently I was on a long flight, and having ollama (with llama2) running locally really helped me prototype some quick changes to our product without having to rely on spotty plane wifi.

Ollama: The easiest way to run large language models locally
