Courtney Bellamy

Isn't there a simpler way to run LLMs / models locally?

Hi everyone,

I'm currently exploring a project idea: an ultra-simple tool for launching open source LLM models locally, without the hassle, and I'd like to get your feedback.

The current problem:

I'm not a dev or into IT or anything, but I've become fascinated by the subject of local LLMs. The thing is, running an LLM model on your own PC can be a real pain in the ass: