Are you using an LLM for local inference? Why not ChatGPT/Claude via API?

Greg Z
2 replies

Replies

Eugene Bennett
While I appreciate the flexibility of running an LLM locally, I prefer ChatGPT or Claude via API because of their stronger contextual understanding and conversational performance, and because direct API integration makes deployment across different applications more convenient.
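For a sense of what that convenience looks like, here is a minimal sketch of a hosted API call using the official openai Python client (the model name and prompt are just placeholders, and an OPENAI_API_KEY must be set in the environment); a local setup would instead need model weights, a serving stack, and hardware to run it on:

```python
# Minimal sketch of a hosted-API call with the official openai Python client.
# Placeholder model name and prompt; requires OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY automatically

response = client.chat.completions.create(
    model="gpt-4o-mini",  # hosted model; no local weights or GPU required
    messages=[{"role": "user", "content": "Summarize this ticket in one sentence."}],
)
print(response.choices[0].message.content)
```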
Gary Sztajnman
For data privacy reasons