Running LLMs Locally with Ollama
I’ve been surprised at how many folks are unaware that you can run LLMs of impressive quality locally, on your own personal machine. This means that with a bit of initial setup, you can run an LLM without internet access, without giving OpenAI or Anthropic or Google any money, and without handing your data over to anyone else.
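To make that concrete: once you've installed Ollama and pulled a model (e.g. `ollama pull llama3`), it serves a local HTTP API on port 11434 by default. Here's a minimal sketch of querying it from Python, everything stays on your machine. The model name and prompt are just placeholders; use whatever model you actually pulled.

```python
import json
import urllib.request

# Ollama listens on localhost:11434 by default -- no internet required.
# "llama3" is an assumption; substitute whichever model you've pulled.
payload = {
    "model": "llama3",
    "prompt": "Why is the sky blue?",
    "stream": False,  # ask for one complete response instead of a token stream
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

# The generated text comes back in the "response" field.
print(body["response"])
```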