---
title: "Ollama"
ring: trial
quadrant: tools
tags: [ai, coding]
---

Want to run large language models locally? Downloading [Ollama](https://ollama.com/download) and typing `ollama run llama3` is all you need. Ollama makes it easy to run various open source (open weight) models locally and interact with them, either via the command line or via the [Ollama API](https://github.com/ollama/ollama/blob/main/docs/api.md). Ollama takes care of downloading and running models, and it also supports defining your own model packages in a "Modelfile". At AOE, we use it for local development and testing.
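
As a sketch of what such a "Modelfile" can look like, here is a minimal example that customizes a base model (the base model choice, parameter value, and system prompt are illustrative, not prescriptive):

```
# Build on a model already pulled by Ollama
FROM llama3

# Lower temperature for more deterministic answers (illustrative value)
PARAMETER temperature 0.3

# A custom system prompt baked into the packaged model
SYSTEM "You are a concise assistant for software developers."
```

You can then build and run the packaged model with `ollama create my-assistant -f Modelfile` followed by `ollama run my-assistant` (the name `my-assistant` is just an example).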