---
title: "Ollama"
ring: trial
quadrant: tools
tags: [ai, coding]
---
Running large language models locally?
Downloading [Ollama](https://ollama.com/download) and typing `ollama run llama3` is all you need.
Ollama is great for running various open-source (open-weight) models locally and interacting with them, either from the command line or via the [Ollama API](https://github.com/ollama/ollama/blob/main/docs/api.md).
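
The API side can be sketched with a small Python helper. This is a minimal, non-streaming example against the `/api/generate` endpoint; it assumes an Ollama server is running locally on the default port 11434 and that the `llama3` model has been pulled.

```python
import json
import urllib.request

# Default local endpoint of the Ollama REST API (assumes `ollama serve` is running)
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming generate request for the Ollama API."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # return a single JSON object instead of a stream
    }).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )

# Sending the request (requires a running local Ollama server):
# with urllib.request.urlopen(build_request("llama3", "Why is the sky blue?")) as resp:
#     print(json.loads(resp.read())["response"])
```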
Ollama takes care of downloading and running models, and it lets you define your own model packages in a "Modelfile".
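
A Modelfile looks roughly like this; the base model, parameter value, and system prompt below are illustrative assumptions, not a recommended configuration:

```
# Custom model package built on top of llama3
FROM llama3
PARAMETER temperature 0.7
SYSTEM "You are a concise code-review assistant."
```

You would then build and run it with `ollama create my-reviewer -f Modelfile` and `ollama run my-reviewer` (the name `my-reviewer` is a placeholder).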
At AOE, we use it for local development and testing.