TechradarDev/radar/2024-06-01/ollama.md
2024-07-10 11:05:49 +02:00

---
title: Ollama
ring: trial
quadrant: tools
tags:
  - ai
  - coding
---

Running large language models locally?

Downloading Ollama and typing `ollama run llama3` is all you need.

Ollama makes it easy to run various open-source (open-weight) models locally and interact with them, either from the command line or via the Ollama API.
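As a sketch, a non-streaming generation request against the local Ollama HTTP API (which listens on port 11434 by default; the model name here is just an example) could look like this:

```python
import json
import urllib.request

# Ollama's default local endpoint for single-turn generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(prompt: str, model: str = "llama3") -> dict:
    """Build a non-streaming generation request body."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt: str, model: str = "llama3") -> str:
    """POST the prompt to the local Ollama server and return the completion text."""
    data = json.dumps(build_payload(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Calling `generate(...)` requires a running Ollama server (`ollama serve`) with the model already pulled, e.g. via `ollama pull llama3`.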

Ollama takes care of downloading and running models, and it supports defining your own model packages in a `Modelfile`.
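For illustration, a minimal `Modelfile` might look like the following (the base model, parameter value, and system prompt are placeholder choices):

```
FROM llama3
PARAMETER temperature 0.2
SYSTEM "You are a concise coding assistant."
```

Such a package can then be built and run with `ollama create my-assistant -f Modelfile` followed by `ollama run my-assistant`.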

At AOE, we use it for local development and testing.