docs: proofread and finalize blips for v8

This commit is contained in:
Stefan Rotsch
2024-06-27 10:20:01 +02:00
committed by Stefan Rotsch
parent 60f12f9549
commit 0fedaab680
40 changed files with 76 additions and 114 deletions


@@ -5,12 +5,12 @@ quadrant: tools
 tags: [ai, coding]
 ---
-Running large language models local?
+Running large language models locally?
 Downloading [Ollama](https://ollama.com/download) and typing `ollama run llama3` is all you need.
-Ollama is great to run various open source (open weight) models local and interact with them. You can do this either via the command line or via the [Ollama API](https://github.com/ollama/ollama/blob/main/docs/api.md).
+Ollama is great for running various open source (open weight) models locally and interacting with them. You can do this either via the command line or via the [Ollama API](https://github.com/ollama/ollama/blob/main/docs/api.md).
-Ollama takes care of downloading and running models and it supports the specification of own model packages in a "Modelfile".
+Ollama takes care of downloading and running models, and it supports the specification of your own model packages in a "Modelfile".
 At AOE, we use it for local development and testing.
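Besides the command line, the blip points to the Ollama API. As a minimal sketch (the prompt text is a placeholder; Ollama's REST API listens on `localhost:11434` by default), a `/api/generate` request body could be built like this:

```python
import json

# Payload for Ollama's /api/generate endpoint.
# "model" must name a model already pulled, e.g. via `ollama run llama3`.
payload = {
    "model": "llama3",
    "prompt": "Why is the sky blue?",  # placeholder prompt
    "stream": False,  # return one JSON object instead of a response stream
}

body = json.dumps(payload)
print(body)

# With a running Ollama server, the same request can be sent as:
#   curl http://localhost:11434/api/generate -d "$body"
```

The snippet only constructs and prints the JSON body; actually sending it requires a local Ollama server to be running.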