chore: set v8 release date
committed by Stefan Rotsch
parent f54ce2039c
commit 09cd2016ef
16
radar/2024-07-10/ollama.md
Normal file
@@ -0,0 +1,16 @@
---
title: "Ollama"
ring: trial
quadrant: tools
tags: [ai, coding]
---
Running large language models locally?
Downloading [Ollama](https://ollama.com/download) and typing `ollama run llama3` is all you need.
Ollama is great for running various open source (open weight) models locally and interacting with them. You can do this either via the command line or via the [Ollama API](https://github.com/ollama/ollama/blob/main/docs/api.md).
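The API linked above can be called with nothing beyond the Python standard library. A minimal sketch, assuming an Ollama server is running locally on the default port 11434 and the `llama3` model has already been pulled:

```python
import json
import urllib.request

# Default local endpoint for single-shot completions (see the Ollama API docs).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> dict:
    # Request body for /api/generate; stream=False asks for a single
    # JSON object instead of a stream of response chunks.
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    body = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Usage (requires a running Ollama server):
# print(generate("llama3", "Why is the sky blue?"))
```

The same endpoint also supports streaming responses; dropping `"stream": False` returns newline-delimited JSON chunks instead.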
Ollama takes care of downloading and running models, and it lets you define your own model packages in a `Modelfile`.
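A Modelfile uses a small declarative syntax; a minimal illustrative example (the base model, parameter value, and system prompt are placeholders, not a recommendation):

```
# Illustrative Modelfile: base model and values are examples only.
FROM llama3
PARAMETER temperature 0.7
SYSTEM "You are a concise assistant."
```

The package is then built and run with `ollama create my-assistant -f Modelfile` followed by `ollama run my-assistant` (the name `my-assistant` is hypothetical).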
At AOE, we use it for local development and testing.