diff --git a/radar/2023-11-01/assisted-ai.md b/radar/2023-11-01/assisted-ai.md
index 15959fc..dbecb93 100644
--- a/radar/2023-11-01/assisted-ai.md
+++ b/radar/2023-11-01/assisted-ai.md
@@ -2,7 +2,7 @@
 title: "AI Assisted Programming"
 ring: assess
 quadrant: "methods-and-patterns"
-tags: [coding, architecture]
+tags: [coding, architecture, ai]
 ---
 
 In recent years, the field of Artificial Intelligence (AI) has made monumental strides, and AI has demonstrated its ability to augment human capabilities and enhance user experiences. One noteworthy facet of this evolution is Assisted AI—a paradigm that holds great promise for software development companies.
diff --git a/radar/2023-11-01/mlops.md b/radar/2023-11-01/mlops.md
index b4ea959..8b0c51a 100644
--- a/radar/2023-11-01/mlops.md
+++ b/radar/2023-11-01/mlops.md
@@ -2,6 +2,6 @@
 title: "MLOps"
 ring: assess
 quadrant: methods-and-patterns
-tags: [devops]
+tags: [devops, ai]
 featured: false
 ---
diff --git a/radar/2024-06-01/ollama.md b/radar/2024-06-01/ollama.md
new file mode 100644
index 0000000..7e4c0da
--- /dev/null
+++ b/radar/2024-06-01/ollama.md
@@ -0,0 +1,16 @@
+---
+title: "Ollama"
+ring: trial
+quadrant: tools
+tags: [ai, coding]
+---
+
+Running large language models locally?
+
+Downloading [Ollama](https://ollama.com/download) and typing `ollama run llama3` is all you need.
+
+Ollama makes it easy to run various open source (open weight) models locally and interact with them, either via the command line or via the [Ollama API](https://github.com/ollama/ollama/blob/main/docs/api.md).
+
+Ollama takes care of downloading and running models, and it supports defining your own model packages in a "Modelfile".
+
+At AOE, we use it for local development and testing.
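
The Ollama API linked in the new blip can be exercised with a single HTTP request; a minimal sketch, assuming Ollama is serving on its default port 11434 and the `llama3` model has already been pulled with `ollama run llama3`:

```shell
# Ask the locally running llama3 model for a completion via Ollama's REST API.
# "stream": false returns the whole answer as one JSON object instead of
# newline-delimited streaming chunks.
curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

The generated text is in the `response` field of the returned JSON object.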