ollama (llm/ollama) Updated: 1 week ago

ollama runs and manages LLMs

Get up and running with large language models easily

Version: 0.12.3 License: MIT
Maintainers: rdallman, i0ntempest
Categories: llm
Homepage: https://ollama.com
Platforms: darwin, freebsd, linux
Variants
  • logging (Enable logging for startup item)
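
The port installs with the usual MacPorts commands; the +logging variant listed above enables logging for the startup item:

sudo port install ollama
sudo port install ollama +logging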

"ollama" depends on

run (1)
build (2)

Ports that depend on "ollama"

No ports


Port notes

The example config file is copied to ${prefix}/etc/ollama/ollama_env.conf and its content will be preserved across upgrades and reinstalls. This config file configures ollama to fetch models to ${prefix}/var/ollama/models. The startup item will use this config file by default.

A startup item has been generated that will aid in starting ollama with launchd. It is disabled by default. Execute the following command to start it, and to cause it to launch at startup:

sudo port load ollama
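
For reference, a minimal sketch of what such an environment config might contain, assuming the plain KEY=value env-file format, ollama's documented OLLAMA_MODELS and OLLAMA_HOST variables, and the default MacPorts prefix /opt/local; check the installed ${prefix}/etc/ollama/ollama_env.conf for the actual shipped contents:

# Directory where ollama stores downloaded models
# (the port points this at ${prefix}/var/ollama/models)
OLLAMA_MODELS=/opt/local/var/ollama/models
# Address the ollama server listens on (ollama's default is 127.0.0.1:11434)
OLLAMA_HOST=127.0.0.1:11434

To stop the service and remove it from launchd startup again, use sudo port unload ollama.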



Installations (30 days): 5
Requested Installations (30 days): 5