ollama (llm/ollama) Updated: 1 week ago
ollama runs and manages LLMs. Get up and running with large language models easily.
Version: 0.12.3 License: MIT
Maintainers | rdallman, i0ntempest
Categories | llm
Homepage | https://ollama.com
Platforms | darwin freebsd linux
Variants | (none)
"ollama" depends on
run (1)
build (2)
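The counts above do not name the individual ports. MacPorts itself can list them; the following standard port command (not specific to ollama) prints the build, library, and runtime dependencies:

port deps ollama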
Ports that depend on "ollama"
No ports
Port notes
The example config file is copied to ${prefix}/etc/ollama/ollama_env.conf, and its contents are preserved across upgrades and reinstalls. This config file directs ollama to store fetched models in ${prefix}/var/ollama/models. A startup item has been generated to help start ollama with launchd; it uses this config file by default and is disabled by default. Execute the following command to start ollama now and to have it launch at startup:
sudo port load ollama
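For reference, here is a minimal sketch of what the example config might contain, assuming shell-style KEY=value assignments and the default MacPorts prefix of /opt/local; OLLAMA_MODELS and OLLAMA_HOST are standard ollama environment variables, but the exact contents shipped by the port are an assumption:

# Directory where ollama stores downloaded models (matches the port note above)
OLLAMA_MODELS=/opt/local/var/ollama/models
# Address and port the ollama server listens on (ollama's built-in default)
OLLAMA_HOST=127.0.0.1:11434

Once the service is loaded, the standard ollama CLI can be used to confirm the server is running and to fetch a first model (any model name from the ollama library works here):

ollama list
ollama pull llama3.2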