ramalama (llm/ramalama) Updated: 5 days, 17 hours ago
A tool to simplify the use of local AI models. Ramalama is an open-source developer tool that simplifies serving AI models locally from any source and facilitates their use for inference in production, all through the familiar language of containers.
Version: 0.16.0 License: MIT
GitHub
| Maintainers | neverpanic |
| Categories | science llm |
| Homepage | https://ramalama.ai/ |
| Platforms | darwin |
| Variants | - |
"ramalama" depends on
lib (1)
run (4)
build (5)
Ports that depend on "ramalama"
No ports
Port notes
ramalama defaults to running AI models in podman containers inside a podman machine (i.e., a VM) started by libkrun. This is not the podman default, so you will have to change the provider yourself, either by exporting the CONTAINERS_MACHINE_PROVIDER=libkrun environment variable, or by adding 'provider = "libkrun"' to the '[machine]' section of '$HOME/.config/containers/containers.conf'. See 'man 7 ramalama-macos' for more information.
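The two configuration options described in the note above can be sketched as follows; this assumes a POSIX shell and that no conflicting '[machine]' section already exists in your containers.conf:

```shell
# Option 1 (per-session): set the machine provider via environment variable
export CONTAINERS_MACHINE_PROVIDER=libkrun

# Option 2 (persistent): add the provider to containers.conf.
# NOTE: if the file already has a [machine] section, edit that section
# instead of appending a duplicate one.
mkdir -p "$HOME/.config/containers"
cat >> "$HOME/.config/containers/containers.conf" <<'EOF'
[machine]
provider = "libkrun"
EOF
```

After either change, a newly created podman machine will use libkrun; an existing machine must be recreated for the new provider to take effect.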