ramalama (llm/ramalama) Updated: 2 months ago

A tool to simplify the use of local AI models

RamaLama is an open-source developer tool that simplifies the local serving of AI models from any source and facilitates their use for inference in production, all through the familiar language of containers.

Version: 0.12.2
License: MIT
Maintainers: neverpanic
Categories: science, llm
Homepage: https://ramalama.ai/
Platforms: darwin
Variants: -

"ramalama" depends on

- lib (1)
- run (3)
- build (5)

Ports that depend on "ramalama"

No ports


Port notes

ramalama defaults to running AI models in podman containers inside a podman machine (i.e., a VM) backed by libkrun. libkrun is not podman's default machine provider, so you must select it explicitly, either by exporting the CONTAINERS_MACHINE_PROVIDER=libkrun environment variable or by adding 'provider = "libkrun"' to the '[machine]' section of '$HOME/.config/containers/containers.conf'. See man 7 ramalama-macos for more information.
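The two provider-selection options from the note above can be sketched as follows. This is a minimal sketch: the heredoc append assumes containers.conf does not already contain a '[machine]' section (if it does, edit that section in place instead of appending):

```shell
# Per-session: select libkrun via environment variable
export CONTAINERS_MACHINE_PROVIDER=libkrun

# Persistent: write the provider into containers.conf.
# Assumes no existing [machine] section; otherwise edit it instead.
mkdir -p "$HOME/.config/containers"
cat >> "$HOME/.config/containers/containers.conf" <<'EOF'
[machine]
provider = "libkrun"
EOF
```

After the persistent change, any podman machine created subsequently (and hence ramalama's container runs) will use libkrun; already-created machines keep their original provider.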



Installations (30 days): 1
Requested Installations (30 days): 1

Livecheck results

ramalama seems to have been updated (port version: 0.12.2, new version: 0.14.0)

livecheck ran: 1 day, 10 hours ago