llama.cpp (llm/llama.cpp) Updated: 4 days, 23 hours ago
LLM inference in C/C++
The main goal of llama.cpp is to enable LLM inference with minimal setup and state-of-the-art performance on a wide variety of hardware - locally and in the cloud.
Version: 5634 License: MIT
Statistics for selected duration
2025-May-17 to 2025-Jun-16
| Statistic | Count |
|---|---|
| Total Installations | 3 |
| Requested Installations | 3 |
| Variants | Count |
|---|---|
Monthly Statistics