libllama

Anaconda Verified

LLM inference in C/C++

Installation

To install this package, run the following command:

Conda
$ conda install main::libllama

Description

Inference of Meta's LLaMA model (and others) in pure C/C++

About

Summary

LLM inference in C/C++

Information Last Updated

Dec 9, 2025 at 21:12

License

MIT

Total Downloads

301

Platforms

linux-64: 0.0.7229
linux-aarch64: 0.0.7229
osx-arm64: 0.0.7229
osx-64: 0.0.6082
win-64: 0.0.7229