
libllama

Anaconda Verified

LLM inference in C/C++

Installation

To install this package with conda, run:

$ conda install main::libllama
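A minimal sketch of scripting the install, assuming conda is on PATH (the guard and fallback message are illustrative, not part of the package's documentation):

```shell
# Install libllama from the "main" channel namespace if conda is available;
# otherwise just report the command that would have been run.
if command -v conda >/dev/null 2>&1; then
  conda install -y main::libllama
else
  echo "conda not found; would run: conda install main::libllama"
fi
```

The `-y` flag skips the interactive confirmation prompt, which is useful in CI or provisioning scripts.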

Usage Tracking

Recent versions: 0.0.7984, 0.0.7710, 0.0.7229, 0.0.6872, 0.0.6402
Downloads (last 6 months): 0

Description

Inference of Meta's LLaMA model (and others) in pure C/C++

About

Summary

LLM inference in C/C++

Last Updated

Feb 20, 2026 at 16:04

License

MIT

Total Downloads

550

Version Downloads

65

Supported Platforms

linux-64
linux-aarch64
osx-arm64
win-64

Unsupported Platforms

osx-64 (last supported version: 0.0.6082)