llama-cpp-python 0.3.16, linux-aarch64 build. No description was provided with this upload.

| Field | Value |
| --- | --- |
| Uploaded | Tue Jan 20 21:41:24 2026 |
| arch | aarch64 |
| build | py312h0d1031a_0 |
| depends | diskcache >=5.6.1, fastapi >=0.100.0, jinja2 >=2.11.3, libgcc >=14, libstdcxx >=14, llama.cpp >=0.0.6188,<0.0.6239, numpy >=1.20.0, pydantic-settings >=2.0.1, python >=3.12,<3.13.0a0, pyyaml >=5.1, sse-starlette >=1.6.1, starlette-context >=0.3.6, typing-extensions >=4.5.0, uvicorn >=0.22.0 |
| license | MIT |
| md5 | 8c3ba25f7cbb514be3c7776abd2b77ba |
| name | llama-cpp-python |
| platform | linux |
| sha256 | 6baae72e91d8b50664a15105cf9a6a93f30ce9b3be588cb83a5af2fe03224716 |
| size | 274137 bytes |
| source_url | http://repo.continuum.io/pkgs/main/linux-aarch64/llama-cpp-python-0.3.16-py312h0d1031a_0.tar.bz2 |
| subdir | linux-aarch64 |
| timestamp | 1768943935443 (Unix epoch, milliseconds) |
| version | 0.3.16 |
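
The sha256 and md5 values above can be used to check a downloaded copy of the archive before installing it. Below is a minimal verification sketch in Python; the expected hashes and size come from the table, while the local filename (taken from the source_url) is an assumption about where the file was saved.

```python
import hashlib
from pathlib import Path

# Expected values, copied from the metadata table above.
EXPECTED_SHA256 = "6baae72e91d8b50664a15105cf9a6a93f30ce9b3be588cb83a5af2fe03224716"
EXPECTED_MD5 = "8c3ba25f7cbb514be3c7776abd2b77ba"
EXPECTED_SIZE = 274137  # bytes

# Assumed local path: the filename from source_url, saved in the current directory.
ARCHIVE = Path("llama-cpp-python-0.3.16-py312h0d1031a_0.tar.bz2")


def file_digest(path: Path, algo: str) -> str:
    """Hash a file in 1 MiB chunks and return the hex digest."""
    h = hashlib.new(algo)
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


if __name__ == "__main__":
    assert ARCHIVE.stat().st_size == EXPECTED_SIZE, "size mismatch"
    assert file_digest(ARCHIVE, "sha256") == EXPECTED_SHA256, "sha256 mismatch"
    assert file_digest(ARCHIVE, "md5") == EXPECTED_MD5, "md5 mismatch"
    print("checksums OK:", ARCHIVE.name)
```

In typical use, installing with `conda install llama-cpp-python=0.3.16` lets conda verify checksums against the channel metadata automatically; the script above is only useful if the archive from source_url is fetched by hand.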