Python bindings for llama.cpp, for running GGUF-format LLMs locally; this entry describes the win-64 build 0.3.16.
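As a quick orientation, the sketch below shows the basic completion API exposed by this package. It is a minimal sketch only: it assumes a GGUF model file is already available locally, and the model path, context size, prompt, and sampling parameters are placeholders, not values taken from this package's metadata.

```python
# Minimal usage sketch for llama-cpp-python.
# Assumes a local GGUF model; the path below is a placeholder.
from llama_cpp import Llama

# Load a quantized GGUF model from disk; n_ctx sets the context window.
llm = Llama(model_path="./models/example-model.Q4_K_M.gguf", n_ctx=2048)

# Run a simple completion; stop sequences end generation early.
output = llm(
    "Q: Name the planets in the solar system. A: ",
    max_tokens=64,
    stop=["Q:", "\n"],
)
print(output["choices"][0]["text"])
```

The fastapi, uvicorn, sse-starlette, and starlette-context entries in the dependency list below are used by the package's optional OpenAI-compatible HTTP server (the llama_cpp.server module), which is not shown in this sketch.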
| Field | Value |
| --- | --- |
| Uploaded | Tue Jan 20 21:41:28 2026 |
| arch | x86_64 |
| build | py313h5da7b33_0 |
| depends | diskcache >=5.6.1, fastapi >=0.100.0, jinja2 >=2.11.3, llama.cpp >=0.0.6188,<0.0.6239, numpy >=1.20.0, pydantic-settings >=2.0.1, python >=3.13,<3.14.0a0, python_abi 3.13.* *_cp313, pyyaml >=5.1, sse-starlette >=1.6.1, starlette-context >=0.3.6, typing-extensions >=4.5.0, ucrt >=10.0.20348.0, uvicorn >=0.22.0, vc >=14.2,<15, vc14_runtime >=14.29.30133 |
| license | MIT |
| md5 | dda9fc928cc951b0ea06f8db527691a4 |
| name | llama-cpp-python |
| platform | win |
| sha256 | 4cb5c01a88b9760a6e5f9186476dd016d41b2e0d81147d586d0c2837f68c2f79 |
| size | 279265 bytes (~273 KiB) |
| source_url | http://repo.continuum.io/pkgs/main/win-64/llama-cpp-python-0.3.16-py313h5da7b33_0.tar.bz2 |
| subdir | win-64 |
| timestamp | 1768944022217 (Unix epoch, milliseconds) |
| version | 0.3.16 |
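For anyone fetching the archive directly from the source_url above, a small Python sketch like the following can confirm that the download matches the size, md5, and sha256 values listed in the table. It assumes the URL is still reachable and performs the check in memory.

```python
# Verify the downloaded archive against the size and checksums listed above.
import hashlib
import urllib.request

SOURCE_URL = (
    "http://repo.continuum.io/pkgs/main/win-64/"
    "llama-cpp-python-0.3.16-py313h5da7b33_0.tar.bz2"
)
EXPECTED_SIZE = 279265  # bytes, from the "size" field
EXPECTED_MD5 = "dda9fc928cc951b0ea06f8db527691a4"
EXPECTED_SHA256 = "4cb5c01a88b9760a6e5f9186476dd016d41b2e0d81147d586d0c2837f68c2f79"

# Download the archive into memory (it is small, ~273 KiB).
with urllib.request.urlopen(SOURCE_URL) as resp:
    data = resp.read()

# Compare size and both digests against the listed metadata.
assert len(data) == EXPECTED_SIZE, f"size mismatch: {len(data)}"
assert hashlib.md5(data).hexdigest() == EXPECTED_MD5, "md5 mismatch"
assert hashlib.sha256(data).hexdigest() == EXPECTED_SHA256, "sha256 mismatch"
print("archive matches the listed checksums")
```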