JAX is Autograd and XLA, brought together for high-performance numerical computing and machine learning research. It provides composable transformations of Python+NumPy programs: differentiate, vectorize, parallelize, Just-In-Time compile to GPU/TPU, and more.
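As a rough illustration of those composable transformations, a minimal sketch is below; the toy `predict`/`loss`/`update` functions are made up for the example and are not part of JAX itself.

```python
# A minimal sketch of JAX's composable transformations; the toy
# predict/loss/update functions here are illustrative, not part of JAX.
import jax
import jax.numpy as jnp

def predict(w, x):
    # Single-example prediction: tanh of a dot product.
    return jnp.tanh(jnp.dot(w, x))

# vectorize: map predict over a batch of inputs without a Python loop
batched_predict = jax.vmap(predict, in_axes=(None, 0))

def loss(w, xs, ys):
    # Mean squared error over the batch.
    return jnp.mean((batched_predict(w, xs) - ys) ** 2)

# differentiate: gradient of the loss with respect to the weights
grad_loss = jax.grad(loss)

# just-in-time compile: trace once, then run via XLA on CPU/GPU/TPU
@jax.jit
def update(w, xs, ys, lr=0.1):
    return w - lr * grad_loss(w, xs, ys)

key = jax.random.PRNGKey(0)
w = jax.random.normal(key, (3,))
xs = jax.random.normal(key, (8, 3))
ys = jnp.ones(8)

w = update(w, xs, ys)
print(loss(w, xs, ys))
```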
| Field | Value |
| --- | --- |
| uploaded | Sun Mar 30 23:21:59 2025 |
| arch | x86_64 |
| build | py312h06a4308_0 |
| constrains | protobuf >=3.13,<4 |
| depends | jaxlib >=0.4.19, ml_dtypes >=0.2.0, numpy >=1.26.0,<2.0a0, opt_einsum, python >=3.12,<3.13.0a0, scipy >=1.11.1 |
| license | Apache-2.0 |
| license_family | APACHE |
| md5 | b3d767f3ecfc6e51f260e1c9649100ae |
| name | jax |
| platform | linux |
| sha1 | e4ac6e0d9d372817b488203090064a6cc968b47f |
| sha256 | 97108fc400376ab898b8a0c95fe07990ce77ab25878165d8cb08da902e50a4ad |
| size | 3799083 bytes (≈3.6 MiB) |
| subdir | linux-64 |
| timestamp | 1706213380230 (Unix epoch, milliseconds) |
| version | 0.4.23 |
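As a quick sanity check that this build is the one in use (assuming the package above has been installed into the active environment), something like the following can be run:

```python
# A small sanity check, assuming the jax 0.4.23 package above is installed.
import jax
import jaxlib

print(jax.__version__)     # expected: 0.4.23
print(jaxlib.__version__)  # should satisfy the jaxlib >=0.4.19 dependency
print(jax.devices())       # lists the available CPU/GPU/TPU backends
```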