A faster tokenizer for the json-stream Python library. It's actually just json-stream's own tokenizer (itself adapted from the NAYA project) ported to Rust almost verbatim and made available as a Python module using PyO3. On my machine, it speeds up parsing by a factor of 4–10, depending on the nature of the data.
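Below is a minimal usage sketch, not taken from this package's documentation. It assumes that recent json-stream releases detect and use the Rust tokenizer automatically once `json-stream-rs-tokenizer` is installed, and that `json_stream.load` accepts any file-like stream; the tiny in-memory payload is illustrative only.

```python
import io

import json_stream  # the json-stream library this package accelerates

# A real workload would stream a large document from a file or socket;
# this small in-memory payload is just a stand-in for illustration.
data = io.StringIO('{"results": [{"id": 1}, {"id": 2}, {"id": 3}]}')

# Assumption: with json-stream-rs-tokenizer installed, recent json-stream
# versions pick up the Rust tokenizer automatically, so no extra wiring
# is needed here.
for item in json_stream.load(data)["results"]:
    print(item["id"])
```

For older json-stream versions without automatic detection, passing the tokenizer explicitly, e.g. `json_stream.load(f, tokenizer=json_stream_rs_tokenizer.RustTokenizer)`, should also work, assuming json-stream's `tokenizer` keyword argument is available.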
| Field | Value |
| --- | --- |
| Uploaded | Mon Mar 31 22:05:48 2025 |
| arch | x86_64 |
| build | py312h3038463_1 |
| build_number | 1 |
| depends | libgcc-ng >=11.2.0, libstdcxx-ng >=11.2.0, python >=3.12,<3.13.0a0 |
| license | MIT |
| license_family | MIT |
| md5 | afe36ed3c60b3fd849ab153dc0eabc71 |
| name | json-stream-rs-tokenizer |
| platform | linux |
| sha1 | fb2fcb08fe598394d82ecc45b4d782a6097577f0 |
| sha256 | 2141e5bbc0be1e78cd434fd6f80825bdcb38d1f1d728ca259455d93144a104b4 |
| size | 943281 |
| subdir | linux-64 |
| timestamp | 1698888428701 |
| version | 0.4.22 |