A faster tokenizer for the json-stream Python library. It is json-stream's own tokenizer (itself adapted from the NAYA project) ported to Rust almost verbatim and exposed as a Python extension module via PyO3. On my machine it speeds up parsing by a factor of 4–10, depending on the nature of the data.
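To illustrate what a tokenizer of this kind does, here is a minimal pure-Python sketch of a streaming JSON tokenizer in the spirit of the NAYA-derived design: it reads one character at a time from a file-like object and yields tokens as it goes, without materializing the whole document. The names (`tokenize`, the token-kind constants) and the simplified escape handling are illustrative assumptions, not the library's actual code or API.

```python
import io

# Illustrative token kinds (not the library's actual constants).
OPERATOR, STRING, NUMBER, BOOLEAN, NULL = range(5)

def tokenize(stream):
    """Yield (kind, value) tuples, reading one character at a time."""
    c = stream.read(1)
    while c:
        if c.isspace():
            c = stream.read(1)
        elif c in "{}[]:,":
            # Structural characters are single-character tokens.
            yield (OPERATOR, c)
            c = stream.read(1)
        elif c == '"':
            # String: consume until the closing quote (escapes simplified).
            chars = []
            c = stream.read(1)
            while c != '"':
                if c == "\\":
                    c = stream.read(1)  # keep the escaped char verbatim
                chars.append(c)
                c = stream.read(1)
            yield (STRING, "".join(chars))
            c = stream.read(1)
        elif c == "-" or c.isdigit():
            # Number: accumulate until a non-number character appears.
            chars = [c]
            c = stream.read(1)
            while c and c in "0123456789.eE+-":
                chars.append(c)
                c = stream.read(1)
            text = "".join(chars)
            yield (NUMBER, float(text) if any(x in text for x in ".eE") else int(text))
        else:
            # Literals: true / false / null.
            chars = [c]
            c = stream.read(1)
            while c and c.isalpha():
                chars.append(c)
                c = stream.read(1)
            word = "".join(chars)
            if word == "null":
                yield (NULL, None)
            else:
                yield (BOOLEAN, word == "true")

tokens = list(tokenize(io.StringIO('{"a": [1, true, null]}')))
```

Because the input is consumed character by character, the same loop works on sockets or huge files; the Rust port keeps this structure but moves the per-character hot loop out of the Python interpreter, which is where the 4–10x speedup comes from.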
| field | value |
|---|---|
| uploaded | Mon Mar 31 22:05:47 2025 |
| md5 | 1afc7cf3127a80113281f67598b9b11f |
| arch | x86_64 |
| build | py311h95f1b2d_2 |
| build_number | 2 |
| depends | libgcc-ng >=11.2.0, libstdcxx-ng >=11.2.0, python >=3.11,<3.12.0a0 |
| license | MIT |
| license_family | MIT |
| name | json-stream-rs-tokenizer |
| platform | linux |
| sha1 | 61ebb921d2e5749666233e3659b9e7a693f7d185 |
| sha256 | e05628f34f478feb2087e4e310da113fa1914d3175dea0fae266cd3fcf10cc4a |
| size | 330506 |
| subdir | linux-64 |
| timestamp | 1732227818634 |
| version | 0.4.22 |