A faster tokenizer for the json-stream Python library. It's actually just json-stream's own tokenizer (itself adapted from the NAYA project) ported to Rust almost verbatim and made available as a Python module using PyO3. On my machine, it speeds up parsing by a factor of 4–10, depending on the nature of the data.
| Field | Value |
| --- | --- |
| Uploaded | Mon Mar 31 22:05:45 2025 |
| md5 checksum | 40894c7a412e18dc5198bc54bfc03c5a |
| arch | x86_64 |
| build | py310h95f1b2d_2 |
| build_number | 2 |
| depends | libgcc-ng >=11.2.0, libstdcxx-ng >=11.2.0, python >=3.10,<3.11.0a0 |
| license | MIT |
| license_family | MIT |
| name | json-stream-rs-tokenizer |
| platform | linux |
| sha1 | 63fa405628ea83c2a02c490ce218f0cb017cae77 |
| sha256 | 63bc32e916d659950136777234ce0da11ef9011c9093ec114f55aa339f63a13c |
| size | 327601 bytes |
| subdir | linux-64 |
| timestamp | 1732228038727 |
| version | 0.4.22 |