A faster tokenizer for the json-stream Python library. It's actually just json-stream's own tokenizer (itself adapted from the NAYA project) ported to Rust almost verbatim and made available as a Python module using PyO3. On my machine, it speeds up parsing by a factor of 4–10, depending on the nature of the data.
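A minimal usage sketch: recent versions of json-stream detect and use json-stream-rs-tokenizer automatically once it is installed, so ordinary `json_stream.load()` calls already get the speedup. This is a hedged illustration, not part of this package's own API; the fallback branch is there only so the snippet also runs where json-stream is absent.

```python
import io
import json

DOC = '{"numbers": [1, 2, 3]}'

try:
    # json-stream picks up json-stream-rs-tokenizer automatically when the
    # Rust tokenizer is installed; no code change is needed to benefit.
    import json_stream

    data = json_stream.load(io.StringIO(DOC))
    numbers = [n for n in data["numbers"]]
except ImportError:
    # Fallback so the sketch still runs without json-stream installed.
    numbers = json.load(io.StringIO(DOC))["numbers"]

print(numbers)
```

Because the tokenizer is a drop-in replacement, existing json-stream code paths (streaming dicts, lists, and values) behave the same; only the raw tokenization runs in Rust.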
| Field | Value |
| --- | --- |
| Uploaded | Mon Mar 31 22:05:49 2025 |
| md5 checksum | fc4b7f93d51332ffbb4b3ac51ed9f976 |
| arch | x86_64 |
| build | py312h95f1b2d_2 |
| build_number | 2 |
| depends | libgcc-ng >=11.2.0, libstdcxx-ng >=11.2.0, python >=3.12,<3.13.0a0 |
| license | MIT |
| license_family | MIT |
| name | json-stream-rs-tokenizer |
| platform | linux |
| sha1 | 763ba83a9ce86e0294f46b0663262eaa4d2e217d |
| sha256 | 24861e501c8805706fabeeb3630fda8bcdc295a0207aa091e74f1ae9c15f329a |
| size | 330288 |
| subdir | linux-64 |
| timestamp | 1732227926419 |
| version | 0.4.22 |