A faster tokenizer for the json-stream Python library. It's actually just json-stream's own tokenizer (itself adapted from the NAYA project) ported to Rust almost verbatim and made available as a Python module using PyO3. On my machine, it speeds up parsing by a factor of 4–10, depending on the nature of the data.
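To make the description concrete, here is a minimal pure-Python sketch of the kind of character-by-character JSON tokenizer that json-stream uses internally (and that this package reimplements in Rust). This is illustrative only, written for this note; it is not the library's actual code, and its string handling deliberately skips full escape decoding.

```python
import io

# Structural characters in JSON.
PUNCT = {"{", "}", "[", "]", ":", ","}

def tokenize(stream):
    """Yield (kind, text) JSON tokens one at a time from a file-like
    object, reading a single character at a time -- the streaming style
    whose per-character overhead makes a Rust port worthwhile."""
    ch = stream.read(1)
    while ch:
        if ch.isspace():
            ch = stream.read(1)
        elif ch in PUNCT:
            yield ("punct", ch)
            ch = stream.read(1)
        elif ch == '"':
            # String token: consume until the closing quote.
            buf = []
            ch = stream.read(1)
            while ch != '"':
                if ch == "\\":
                    ch = stream.read(1)  # naive: keeps escaped char as-is
                buf.append(ch)
                ch = stream.read(1)
            yield ("string", "".join(buf))
            ch = stream.read(1)
        else:
            # Number or literal (true/false/null): accumulate until a
            # delimiter or whitespace ends the token.
            buf = [ch]
            ch = stream.read(1)
            while ch and not ch.isspace() and ch not in PUNCT:
                buf.append(ch)
                ch = stream.read(1)
            yield ("atom", "".join(buf))

tokens = list(tokenize(io.StringIO('{"a": [1, true]}')))
# tokens == [("punct", "{"), ("string", "a"), ("punct", ":"),
#            ("punct", "["), ("atom", "1"), ("punct", ","),
#            ("atom", "true"), ("punct", "]"), ("punct", "}")]
```

Because each token is produced as soon as its characters arrive, the parser above it can emit values before the whole document is read; the Rust port keeps this interface while moving the hot per-character loop out of Python.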
| Field | Value |
| --- | --- |
| Uploaded | Mon Mar 31 22:05:45 2025 |
| md5 checksum | ce3efc1bba068a30b6d5a8e0d6160484 |
| arch | x86_64 |
| build | py310h3038463_1 |
| build_number | 1 |
| depends | libgcc-ng >=11.2.0, libstdcxx-ng >=11.2.0, python >=3.10,<3.11.0a0 |
| license | MIT |
| license_family | MIT |
| name | json-stream-rs-tokenizer |
| platform | linux |
| sha1 | 62002531f59c555055a6ac1c419f30f089158a59 |
| sha256 | df541182f479208f1c975c170adf30cd6b12258f8857888e12e37e562d40c3f9 |
| size | 1025527 |
| subdir | linux-64 |
| timestamp | 1696355331058 |
| version | 0.4.22 |