A faster tokenizer for the json-stream Python library. It's actually just json-stream's own tokenizer (itself adapted from the NAYA project) ported to Rust almost verbatim and made available as a Python module using PyO3. On my machine, it speeds up parsing by a factor of 4–10, depending on the nature of the data.
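For context, here is a minimal sketch of how streaming parsing with json-stream looks from the caller's side. The file name `large_file.json` and the `id` key are hypothetical placeholders, and the claim that recent json-stream versions pick up the Rust tokenizer automatically once this package is installed reflects the project's own description rather than anything shown in the code itself.

```python
# Minimal sketch (not taken from the package metadata below): stream-parse a
# large JSON file with json-stream. With json-stream-rs-tokenizer installed,
# recent json-stream releases are expected to use the Rust tokenizer
# automatically, so the calling code does not change.
# "large_file.json" and the "id" key are hypothetical placeholders; the input
# is assumed to be a top-level JSON array of objects.
import json_stream

with open("large_file.json") as f:
    # json_stream.load() returns a lazy, transient view over the document,
    # so each record is tokenized and yielded as the file is read.
    for record in json_stream.load(f):
        print(record["id"])
```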
| Field | Value |
| --- | --- |
| Uploaded | Mon Mar 31 22:05:47 2025 |
| arch | x86_64 |
| build | py311h3038463_1 |
| build_number | 1 |
| depends | libgcc-ng >=11.2.0, libstdcxx-ng >=11.2.0, python >=3.11,<3.12.0a0 |
| license | MIT |
| license_family | MIT |
| md5 | dc747f344f9c5df99649b9c065d62dd3 |
| name | json-stream-rs-tokenizer |
| platform | linux |
| sha1 | 0353d0d49eca21a78239eb0060307488e1ed142c |
| sha256 | 24cee499a0b4d956157120eb889c03d51a8b373e82dd07ec8ace74a4600a70fe |
| size (bytes) | 1028574 |
| subdir | linux-64 |
| timestamp | 1696355407565 |
| version | 0.4.22 |