
r_test / packages / r-tokenizers

Convert natural language text into tokens. Includes tokenizers for shingled n-grams, skip n-grams, words, word stems, sentences, paragraphs, characters, shingled characters, lines, tweets, Penn Treebank tokens, and regular expressions, as well as functions for counting characters, words, and sentences, and a function for splitting longer texts into separate documents, each with the same number of words. The tokenizers have a consistent interface, and the package is built on the 'stringi' and 'Rcpp' packages for fast yet correct tokenization in 'UTF-8'.
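A minimal sketch of the consistent interface described above, using exported functions from the tokenizers package (the sample text here is illustrative; each function returns a list with one character vector per input document):

```r
library(tokenizers)

text <- "Convert natural language text into tokens. The interface is consistent."

# Word tokenization (lowercases and strips punctuation by default)
tokenize_words(text)

# Sentence tokenization
tokenize_sentences(text)

# Shingled n-grams (here: bigrams)
tokenize_ngrams(text, n = 2)
```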

Type  | Size     | Name                                              | Uploaded                 | Downloads | Labels
conda | 647.6 kB | linux-64/r-tokenizers-0.2.1-r36h29659fb_0.tar.bz2 | 5 years and 9 months ago | 3         | main
conda | 659.9 kB | win-32/r-tokenizers-0.2.1-r36h6115d3f_0.tar.bz2   | 5 years and 9 months ago | 1         | main
conda | 655.6 kB | win-64/r-tokenizers-0.2.1-r36h6115d3f_0.tar.bz2   | 5 years and 9 months ago | 3         | main
conda | 640.4 kB | osx-64/r-tokenizers-0.2.1-r36h466af19_0.tar.bz2   | 5 years and 9 months ago | 3         | main
