
r-tokenizers


Convert natural language text into tokens. Includes tokenizers for shingled n-grams, skip n-grams, words, word stems, sentences, paragraphs, characters, shingled characters, lines, tweets, Penn Treebank, regular expressions, as well as functions for counting characters, words, and sentences, and a function for splitting longer texts into separate documents, each with the same number of words. The tokenizers have a consistent interface, and the package is built on the 'stringi' and 'Rcpp' packages for fast yet correct tokenization in 'UTF-8'.
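
As a sketch of that consistent interface, the example below uses a few of the package's documented tokenize_* and count_* functions; it assumes the package has been installed and is loaded in R under its upstream name, tokenizers. Each tokenizer takes a character vector and returns a list with one element per input document.

library(tokenizers)

text <- "The quick brown fox jumps over the lazy dog. It never barked."

# Word tokens (lowercased by default, punctuation stripped)
tokenize_words(text)

# Shingled n-grams: all bigrams and trigrams of the word tokens
tokenize_ngrams(text, n = 3, n_min = 2)

# Sentence tokens
tokenize_sentences(text)

# Counting helpers return one count per input document
count_words(text)
count_sentences(text)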

Installation

To install this package, run the following:

Conda
$ conda install mro_test::r-tokenizers
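
After installation, the package loads in R under its upstream name, tokenizers. A quick smoke test from the shell (assuming the R interpreter from the same conda environment is on the PATH):

$ R --quiet -e 'library(tokenizers); count_words("one two three")'

This should print a word count of 3 for the single input document.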


About


Last Updated

Mar 25, 2025 at 16:21

License

MIT + file LICENSE

Total Downloads

16

Platforms

linux-64, version 0.2.1