
r-tokenizers


Convert natural language text into tokens. The tokenizers have a consistent interface and are compatible with Unicode, thanks to being built on the 'stringi' package. Includes tokenizers for shingled n-grams, skip n-grams, words, word stems, sentences, paragraphs, characters, lines, and regular expressions.
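
Each tokenizer takes a character vector and returns a list with one vector of tokens per input element, so the functions are interchangeable. A minimal sketch of typical usage in R (function names follow the upstream CRAN 'tokenizers' package; the sample text is invented):

library(tokenizers)

text <- "The quick brown fox jumps over the lazy dog. It barked."

tokenize_words(text)           # word tokens, lowercased by default
tokenize_sentences(text)       # one token per sentence
tokenize_ngrams(text, n = 2)   # shingled bigrams: "the quick", "quick brown", ...
tokenize_characters(text)      # individual characters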

Installation

To install this package, run the following command:

Conda
$ conda install kite-eating-tree::r-tokenizers
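
A quick smoke test once the install finishes (a hedged assumption: the conda r- prefix maps this package to the CRAN package 'tokenizers'):

library(tokenizers)
tokenize_words("It works!")   # should return list(c("it", "works"))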

Usage Tracking

Version 0.1.4 (1 of 8 available versions)
Downloads (last 6 months): 0

About

Last Updated

Feb 2, 2018 at 23:17

License

MIT + file LICENSE

Total Downloads

32

Supported Platforms

linux-64