
conda-forge / packages / r-tokenizers 0.3.0

Convert natural language text into tokens. Includes tokenizers for shingled n-grams, skip n-grams, words, word stems, sentences, paragraphs, characters, shingled characters, lines, tweets, Penn Treebank, regular expressions, as well as functions for counting characters, words, and sentences, and a function for splitting longer texts into separate documents, each with the same number of words. The tokenizers have a consistent interface, and the package is built on the 'stringi' and 'Rcpp' packages for fast yet correct tokenization in 'UTF-8'.


Installers

Info: This package contains files in non-standard labels.
  • osx-64 v0.3.0
  • osx-arm64 v0.3.0
  • linux-64 v0.3.0
  • linux-ppc64le v0.3.0
  • linux-aarch64 v0.3.0
  • win-64 v0.3.0

conda install

To install this package, run one of the following:
conda install conda-forge::r-tokenizers
conda install conda-forge/label/cf201901::r-tokenizers
conda install conda-forge/label/cf202003::r-tokenizers
conda install conda-forge/label/gcc7::r-tokenizers
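
Once installed, the package can be loaded in an R session in the usual way. The sketch below uses two of the tokenizers named in the description, `tokenize_words()` and `tokenize_ngrams()`, which are part of the package's documented interface; the sample text is illustrative only:

```r
library(tokenizers)

text <- "Convert natural language text into tokens."

# Word tokenization; lowercases and strips punctuation by default.
# Returns a list with one character vector per input document.
tokenize_words(text)

# Shingled n-grams of length 2 (word bigrams).
tokenize_ngrams(text, n = 2)
```

Each tokenizer accepts a character vector and returns a list of token vectors, so the same call pattern applies across the word, sentence, character, and n-gram tokenizers.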

© 2025 Anaconda, Inc. All Rights Reserved.