py310-tokenizers (python/py-tokenizers) Updated: 1 day, 6 hours ago

Fast and Customizable Tokenizers

Tokenizers provides an implementation of today's most used tokenizers, with a focus on performance and versatility. Includes BPE, WordPiece, and Unigram tokenizer implementations.
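To illustrate the BPE (byte-pair encoding) scheme named above, here is a minimal pure-Python sketch of the merge-learning step. It is an illustration of the algorithm only, not the tokenizers library's actual API; the function name `bpe_merges` and the toy word-frequency input are hypothetical.

```python
from collections import Counter

def bpe_merges(word_freqs, num_merges):
    """Learn BPE merges from a {word: frequency} dict (illustrative sketch).

    Each word starts as a tuple of single characters; on every iteration
    the most frequent adjacent symbol pair is merged into one symbol.
    """
    vocab = {tuple(word): freq for word, freq in word_freqs.items()}
    merges = []
    for _ in range(num_merges):
        # Count adjacent symbol pairs, weighted by word frequency.
        pairs = Counter()
        for symbols, freq in vocab.items():
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        # Rewrite every word, fusing occurrences of the best pair.
        new_vocab = {}
        for symbols, freq in vocab.items():
            out, i = [], 0
            while i < len(symbols):
                if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == best:
                    out.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    out.append(symbols[i])
                    i += 1
            new_vocab[tuple(out)] = freq
        vocab = new_vocab
    return merges

# Toy corpus: "lo" is the most common pair, so it is merged first,
# followed by "low".
merges = bpe_merges({"low": 5, "lower": 2, "lowest": 2}, num_merges=2)
print(merges)  # → [('l', 'o'), ('lo', 'w')]
```

The library itself performs this training in Rust for speed; the sketch only shows the idea behind the BPE model it offers alongside WordPiece and Unigram.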

Version: 0.22.2 License: Apache-2 GitHub

Installations (30 days): 0
Requested Installations (30 days): 0