py39-bpemb (python/py-bpemb) Updated: 1 month ago

Byte pair embeddings in 275 languages

BPEmb is a collection of pre-trained subword embeddings in 275 languages, based on Byte-Pair Encoding (BPE) and trained on Wikipedia. Its intended use is as input for neural models in natural language processing.
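To illustrate the Byte-Pair Encoding that underlies these embeddings, here is a minimal, self-contained sketch of one BPE merge step: count all adjacent symbol pairs across a toy word-frequency corpus and fuse the most frequent pair into a single subword symbol. The function name `bpe_merge_step` and the toy corpus are hypothetical for this example; this is not the BPEmb library's API.

```python
from collections import Counter

def bpe_merge_step(words):
    """Perform one BPE merge: find the most frequent adjacent
    symbol pair across all words and fuse it into one symbol.
    `words` maps a tuple of symbols to its corpus frequency."""
    pairs = Counter()
    for symbols, freq in words.items():
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    if not pairs:
        return words, None
    best = max(pairs, key=pairs.get)
    merged = {}
    for symbols, freq in words.items():
        out, i = [], 0
        while i < len(symbols):
            # Fuse every occurrence of the winning pair.
            if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == best:
                out.append(symbols[i] + symbols[i + 1])
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        merged[tuple(out)] = freq
    return merged, best

# Hypothetical toy corpus: word -> frequency, split into characters.
corpus = {tuple("lower"): 2, tuple("lowest"): 1, tuple("newer"): 3}
vocab, merge = bpe_merge_step(corpus)
# The pair ('w', 'e') is most frequent (6 occurrences), so it becomes
# the new subword symbol "we".
```

Repeating this step a fixed number of times yields the subword vocabulary over which BPEmb's embeddings are trained.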

Version: 0.3.5 License: MIT
Statistics for selected duration

2024-Mar-27 to 2024-Apr-26, based on submissions from 1,024 users.

No stats available for this selection.

Try changing the range of days, or visit the statistics page for an overall view of the submitted statistics.