py39-bpemb (python/py-bpemb)

Byte pair embeddings in 275 languages

BPEmb is a collection of pre-trained subword embeddings in 275 languages, based on Byte-Pair Encoding (BPE) and trained on Wikipedia. Its intended use is as input for neural models in natural language processing.
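
For reference, a minimal usage sketch of the bpemb Python module that this port provides (installed on macOS with, e.g., sudo port install py39-bpemb). The language, vocabulary size, and dimensionality below are illustrative choices; other pre-trained sizes exist, and the model files are downloaded on first use.

    # Minimal sketch: load pre-trained English BPE subword embeddings
    # (25,000-symbol BPE vocabulary, 100-dimensional vectors; both values
    # are illustrative).
    from bpemb import BPEmb

    bpemb_en = BPEmb(lang="en", vs=25000, dim=100)

    # Segment text into BPE subword units.
    subwords = bpemb_en.encode("unsupervised tokenization")
    print(subwords)

    # Look up the embedding vectors: a NumPy array with one
    # 100-dimensional row per subword, ready to feed into a neural model.
    vectors = bpemb_en.embed("unsupervised tokenization")
    print(vectors.shape)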

Version: 0.3.2
License: MIT
Statistics for selected duration: 2022-Oct-30 to 2022-Nov-29
Displaying statistics for 774 users who made submissions during this period.


Total Installations: 1
Requested Installations: 0


[Charts: macOS Versions, Port Versions, Xcode Versions, CLT Versions]

Variants table

Variants Count


[Chart: Monthly Statistics]