v0.3.5
Byte pair embeddings in 275 languages
BPEmb is a collection of pre-trained subword embeddings in 275 languages, based on Byte-Pair Encoding (BPE) and trained on Wikipedia. Its intended use is as input for neural models in natural language processing.
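As a quick illustration, here is a minimal sketch using the companion `bpemb` Python package. The English model and the vocabulary size and dimensionality below are example choices, not requirements; both parameters vary across the released models.

```python
from bpemb import BPEmb

# Download (and cache) the English BPE model and its embeddings.
# vs = BPE vocabulary size, dim = embedding dimensionality.
bpemb_en = BPEmb(lang="en", vs=10000, dim=50)

# Segment a string into BPE subword units.
print(bpemb_en.encode("Stratford"))   # e.g. ['▁strat', 'ford']

# Embed a string: one vector per subword unit, shape (n_subwords, dim).
vectors = bpemb_en.embed("Stratford")
print(vectors.shape)                  # e.g. (2, 50)
```

The resulting vectors can be fed directly to a neural model, e.g. as a (frozen or fine-tuned) embedding layer initialized from `bpemb_en.vectors`.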