v0.3.2

Byte pair embeddings in 275 languages

BPEmb is a collection of pre-trained subword embeddings in 275 languages, based on Byte-Pair Encoding (BPE) and trained on Wikipedia. The embeddings are intended as input representations for neural models in natural language processing.
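A minimal usage sketch, assuming the `bpemb` Python package is installed (`pip install bpemb`); the specific vocabulary size and dimension chosen here are illustrative, as the published models come in several sizes:

```python
from bpemb import BPEmb

# Load pre-trained English subword embeddings (downloaded on first use).
# lang selects the language, vs the BPE vocabulary size, dim the vector size.
bpemb_en = BPEmb(lang="en", vs=50000, dim=100)

# Segment a word into BPE subword units ...
print(bpemb_en.encode("Stratford"))   # e.g. ['▁strat', 'ford']

# ... and look up one embedding vector per subword unit.
vectors = bpemb_en.embed("Stratford")
print(vectors.shape)                  # (number of subwords, 100)
```

The per-subword vectors can then be fed to a neural model directly, or pooled (e.g. averaged) into a single fixed-size representation per word or sentence.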
