py39-bpemb (python/py-bpemb): Byte pair embeddings in 275 languages
BPEmb is a collection of pre-trained subword embeddings in 275 languages, based on Byte-Pair Encoding (BPE) and trained on Wikipedia. Its intended use is as input for neural models in natural language processing.
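To illustrate the Byte-Pair Encoding technique these embeddings are built on, here is a toy sketch of the BPE merge-learning loop in plain Python. The function name `learn_bpe`, the sample word frequencies, and the merge count are all illustrative choices, not part of the bpemb package's API:

```python
from collections import Counter

def learn_bpe(words, num_merges):
    """Toy BPE learner: repeatedly merge the most frequent adjacent
    symbol pair. Illustrates the algorithm BPEmb is based on; it is
    NOT the bpemb library's own implementation."""
    # Represent each word as a tuple of symbols, starting from characters.
    vocab = {tuple(w): f for w, f in words.items()}
    merges = []
    for _ in range(num_merges):
        # Count adjacent symbol pairs, weighted by word frequency.
        pairs = Counter()
        for symbols, freq in vocab.items():
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        # Apply the chosen merge everywhere it occurs.
        new_vocab = {}
        for symbols, freq in vocab.items():
            merged, i = [], 0
            while i < len(symbols):
                if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == best:
                    merged.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    merged.append(symbols[i])
                    i += 1
            new_vocab[tuple(merged)] = freq
        vocab = new_vocab
    return merges, vocab

merges, vocab = learn_bpe({"low": 5, "lower": 2, "lowest": 2}, 3)
```

In practice the bpemb package ships pretrained subword vocabularies and embeddings for each language and vocabulary size, so users load a ready-made model rather than learning merges themselves.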
Version: 0.3.2 License: MIT