py310-bpemb

Version 0.3.5, updated 8 months ago

Byte pair embeddings in 275 languages

BPEmb is a collection of pre-trained subword embeddings in 275 languages, based on Byte-Pair Encoding (BPE) and trained on Wikipedia. Its intended use is as input for neural models in natural language processing.

https://nlp.h-its.org/bpemb

To install py310-bpemb, paste the following into the macOS terminal after installing MacPorts:

sudo port install py310-bpemb
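
Once installed, the library can be used from Python 3.10. Below is a minimal usage sketch; the constructor arguments and methods shown (lang, vs, dim, encode, embed) follow the upstream BPEmb documentation, and the pre-trained model files are downloaded on first use.

# Minimal sketch: load 50-dimensional English BPE embeddings and
# turn a word into subword tokens plus their embedding vectors.
from bpemb import BPEmb

# lang: Wikipedia language code, vs: BPE vocabulary size, dim: embedding dimension
bpemb_en = BPEmb(lang="en", vs=10000, dim=50)

tokens = bpemb_en.encode("Stratford")    # subword pieces, e.g. ['▁strat', 'ford']
vectors = bpemb_en.embed("Stratford")    # numpy array of shape (len(tokens), 50)
print(tokens, vectors.shape)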


Installations: 0
Requested Installations: 0