This package provides spaCy components and model architectures for using transformer models from Hugging Face's transformers library in spaCy. The result is convenient access to state-of-the-art transformer architectures such as BERT, GPT-2, and XLNet.
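A minimal usage sketch is shown below. It assumes spaCy and spacy-transformers are installed and that the transformer-based English pipeline `en_core_web_trf` (distributed separately, not part of this package) has already been downloaded; the `doc._.trf_data` extension attribute is registered by the package's `transformer` pipeline component.

```python
# Minimal sketch: run a transformer-backed spaCy pipeline and inspect the
# transformer output attached to the Doc. Assumes `en_core_web_trf` was
# downloaded, e.g. with `python -m spacy download en_core_web_trf`.
import spacy

nlp = spacy.load("en_core_web_trf")
doc = nlp("Apple is looking at buying U.K. startup for $1 billion.")

# Named entities predicted by the transformer-based pipeline components.
print([(ent.text, ent.label_) for ent in doc.ents])

# Raw transformer output stored on the Doc by the `transformer` component.
trf_data = doc._.trf_data
print(trf_data.wordpieces.strings)   # wordpiece tokens per processed span
print(trf_data.tensors[0].shape)     # hidden-state tensor from the model
```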
| Field | Value |
| --- | --- |
| name | spacy-transformers |
| version | 1.1.5 |
| build | py310h06a4308_0 |
| arch | x86_64 |
| platform | linux |
| subdir | linux-64 |
| depends | python >=3.10,<3.11.0a0, pytorch >=1.6.0, spacy >=3.1.3,<4.0.0, spacy-alignments >=0.7.2,<1.0.0, srsly >=2.4.0,<3.0.0, transformers >=3.4.0,<4.19.0 |
| license | MIT |
| license_family | MIT |
| size | 79029 bytes |
| md5 | 07a9b18e0ec438d3b1ca7ae36669c658 |
| sha1 | cfb1a0e91f49a49ca7d595776540f45b27f556ae |
| sha256 | 360d88a2596ddb44c906000d9da7d2db0243953d431c9b5257d885b6c4696757 |
| timestamp | 1651845980317 |
| uploaded | Tue Apr 1 00:14:06 2025 |
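As a quick sanity check against the version and dependency pins above, the sketch below reports the installed versions from the active environment. It assumes the package has already been installed, e.g. with `conda install spacy-transformers=1.1.5`.

```python
# Sketch: confirm the installed versions match the metadata above.
# Assumes spacy-transformers 1.1.5 and its dependencies are present in the
# active environment (e.g. via `conda install spacy-transformers=1.1.5`).
from importlib.metadata import version

import spacy
import transformers

print("spacy-transformers:", version("spacy-transformers"))  # expected: 1.1.5
print("spacy:", spacy.__version__)                 # pin: >=3.1.3,<4.0.0
print("transformers:", transformers.__version__)   # pin: >=3.4.0,<4.19.0
```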