pytorch-pretrained-bert
PyTorch version of Google AI BERT model with script to load Google pre-trained models
This repository contains op-for-op PyTorch reimplementations, pre-trained models, and fine-tuning examples for:

- Google's BERT model,
- OpenAI's GPT model,
- Google/CMU's Transformer-XL model, and
- OpenAI's GPT-2 model.

These implementations have been tested on several datasets (see the examples) and should match the performance of the associated TensorFlow implementations (e.g. ~91 F1 on SQuAD for BERT, ~88 F1 on RocStories for OpenAI GPT, and ~18.3 perplexity on WikiText-103 for Transformer-XL).
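As a point of reference for the perplexity figure quoted above: perplexity is simply the exponential of the mean per-token negative log-likelihood (cross-entropy in nats). A minimal sketch, using made-up per-token losses rather than real WikiText-103 numbers:

```python
import math

def perplexity(per_token_nll):
    """Perplexity = exp(mean negative log-likelihood per token), losses in nats."""
    return math.exp(sum(per_token_nll) / len(per_token_nll))

# Hypothetical per-token losses; a mean loss of ~2.907 nats corresponds
# to the ~18.3 perplexity reported for Transformer-XL on WikiText-103.
losses = [2.91, 2.90, 2.91]
print(round(perplexity(losses), 1))  # → 18.3
```

So a model reported at ~18.3 perplexity is one whose average per-token cross-entropy on the test set is about 2.9 nats.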
Summary
PyTorch version of Google AI BERT model with script to load Google pre-trained models
Last Updated
May 10, 2019 at 10:47
License
Apache-2.0
Total Downloads
89.2K
Version Downloads
43.4K
Supported Platforms