pytorch-pretrained-bert

Community

PyTorch version of Google AI BERT model with script to load Google pre-trained models

Installation

To install this package, run the following command:

Conda
$ conda install vikigenius::pytorch-pretrained-bert

Usage Tracking

Version: 0.6.2
Downloads (last 6 months): 0

Description

This repository contains op-for-op PyTorch reimplementations, pre-trained models, and fine-tuning examples for:

- Google's BERT model,
- OpenAI's GPT model,
- Google/CMU's Transformer-XL model, and
- OpenAI's GPT-2 model.

These implementations have been tested on several datasets (see the examples) and should match the performance of the associated TensorFlow implementations (e.g. ~91 F1 on SQuAD for BERT, ~88 F1 on RocStories for OpenAI GPT, and ~18.3 perplexity on WikiText 103 for the Transformer-XL).

About

Summary

PyTorch version of Google AI BERT model with script to load Google pre-trained models

Last Updated

Nov 21, 2019 at 00:31

License

Apache-2.0

Supported Platforms

linux-64