pytorch-pretrained-bert

PyTorch version of Google AI BERT model with script to load Google pre-trained models

Installation

To install this package, run:

Conda
$ conda install conda-forge::pytorch-pretrained-bert

Description

This repository contains op-for-op PyTorch reimplementations, pre-trained models, and fine-tuning examples for:

- Google's BERT model,
- OpenAI's GPT model,
- Google/CMU's Transformer-XL model, and
- OpenAI's GPT-2 model.

These implementations have been tested on several datasets (see the examples) and should match the performance of the associated TensorFlow implementations (e.g. ~91 F1 on SQuAD for BERT, ~88% accuracy on RocStories for OpenAI GPT, and ~18.3 perplexity on WikiText-103 for the Transformer-XL).

About

Summary

PyTorch version of Google AI BERT model with script to load Google pre-trained models

Last Updated

May 10, 2019 at 10:47

License

Apache-2.0

Total Downloads

89.2K

Version Downloads

43.4K

Supported Platforms

linux-64