BERT-Transformer-Pytorch

Basic implementation of BERT and the Transformer in PyTorch, in one Python file of ~300 lines of code.

This project aims to provide easy-to-run, easy-to-understand code for NLP beginners and anyone who wants to know how Transformers work.
The project uses a simplified implementation of BERT, trained in an unsupervised fashion.
The original Transformer architecture uses both an encoder and a decoder; BERT only needs the encoder.
The model trains in about 30 minutes on a single RTX 2070 Super GPU.
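To make "encoder only" concrete, here is a minimal sketch of a BERT-style model built around PyTorch's `nn.TransformerEncoder`, with a head that predicts tokens at each position. This is not the repository's ~300-line implementation; the class name, hyperparameters, and training step are illustrative only.

```python
# Minimal sketch of an encoder-only (BERT-style) model in PyTorch.
# NOT the repository's code; it only illustrates keeping the Transformer
# encoder and dropping the decoder. All names/hyperparameters are illustrative.
import torch
import torch.nn as nn

class TinyBERT(nn.Module):
    def __init__(self, vocab_size=30000, d_model=128, n_heads=4, n_layers=4, max_len=512):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads,
            dim_feedforward=4 * d_model, batch_first=True,
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        # Head that predicts a vocabulary token at every position.
        self.mlm_head = nn.Linear(d_model, vocab_size)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer tensor
        positions = torch.arange(token_ids.size(1), device=token_ids.device)
        x = self.tok_emb(token_ids) + self.pos_emb(positions)
        x = self.encoder(x)        # encoder only, no decoder
        return self.mlm_head(x)    # (batch, seq_len, vocab_size)

# Unsupervised training step, sketched. Real BERT masks ~15% of the input
# tokens and computes the loss only on the masked positions; the masking
# step is omitted here for brevity.
model = TinyBERT()
tokens = torch.randint(0, 30000, (8, 64))   # fake batch of token ids
logits = model(tokens)
loss = nn.functional.cross_entropy(
    logits.view(-1, logits.size(-1)), tokens.view(-1)
)
loss.backward()
```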

[Figure: visualization of the learned word embeddings]
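A plot like the one referenced above can be produced by projecting the learned embedding matrix to 2D, for example with t-SNE. The sketch below assumes the `TinyBERT` model from the previous sketch (`model.tok_emb` is not the repository's attribute name) and uses scikit-learn and matplotlib, which are not stated dependencies of this project.

```python
# Hedged sketch of a word-embedding visualization.
# `model` refers to the TinyBERT sketch above; the repository's actual
# variable names and plotting code may differ.
import matplotlib.pyplot as plt
from sklearn.manifold import TSNE

embeddings = model.tok_emb.weight.detach().cpu().numpy()  # (vocab_size, d_model)
subset = embeddings[:500]                                  # keep the plot readable
coords = TSNE(n_components=2, init="pca", random_state=0).fit_transform(subset)

plt.figure(figsize=(8, 8))
plt.scatter(coords[:, 0], coords[:, 1], s=4)
for i in range(0, len(subset), 25):
    # In a real run, label points with the vocabulary's id-to-word mapping
    # instead of the raw token index used here.
    plt.annotate(str(i), coords[i])
plt.title("Word embeddings projected to 2D with t-SNE")
plt.savefig("embeddings.png")
```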

Implementation details: https://hyugen-ai.medium.com/transformers-in-pytorch-from-scratch-for-nlp-beginners-ff3b3d922ef7
