Search Results
google-research/bert: TensorFlow code and pre-trained models for BERT
google/bert_uncased_L-4_H-256_A-4 · Hugging Face
[https://huggingface.co/google/bert_uncased_L-4_H-256_A-4] - - public:mzimmerm
Repository of all BERT models, including the small ones. Start with this model for testing.
Training Bert on Yelp - Copy of training.ipynb - Colaboratory
[https://colab.research.google.com/drive/1FhwrZ05umMvj4cshnEMUOLxjD9ynvCy9#scrollTo=nCFiAJ55LcLt] - - public:mzimmerm
BERT 101 - State Of The Art NLP Model Explained
[https://huggingface.co/blog/bert-101] - - public:mzimmerm
Best summary of Natural Language Processing terms: model (a language model, e.g. BertModel, which defines the encoder and decoder and their properties), transformer (a specific neural-network architecture from the attention paper), encoder (a stack of transformer layers over the input), decoder (a stack of transformer layers producing the output). BERT uses only the encoder, NOT the decoder. TensorFlow and PyTorch are possible backends for the Hugging Face Transformers library. Summary: BERT is a highly complex and advanced language model that helps automate language understanding.
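
The note above maps onto the Hugging Face Transformers API. As a minimal sketch (assuming the transformers and torch packages are installed), loading the small checkpoint bookmarked above and running its encoder looks roughly like this:

    # Minimal sketch: load the BERT-mini checkpoint bookmarked above and encode one sentence.
    # Assumes the `transformers` and `torch` packages are installed.
    from transformers import AutoTokenizer, AutoModel

    model_name = "google/bert_uncased_L-4_H-256_A-4"  # 4 layers, hidden size 256, 4 attention heads
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModel.from_pretrained(model_name)      # encoder only; BERT has no decoder

    inputs = tokenizer("BERT encodes text with a stack of transformer layers.",
                       return_tensors="pt")            # "pt" = return PyTorch tensors
    outputs = model(**inputs)
    print(outputs.last_hidden_state.shape)             # torch.Size([1, num_tokens, 256])

The same checkpoint name works with AutoModelForSequenceClassification for fine-tuning, which is what the Yelp Colab notebook above does.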
BERT vs GPT: A Tale of Two Transformers That Revolutionized NLP | by Tavva Prudhvith | Medium
[https://medium.com/@prudhvithtavva/bert-vs-gpt-a-tale-of-two-transformers-that-revolutionized-nlp-11fff8e61984] - - public:mzimmerm