Optimum
[https://huggingface.co/docs/optimum/index] - - public:mzimmerm
Optimum is an extension of Transformers that provides a set of performance-optimization tools for training and running models on targeted hardware with maximum efficiency. It is also a good place to find small, mini, and tiny model variants.
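A minimal usage sketch of the ONNX Runtime backend, one of the accelerators Optimum supports; the DistilBERT SST-2 checkpoint is an illustrative choice, not something named in the bookmark:

```python
# Export a Transformers checkpoint to ONNX via Optimum and run inference.
# Assumes `optimum[onnxruntime]` is installed; the model id is an example.
from optimum.onnxruntime import ORTModelForSequenceClassification
from transformers import AutoTokenizer, pipeline

model_id = "distilbert-base-uncased-finetuned-sst-2-english"
model = ORTModelForSequenceClassification.from_pretrained(model_id, export=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)

classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(classifier("Optimum makes ONNX export straightforward."))
```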
BERT Transformers – How Do They Work? | Exxact Blog
[https://www.exxactcorp.com/blog/Deep-Learning/how-do-bert-transformers-work] - - public:mzimmerm
Excellent document about BERT transformers/models and their parameters:
- L = number of layers (Transformer encoder blocks).
- H = hidden size, i.e., the dimensionality of the vector representing each token in the sentence.
- A = number of self-attention heads.
- Total parameters (e.g., BERT-Base: L=12, H=768, A=12, ~110M parameters).
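As a back-of-the-envelope check of how L and H drive the total, here is a rough parameter-count sketch. The helper function and its assumptions (30522-token vocabulary, 512 positions, 2 token types, 4H feed-forward width, pooler layer) are mine, following the original BERT architecture; A fixes the per-head size H/A but does not change the total count:

```python
# Approximate BERT parameter count from L and H.
def bert_param_count(L=12, H=768, vocab=30522, max_pos=512, types=2):
    embeddings = (vocab + max_pos + types) * H + 2 * H  # word/pos/type + LayerNorm
    attention = 4 * (H * H + H)                         # Q, K, V, output projections
    ffn = H * 4 * H + 4 * H + 4 * H * H + H             # two dense layers, 4H wide
    layer = attention + ffn + 2 * 2 * H                 # plus two LayerNorms
    pooler = H * H + H
    return embeddings + L * layer + pooler

print(f"BERT-Base:  ~{bert_param_count(12, 768) / 1e6:.1f}M")   # ~109.5M
print(f"BERT-Large: ~{bert_param_count(24, 1024) / 1e6:.1f}M")  # ~335M
```

The two printed totals line up with the published BERT-Base (~110M) and BERT-Large (~340M) sizes.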