BERT

Mau Rua
Mar 8, 2021

--

Bidirectional Encoder Representations from Transformers (BERT) is a Transformer-based technique, developed by Google, for pre-training natural language processing (NLP) models.
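To make the "bidirectional" part concrete, here is a minimal sketch using the Hugging Face transformers library (my choice of tooling, not something the original release prescribes). It loads a pre-trained checkpoint and produces one contextual vector per token, where each vector is conditioned on context from both the left and the right:

```python
import torch
from transformers import BertModel, BertTokenizer

# Load a pre-trained BERT checkpoint and its matching tokenizer.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

# BERT encodes the whole sentence at once, so the vector for "bank"
# is informed by the words on both sides of it.
inputs = tokenizer("I deposited cash at the bank.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One contextual vector per token; hidden size is 768 for bert-base-uncased.
print(outputs.last_hidden_state.shape)  # e.g. torch.Size([1, 9, 768])
```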

BERT was created and published in 2018 by Jacob Devlin and his colleagues from Google.

Since 2019, Google has leveraged BERT to better understand user search queries.

The original English-language BERT comes in two sizes: (1) BERT-Base: 12 encoder layers with 12 bidirectional self-attention heads, and (2) BERT-Large: 24 encoder layers with 16 bidirectional self-attention heads. Both models were pre-trained on unlabeled text from the BooksCorpus (800M words) and English Wikipedia (2,500M words).
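For reference, these two configurations can be reproduced with the transformers library's BertConfig (again my tooling assumption, not part of the original post); the hidden sizes of 768 and 1,024 come from the original BERT paper:

```python
from transformers import BertConfig

# The default BertConfig matches BERT-Base: 12 layers, 12 heads, hidden size 768.
base = BertConfig()
print(base.num_hidden_layers, base.num_attention_heads, base.hidden_size)    # 12 12 768

# BERT-Large per the paper: 24 layers, 16 heads, hidden size 1024.
large = BertConfig(
    num_hidden_layers=24,
    num_attention_heads=16,
    hidden_size=1024,
    intermediate_size=4096,  # feed-forward size, 4x the hidden size
)
print(large.num_hidden_layers, large.num_attention_heads, large.hidden_size) # 24 16 1024
```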

--

Mau Rua

Welcome to my software engineering blog. Please visit my career portfolio at https://mruanova.com 🚀🌎