Open Sourcing BERT: State-of-the-Art Pre-training for Natural Language Processing
https://blog.research.google/2018/11/open-sourcing-bert-state-of-art-pre.html
Nov 2, 2018 · BERT builds upon recent work in pre-training contextual representations — including Semi-supervised Sequence Learning, Generative Pre-Training, ELMo, and ULMFiT. However, unlike these previous models, BERT is the first deeply bidirectional, unsupervised language representation, pre-trained using only a plain text corpus (in this …
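The "deeply bidirectional" pre-training the post refers to is BERT's masked language modeling objective: a fraction of input tokens (roughly 15% in the paper) is hidden, and the model must predict them from context on both sides. A minimal sketch of that data-preparation step, assuming a simple whitespace tokenizer and a plain `[MASK]` replacement (the real procedure also sometimes keeps or randomizes the selected token):

```python
import random

MASK = "[MASK]"
MASK_RATE = 0.15  # fraction of tokens selected for prediction, as in the BERT paper

def mask_tokens(tokens, rng):
    """Hide ~15% of tokens; the model must recover them from both
    left and right context, which is what makes the pre-training
    deeply bidirectional rather than left-to-right."""
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < MASK_RATE:
            masked.append(MASK)
            labels.append(tok)   # prediction target
        else:
            masked.append(tok)
            labels.append(None)  # not scored
    return masked, labels

rng = random.Random(0)
sentence = "the man went to the store to buy a gallon of milk".split()
masked, labels = mask_tokens(sentence, rng)
```

Because every unmasked token stays visible on both sides of each `[MASK]`, the predictor can condition on full bidirectional context, unlike the left-to-right models (e.g. Generative Pre-Training) cited above.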