BERT

BERT (Bidirectional Encoder Representations from Transformers) is a method of pre-training language representations for natural language processing (NLP). It trains a Transformer encoder bidirectionally, using context from both the left and the right of each token.
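One of BERT's pre-training tasks is masked language modeling: a fraction of input tokens is hidden and the model must predict them from bidirectional context. A minimal sketch of the masking scheme from the BERT paper (roughly 15% of positions are selected; of those, 80% become `[MASK]`, 10% a random token, 10% are left unchanged) — the toy vocabulary and function name here are illustrative assumptions, not part of any real library:

```python
import random

MASK = "[MASK]"
# Toy vocabulary for the 10% random-replacement case (assumption for illustration)
VOCAB = ["the", "cat", "sat", "on", "mat", "dog", "ran"]

def bert_mask(tokens, rng, mask_prob=0.15):
    """Apply BERT-style masking to a list of tokens.

    Returns (masked, labels): `masked` is the corrupted input sequence,
    `labels` holds the original token at each selected position and
    None at positions the model is not asked to predict.
    """
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            labels.append(tok)                    # prediction target
            r = rng.random()
            if r < 0.8:
                masked.append(MASK)               # 80%: replace with [MASK]
            elif r < 0.9:
                masked.append(rng.choice(VOCAB))  # 10%: random token
            else:
                masked.append(tok)                # 10%: keep original token
        else:
            labels.append(None)                   # not a target
            masked.append(tok)
    return masked, labels

rng = random.Random(0)
masked, labels = bert_mask("the cat sat on the mat".split(), rng)
```

During pre-training the model sees `masked` as input and is trained to recover the original tokens wherever `labels` is not None; keeping 10% of selected tokens unchanged discourages the model from assuming every non-masked token is correct.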

In October 2019, Google announced that it was using BERT in Google Search.
