BERT

Entity Type: Glossary

ID: bert

Definition: Bidirectional Encoder Representations from Transformers, a pre-trained Transformer encoder that learns bidirectional representations via masked language modeling: random input tokens are replaced with a [MASK] token, and the model predicts them using context from both the left and the right. On its 2018 release, BERT achieved state-of-the-art results on a wide range of NLP benchmarks, including GLUE and SQuAD.
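
As an illustration of the masked-token prediction described above, here is a minimal sketch assuming the Hugging Face `transformers` library and the public `bert-base-uncased` checkpoint (neither is specified by this entry):

```python
# Minimal sketch of BERT's masked-language-model objective at inference time.
# Assumes the Hugging Face `transformers` library and the public
# `bert-base-uncased` checkpoint (assumptions, not part of this entry).
import torch
from transformers import BertForMaskedLM, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

# Mask one token; BERT predicts it from both left and right context.
text = "The capital of France is [MASK]."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, seq_len, vocab_size)

# Locate the [MASK] position and take the highest-scoring vocabulary entry.
mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
predicted_id = logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode(predicted_id))  # typically prints: paris
```

During pre-training, the same objective is applied in reverse: a fraction of input tokens is masked and the model is trained to recover them, which is what forces the representations to use context from both directions.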

Related Terms:

- transformer
- bidirectional
- masked-language-model
- pre-training

Source Urls:

- https://en.wikipedia.org/wiki/BERT_(language_model)

Tags:

- transformers
- nlp
- pre-trained-models
- bidirectional

Status: active

Version: 1.0.0

Created At: 2025-08-31

Last Updated: 2025-08-31