Masked Language Model

Entity Type: Glossary

ID: masked-language-model

Definition: A training objective in which a random subset of tokens in a sequence is masked (hidden) and the model learns to predict the original tokens at those positions from the surrounding context. Because context on both sides of each masked position is visible to the model, the objective enables bidirectional representation learning; it is the pre-training objective used by models such as BERT.
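To make the objective concrete, here is a minimal sketch of the masking step (an illustrative toy example, not BERT's actual preprocessing; the `mask_tokens` helper and the sample sentence are hypothetical). It hides a random fraction of tokens and records the originals as prediction targets. Note that BERT's full recipe selects about 15% of tokens and, of those, replaces 80% with the mask token, 10% with a random token, and leaves 10% unchanged; this sketch keeps only the basic masking idea.

```python
import random

MASK_TOKEN = "[MASK]"

def mask_tokens(tokens, mask_prob=0.15, seed=None):
    """Randomly hide tokens for masked language modeling.

    Returns (masked_tokens, labels), where labels hold the original
    token at each masked position and None elsewhere. The model is
    trained to predict the labels from the masked input.
    """
    rng = random.Random(seed)
    masked = list(tokens)
    labels = [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok          # target the model must recover
            masked[i] = MASK_TOKEN   # input shows the mask instead
    return masked, labels

if __name__ == "__main__":
    sentence = "the cat sat on the mat".split()
    masked, labels = mask_tokens(sentence, mask_prob=0.3, seed=0)
    print(masked)   # some tokens replaced by [MASK]
    print(labels)   # original tokens at the masked positions
```

In a real pre-training setup, the loss is computed only over the masked positions, so the model cannot simply copy the visible input.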

Related Terms:
- bert
- pre-training
- bidirectional
- masked-token

Source URLs:
- https://en.wikipedia.org/wiki/BERT_(language_model)#Masked_language_model

Tags:
- nlp
- pre-training
- language-models
- training-objectives

Status: active

Version: 1.0.0

Created At: 2025-08-31

Last Updated: 2025-08-31