Self-Attention

Entity Type: Glossary

ID: self-attention

Definition: An attention mechanism in which each element of a sequence attends to every element of the same sequence, including itself. The queries, keys, and values are all derived from the same input, which distinguishes it from cross-attention, where queries come from a different sequence. Self-attention is a core component of transformer architectures and enables modeling of long-range dependencies.
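For illustration, a minimal single-head sketch in NumPy of the standard scaled dot-product form, softmax(QKᵀ / √d_k)V. The function name and the projection matrices w_q, w_k, w_v are hypothetical names for this sketch, not any particular library's API; production implementations add batching, masking, dropout, and multiple heads.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Single-head scaled dot-product self-attention.

    x            : (seq_len, d_model) input sequence
    w_q, w_k, w_v: (d_model, d_k) learned projection matrices
    """
    # Queries, keys, and values all come from the SAME sequence x.
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    # (seq_len, seq_len) similarity of every position with every position,
    # scaled by sqrt(d_k) to keep the logits in a stable range.
    scores = q @ k.T / np.sqrt(k.shape[-1])
    scores -= scores.max(axis=-1, keepdims=True)     # numerically stable softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # each row sums to 1
    # Each output position is a weighted mix of all value vectors.
    return weights @ v

# Toy usage: a 5-token sequence with d_model=16 projected to d_k=8.
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 16))
w_q, w_k, w_v = (rng.normal(size=(16, 8)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)               # shape (5, 8)
```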

Related Terms:
- attention-mechanism
- transformer
- multi-head-attention
- query-key-value

Source Urls:
- https://en.wikipedia.org/wiki/Attention_(machine_learning)#Self-attention

Tags:
- attention
- transformers
- sequence-modeling

Status: active

Version: 1.0.0

Created At: 2025-08-31

Last Updated: 2025-08-31