
Transformers

Entity Type: Tags

ID: transformers

Category: capability

Description: Neural network architecture built on self-attention mechanisms; widely used for natural language processing and other sequence-to-sequence tasks, where it has become the dominant approach.
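For reference, below is a minimal, illustrative sketch of the single-head scaled dot-product self-attention operation that the description refers to. It is not part of this glossary entry; the function and weight names are hypothetical and the random inputs are for demonstration only.

```python
import numpy as np

def scaled_dot_product_self_attention(x, w_q, w_k, w_v):
    """Single-head self-attention over a sequence x of shape (seq_len, d_model)."""
    q = x @ w_q  # queries
    k = x @ w_k  # keys
    v = x @ w_v  # values
    d_k = k.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)  # pairwise similarity between positions
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over key positions
    return weights @ v  # attention-weighted sum of value vectors

# Illustrative usage with random projection weights
rng = np.random.default_rng(0)
seq_len, d_model = 4, 8
x = rng.normal(size=(seq_len, d_model))
w_q, w_k, w_v = (rng.normal(size=(d_model, d_model)) for _ in range(3))
out = scaled_dot_product_self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (4, 8)
```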

Aliases:
- transformer-architecture
- attention-models
- self-attention

Status: active

Version: 1.0.0

Created At: 2025-09-10

Last Updated: 2025-09-10
