cross-attention
English
Alternative forms
- cross attention
Etymology
From cross- + attention.
Noun
- (machine learning) A form of attention (machine learning method) in which two different input sequences are compared, i.e. the queries come from one sequence while the keys (and values) come from the other (see the sketch after the quotation below).
- Antonym: self-attention
- 2024, Benoit Liquet, Sarat Moka, Yoni Nazarathy, Mathematical Engineering of Deep Learning, CRC Press, page 286:
- The cross attention layer inside each transformer decoder block is in fact a multi-head cross attention layer […]
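For illustration, a minimal single-head sketch of the mechanism in NumPy, assuming scaled dot-product attention; the sequence lengths, dimensions, and weight matrices (W_q, W_k, W_v) are illustrative, and real transformer layers use multiple heads with learned parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(x_q, x_kv, W_q, W_k, W_v):
    """Single-head cross-attention: queries from x_q, keys/values from x_kv."""
    Q = x_q @ W_q    # (m, d_k): queries from the first sequence
    K = x_kv @ W_k   # (n, d_k): keys from the second sequence
    V = x_kv @ W_v   # (n, d_v): values from the second sequence
    scores = Q @ K.T / np.sqrt(Q.shape[-1])  # (m, n): each query scored against each key
    return softmax(scores, axis=-1) @ V      # (m, d_v): weighted mix of the other sequence

# Hypothetical example: a decoder sequence of length 4 attends to encoder outputs of length 6
d_model, d_k, d_v = 8, 8, 8
x_dec = rng.normal(size=(4, d_model))  # e.g. transformer decoder states
x_enc = rng.normal(size=(6, d_model))  # e.g. transformer encoder outputs
W_q, W_k, W_v = (rng.normal(size=(d_model, d)) for d in (d_k, d_k, d_v))
print(cross_attention(x_dec, x_enc, W_q, W_k, W_v).shape)  # (4, 8)
```

With x_kv set equal to x_q, the same function computes self-attention, the antonym listed above.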