cross-attention

From Wiktionary, the free dictionary
English

Alternative forms

  • cross attention

Etymology

From cross- + attention.

Noun

cross-attention (uncountable)

  1. (machine learning) A form of attention (machine learning method) in which two different input sequences interact, i.e. the queries are derived from one sequence while the keys (and values) are derived from the other.
    Antonym: self-attention
    • 2024, Benoit Liquet, Sarat Moka, Yoni Nazarathy, Mathematical Engineering of Deep Learning, CRC Press, page 286:
      The cross attention layer inside each transformer decoder block is in fact a multi-head cross attention layer […]
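The definition above can be sketched in code. The following is a minimal single-head illustration in NumPy, not any particular library's API: queries come from one sequence (e.g. decoder states) while keys and values come from another (e.g. encoder states). The function name, weight matrices, and dimensions are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(query_seq, context_seq, Wq, Wk, Wv):
    # Queries from one sequence; keys and values from the other —
    # this is what distinguishes cross-attention from self-attention,
    # where all three would come from the same sequence.
    Q = query_seq @ Wq
    K = context_seq @ Wk
    V = context_seq @ Wv
    scores = Q @ K.T / np.sqrt(Q.shape[-1])  # scaled dot-product scores
    return softmax(scores, axis=-1) @ V      # weighted sum of values

rng = np.random.default_rng(0)
d = 8
decoder_states = rng.normal(size=(4, d))  # hypothetical target-side tokens
encoder_states = rng.normal(size=(6, d))  # hypothetical source-side tokens
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))

out = cross_attention(decoder_states, encoder_states, Wq, Wk, Wv)
print(out.shape)  # (4, 8): one output vector per query token
```

Note that the output has one row per query token, regardless of the context sequence's length; a multi-head variant (as in the quotation above) would run several such attention operations in parallel and concatenate the results.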