Vocab
1092 entries

Attention Projection Matrix
2017

Matrix used in attention mechanisms within neural networks, particularly in transformer models, to project input vectors into query, key, and value vectors.

Generality: 625
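The projection described above can be sketched with NumPy. The dimensions, random initialization, and variable names here are illustrative assumptions, not values from the entry; in a trained transformer the projection matrices are learned parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration: 4 tokens, model width 8, head width 8.
seq_len, d_model, d_head = 4, 8, 8

X = rng.standard_normal((seq_len, d_model))   # input token vectors

# Attention projection matrices (randomly initialized here; learned in practice).
W_Q = rng.standard_normal((d_model, d_head))
W_K = rng.standard_normal((d_model, d_head))
W_V = rng.standard_normal((d_model, d_head))

# The same inputs are projected into three different roles:
Q = X @ W_Q   # queries: what each position is looking for
K = X @ W_K   # keys: what each position offers for matching
V = X @ W_V   # values: the content carried forward

print(Q.shape, K.shape, V.shape)   # (4, 8) (4, 8) (4, 8)
```

Separate matrices let the model compare positions (via Q and K) using a different learned view of the input than the one it aggregates (via V).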

Attention Block
2017

Core component in neural networks, particularly in transformers, designed to selectively focus on the most relevant parts of an input sequence when making predictions.

Generality: 835
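The selective focus described above can be sketched as scaled dot-product attention, the core of a transformer attention block. This is a minimal NumPy sketch under assumed dimensions; real attention blocks add multiple heads, an output projection, residual connections, and normalization.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)   # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)       # similarity of each query to each key
    weights = softmax(scores, axis=-1)    # each row sums to 1: a focus distribution
    return weights @ V                    # weighted mix of values per position

rng = np.random.default_rng(1)
Q = rng.standard_normal((4, 8))
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))

out = attention(Q, K, V)
print(out.shape)   # (4, 8)
```

Each output row is a convex combination of the value vectors, so positions with higher query-key similarity contribute more — this is the "selective focus on the most relevant parts of the input" the entry refers to.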