
Vinod Nair

ReLU (Rectified Linear Unit)
2000


Activation function commonly used in neural networks that outputs the input directly when it is positive and zero otherwise, i.e. f(x) = max(0, x).
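
A minimal sketch of the function in NumPy (the example input values are illustrative):

```python
import numpy as np

def relu(x):
    """Rectified Linear Unit: returns x where x > 0, else 0."""
    return np.maximum(0, x)

# Negative inputs are zeroed; positive inputs pass through unchanged.
print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))
# [0.  0.  0.  1.5 3. ]
```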

Generality: 855