Rectified Linear Unit (ReLU): Introduction and Uses in Machine Learning

Why it matters: The rectified linear activation function, or ReLU for short, is a piecewise linear function that outputs the input directly if it is positive; otherwise, it outputs zero.
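Mathematically, ReLU computes f(x) = max(0, x). As a minimal sketch of this behavior (the function name and example values below are illustrative, not taken from the article), it can be implemented in a few lines of Python:

```python
import numpy as np

def relu(x):
    """Rectified linear unit: returns x where x > 0, and 0 elsewhere."""
    return np.maximum(0.0, x)

# Illustrative example: negative inputs are clamped to zero,
# positive inputs pass through unchanged.
inputs = np.array([-3.0, -1.0, 0.0, 2.0, 5.0])
print(relu(inputs))  # [0. 0. 0. 2. 5.]
```

Because the output is zero for negative inputs and linear for positive ones, the function is cheap to compute and its gradient is either 0 or 1, which is one reason it is widely used as the default activation in deep networks.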

