ML Python · ~5 mins

Activation functions in ML Python - Cheat Sheet & Quick Revision

Recall & Review
beginner
What is an activation function in a neural network?
An activation function decides if a neuron should be activated or not. It adds non-linearity to the network, helping it learn complex patterns.
beginner
Why do we need non-linear activation functions?
Non-linear activation functions let neural networks learn and model complex data. Without them, the network would behave like a simple linear model.
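The "without them, the network behaves like a linear model" point can be checked directly: composing two linear layers with no activation collapses into a single linear layer. A minimal NumPy sketch (weights here are arbitrary illustrative values):

```python
import numpy as np

# Two stacked "layers" with NO activation function: y = W2 @ (W1 @ x)
W1 = np.array([[1.0, 2.0], [3.0, 4.0]])
W2 = np.array([[0.5, -1.0], [2.0, 0.0]])
x = np.array([1.0, -1.0])

two_layers = W2 @ (W1 @ x)

# The exact same mapping is a single linear layer with weights W2 @ W1,
# so depth adds no expressive power without a non-linearity in between.
one_layer = (W2 @ W1) @ x

print(np.allclose(two_layers, one_layer))  # True
```

Inserting any non-linear function between the two matrix multiplies breaks this collapse, which is why activation functions are essential.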
beginner
Name three common activation functions used in neural networks.
Three common activation functions are: Sigmoid, ReLU (Rectified Linear Unit), and Tanh (Hyperbolic Tangent).
beginner
What is the output range of the Sigmoid activation function?
The Sigmoid function outputs values between 0 and 1, making it useful for probabilities.
beginner
How does the ReLU activation function work?
ReLU outputs the input directly if it is positive; otherwise, it outputs zero. This helps the network learn faster and reduces the chance of vanishing gradients.
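The ReLU rule above ("pass positives through, zero out negatives") is a one-liner in NumPy; this sketch just illustrates that behaviour on a few sample inputs:

```python
import numpy as np

def relu(x):
    # ReLU(x) = max(0, x): positives pass through, negatives become zero
    return np.maximum(0, x)

xs = np.array([-3.0, -0.5, 0.0, 2.0])
print(relu(xs))  # [0. 0. 0. 2.]
```

Since the gradient is exactly 1 for positive inputs (rather than shrinking toward 0 as Sigmoid's does), gradients propagate through deep networks more reliably.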
What does an activation function add to a neural network?
A) More layers
B) Non-linearity
C) Data normalization
D) Weight initialization
Answer: B) Non-linearity
Which activation function outputs values between -1 and 1?
A) ReLU
B) Sigmoid
C) Tanh
D) Softmax
Answer: C) Tanh
What is a key advantage of ReLU over Sigmoid?
A) ReLU outputs probabilities
B) ReLU is non-differentiable
C) ReLU outputs values between 0 and 1
D) ReLU helps reduce vanishing gradients
Answer: D) ReLU helps reduce vanishing gradients
Which activation function is best for the output layer in binary classification?
A) Sigmoid
B) Tanh
C) ReLU
D) Linear
Answer: A) Sigmoid
What happens to negative inputs in the ReLU function?
A) They become zero
B) They are passed through unchanged
C) They become one
D) They become negative one
Answer: A) They become zero
Explain what an activation function is and why it is important in neural networks.
Think about how neurons decide to 'fire' or not.
Describe the differences between Sigmoid, Tanh, and ReLU activation functions.
Consider their output values and where they are commonly used.
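For the Sigmoid vs. Tanh vs. ReLU comparison, a quick numerical check of the three output ranges (a minimal NumPy sketch over sample inputs in [-5, 5]):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

xs = np.linspace(-5, 5, 11)
s, t, r = sigmoid(xs), np.tanh(xs), np.maximum(0, xs)

# Sigmoid stays in (0, 1), Tanh in (-1, 1), ReLU in [0, +inf)
print("sigmoid:", s.min(), s.max())
print("tanh:   ", t.min(), t.max())
print("relu:   ", r.min(), r.max())
```

In practice: Sigmoid suits binary-classification outputs, Tanh gives zero-centered hidden activations, and ReLU is the default for hidden layers in deep networks.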