
Activation functions in ML Python - Practice Problems & Coding Challenges

Challenge - 5 Problems
🧠 Conceptual
intermediate
Why use activation functions in neural networks?

What is the main reason to use activation functions in neural networks?

A. To convert outputs into probabilities only
B. To reduce the size of the input data
C. To speed up training by skipping some layers
D. To introduce non-linearity so the network can learn complex patterns
💡 Hint

Think about what would happen if the network was just a chain of linear operations.
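To see why non-linearity matters, here is a minimal sketch (with made-up weight shapes) showing that stacking linear layers without an activation collapses into a single linear transformation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "layers" with no activation function between them
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(2, 4))
x = rng.normal(size=3)

two_layer = W2 @ (W1 @ x)

# The composition is equivalent to one linear layer with W = W2 @ W1,
# so depth adds no expressive power without an activation in between.
W = W2 @ W1
one_layer = W @ x

print(np.allclose(two_layer, one_layer))  # True
```

Inserting any non-linear function (ReLU, tanh, sigmoid) between the two matrix multiplications breaks this equivalence, which is what lets deeper networks model non-linear patterns.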

Predict Output
intermediate
Output of ReLU activation function

What is the output of the ReLU function for the input array [-2, 0, 3]?

ML Python
import numpy as np
inputs = np.array([-2, 0, 3])
outputs = np.maximum(0, inputs)
print(outputs.tolist())
A. [0, 0, 3]
B. [-2, 0, 3]
C. [0, 0, 0]
D. [-2, 0, 0]
💡 Hint

ReLU outputs zero for negative inputs and the input itself if positive.

Hyperparameter
advanced
Choosing activation function for hidden layers

Which activation function is generally preferred for hidden layers in deep neural networks to avoid vanishing gradients?

A. ReLU
B. Tanh
C. Sigmoid
D. Linear
💡 Hint

Consider which function keeps gradients strong during backpropagation.
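A quick numerical sketch of the hint: sigmoid's derivative never exceeds 0.25 and decays toward zero for large inputs, while ReLU's derivative is exactly 1 for every positive input, so gradients pass through active ReLU units undiminished.

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def sigmoid_grad(x):
    s = sigmoid(x)
    return s * (1 - s)  # peaks at 0.25 at x=0, shrinks for large |x|

def relu_grad(x):
    return (x > 0).astype(float)  # exactly 1 for all positive inputs

xs = np.array([-4.0, -1.0, 0.0, 1.0, 4.0])
print(sigmoid_grad(xs).round(3))  # [0.018 0.197 0.25  0.197 0.018]
print(relu_grad(xs))              # [0. 0. 0. 1. 1.]
```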

Metrics
advanced
Effect of activation function on model accuracy

A model trained with sigmoid activation in hidden layers achieves 70% accuracy. After switching to ReLU, accuracy improves to 85%. What is the most likely reason?

A. ReLU reduces the number of model parameters
B. ReLU allows faster training and better gradient flow, improving accuracy
C. Sigmoid activation always produces lower accuracy regardless of data
D. Sigmoid causes the model to overfit the training data
💡 Hint

Think about how activation functions affect training dynamics and gradients.
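A rough back-of-the-envelope illustration, assuming a hypothetical 20-layer network and using sigmoid's worst-case per-layer derivative of 0.25: multiplying that bound across layers during backpropagation shrinks the gradient signal essentially to zero, while ReLU's active paths carry a factor of exactly 1 per layer.

```python
layers = 20

# Sigmoid's derivative never exceeds 0.25, so each layer can shrink
# the backpropagated gradient by at least a factor of 4.
sigmoid_gradient_bound = 0.25 ** layers

# ReLU's derivative is exactly 1 on the active (positive) side,
# so active paths pass gradients through unchanged.
relu_gradient_bound = 1.0 ** layers

print(f"sigmoid upper bound after {layers} layers: {sigmoid_gradient_bound:.2e}")
print(f"relu bound on active paths: {relu_gradient_bound}")
```

This is only an upper-bound sketch, not a full training simulation, but it shows why early layers in deep sigmoid networks barely learn, which limits the accuracy the model can reach.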

🔧 Debug
expert
Identifying error in activation function usage

The sigmoid function below is implemented incorrectly. What does the code output?

import numpy as np
def sigmoid(x):
    return 1 / (1 + np.exp(x))
print(sigmoid(1))
A. Outputs 0.0
B. Outputs 1.0
C. Outputs approximately 0.268
D. Outputs 0.5
💡 Hint

Check the formula for sigmoid and the sign inside the exponent.
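For comparison, a correct sigmoid negates the input inside the exponent. With the sign fixed, sigmoid(1) comes out to about 0.731 rather than the buggy version's value of about 0.269:

```python
import numpy as np

def sigmoid(x):
    # Correct formula: 1 / (1 + e^(-x)); the buggy version in the
    # question dropped the minus sign, computing 1 / (1 + e^x) instead.
    return 1 / (1 + np.exp(-x))

print(round(sigmoid(1), 3))   # 0.731
print(round(sigmoid(0), 3))   # 0.5 (the two versions agree only at x = 0)
print(round(sigmoid(-1), 3))  # 0.269, which is what the buggy version returns for +1
```

Note the symmetry: dropping the minus sign computes sigmoid(-x), so the buggy code's output for input 1 equals the correct sigmoid of -1.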