ML Python · ~10 mins

Activation functions in ML Python - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1: fill in the blank (easy)

Complete the code to apply the ReLU activation function to the input tensor.

ML Python
import torch
input_tensor = torch.tensor([-1.0, 0.0, 2.0, -3.0])
output = torch.nn.functional.[1](input_tensor)
A. tanh
B. sigmoid
C. softmax
D. relu
Common Mistakes
Using sigmoid instead of relu changes the output range from [0, ∞) to (0, 1).
Using softmax normalizes across the whole vector rather than applying an element-wise activation.
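A solved version of the snippet above, assuming the intended answer is relu, would behave like this:

```python
import torch

input_tensor = torch.tensor([-1.0, 0.0, 2.0, -3.0])
# relu clamps negative values to 0 and leaves non-negative values unchanged
output = torch.nn.functional.relu(input_tensor)
print(output)  # tensor([0., 0., 2., 0.])
```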
Task 2: fill in the blank (medium)

Complete the code to apply the sigmoid activation function to the input tensor.

ML Python
import torch
input_tensor = torch.tensor([-2.0, 0.0, 2.0])
output = torch.nn.functional.[1](input_tensor)
A. sigmoid
B. relu
C. softmax
D. tanh
Common Mistakes
Using relu instead of sigmoid changes the output range from (0, 1) to [0, ∞).
Using softmax normalizes across the whole vector rather than applying an element-wise activation.
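A solved version, assuming the intended answer is sigmoid, shows how every element is squashed into (0, 1), with sigmoid(0) landing exactly at 0.5:

```python
import torch

input_tensor = torch.tensor([-2.0, 0.0, 2.0])
# sigmoid maps each element into the open interval (0, 1)
output = torch.nn.functional.sigmoid(input_tensor)
print(output)  # tensor([0.1192, 0.5000, 0.8808])
```

Note the symmetry: sigmoid(-x) + sigmoid(x) == 1, which is why the first and last values sum to 1.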
Task 3: fill in the blank (hard)

Fix the error in the code by choosing the correct activation function to apply softmax over dimension 1.

ML Python
import torch
input_tensor = torch.tensor([[1.0, 2.0, 3.0], [1.0, 2.0, 3.0]])
output = torch.nn.functional.[1](input_tensor, dim=1)
A. sigmoid
B. softmax
C. relu
D. tanh
Common Mistakes
Using relu or sigmoid applies an element-wise function and does not normalize across a dimension.
Forgetting the dim parameter triggers a deprecation warning and can produce results over the wrong axis.
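A solved version, assuming the intended answer is softmax, demonstrates the normalization: with dim=1 each row is treated independently and sums to 1.

```python
import torch

input_tensor = torch.tensor([[1.0, 2.0, 3.0], [1.0, 2.0, 3.0]])
# softmax exponentiates and normalizes along dim=1, so each row sums to 1
output = torch.nn.functional.softmax(input_tensor, dim=1)
print(output.sum(dim=1))  # both row sums are (numerically) 1.0
```

Since both input rows are identical, both output rows are identical too.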
Task 4: fill in the blank (hard)

Fill both blanks to define a custom activation function that applies tanh and then scales the output by 2.

ML Python
import torch
input_tensor = torch.tensor([-1.0, 0.0, 1.0])
output = 2 * torch.nn.functional.[1](input_tensor) [2] 0
A. tanh
B. +
C. -
D. *
Common Mistakes
Using relu instead of tanh changes the output range from (-1, 1) to [0, ∞).
Using multiplication by 0 instead of adding 0 after scaling zeroes out the result.
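A solved version, assuming the blanks are tanh and +, shows the scaled activation. Adding 0 is a no-op, whereas multiplying by 0 would destroy the output:

```python
import torch

input_tensor = torch.tensor([-1.0, 0.0, 1.0])
# tanh squashes inputs into (-1, 1); scaling by 2 widens the range to (-2, 2)
output = 2 * torch.nn.functional.tanh(input_tensor) + 0
print(output)  # tensor([-1.5232, 0.0000, 1.5232])
```

Because tanh is odd (tanh(-x) == -tanh(x)), the output is symmetric around 0.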
Task 5: fill in the blank (hard)

Fill all three blanks to create a dictionary comprehension that maps each word to its sigmoid activation applied to its length.

ML Python
import torch
words = ['hi', 'hello', 'hey']
result = {word: torch.nn.functional.[1](torch.tensor(float(len(word)))) for word in words if len(word) [2] 2 and len(word) [3] 5}
A. sigmoid
B. >
C. <
D. ==
Common Mistakes
Using relu instead of sigmoid changes the output range from (0, 1) to [0, ∞).
Using equality checks instead of range conditions filters the wrong words.
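A solved version, assuming the blanks are sigmoid, >, and <, filters for words whose length lies strictly between 2 and 5. Note the explicit float conversion: sigmoid is not implemented for integer tensors, so passing len(word) directly would raise a RuntimeError.

```python
import torch

words = ['hi', 'hello', 'hey']
# keep only words with 2 < len(word) < 5; sigmoid needs a floating-point
# tensor, hence the explicit float(...) conversion
result = {word: torch.nn.functional.sigmoid(torch.tensor(float(len(word))))
          for word in words if len(word) > 2 and len(word) < 5}
print(result)  # only 'hey' (length 3) satisfies the filter
```

'hi' (length 2) fails the > 2 condition and 'hello' (length 5) fails the < 5 condition, leaving only 'hey' mapped to sigmoid(3.0).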