ML Python · ~10 mins

Backpropagation concept in ML Python - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1: Fill in the blank (easy)

Complete the code to calculate the error between predicted and actual output.

ML Python
error = predicted - [1]
A. actual
B. predicted
C. weights
D. bias
Common Mistakes
Using predicted in the blank makes the error always zero (predicted - predicted).
Using weights or bias here is incorrect; the error compares the prediction to the true value.
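A minimal runnable sketch of this step, using toy scalar values that are illustrative only (not from the exercise):

```python
# Error between predicted and actual output (toy scalar values).
predicted = 0.8
actual = 1.0
error = predicted - actual
print(error)  # about -0.2; negative means the prediction was too low
```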
Task 2: Fill in the blank (medium)

Complete the code to compute the gradient of the loss with respect to weights.

ML Python
gradient = error * [1]
A. bias
B. input
C. output
D. learning_rate
Common Mistakes
Using bias or learning rate instead of input.
Using output instead of input.
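A minimal sketch of this step, assuming a single linear unit with a squared-error-style loss, where d(loss)/d(weight) = error * input (toy scalar values, names assumed):

```python
# Gradient of the loss with respect to one weight of a linear unit.
input_value = 2.0
error = -0.2                    # predicted - actual from the previous step
gradient = error * input_value  # larger inputs produce larger gradients
print(gradient)                 # about -0.4
```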
Task 3: Fill in the blank (hard)

Complete the code to update the weights using gradient descent.

ML Python
weights = weights - [1] * gradient
A. input
B. gradient
C. error
D. learning_rate
Common Mistakes
Using the gradient alone (with no learning rate) can make the update step too large.
Using error or input directly in this blank is incorrect; learning_rate controls the step size.
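A minimal sketch of the update rule: step against the gradient, scaled by a small learning rate (toy scalar values assumed):

```python
# Gradient-descent weight update (toy scalar values).
learning_rate = 0.1
weights = 0.5
gradient = -0.4
weights = weights - learning_rate * gradient
print(weights)  # about 0.54; the weight moved opposite the gradient
```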
Task 4: Fill in the blank (hard)

Fill both blanks to calculate the derivative of the sigmoid activation function.

ML Python
sigmoid_derivative = [1] * (1 - [2])
A. sigmoid_output
B. input
C. weights
D. bias
Common Mistakes
Using input, weights, or bias instead of sigmoid output.
Using different variables in the two blanks.
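The key fact behind this task is that the sigmoid's derivative can be written entirely in terms of its own output: sigma'(x) = sigma(x) * (1 - sigma(x)). A minimal sketch (helper name assumed):

```python
import math

def sigmoid(x):
    """Standard logistic sigmoid."""
    return 1.0 / (1.0 + math.exp(-x))

# Same variable in both blanks: the derivative reuses the sigmoid's output.
sigmoid_output = sigmoid(0.0)                        # 0.5 at x = 0
sigmoid_derivative = sigmoid_output * (1 - sigmoid_output)
print(sigmoid_derivative)                            # 0.25 at x = 0
```

This is why backpropagation through a sigmoid is cheap: the forward-pass output is cached and reused, so no extra exponential is evaluated.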
Task 5: Fill in the blank (hard)

Fill all three blanks to update bias using gradient descent.

ML Python
bias = bias - [1] * [2] * [3]
A. learning_rate
B. error
C. 1
D. input
Common Mistakes
Using input instead of 1 for bias update.
Forgetting to multiply by learning rate or error.
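A minimal sketch of the bias update. The bias's "input" is the constant 1, so its gradient is error * 1 (toy scalar values, names assumed):

```python
# Gradient-descent bias update (toy scalar values).
learning_rate = 0.1
error = -0.2
bias = 0.3
bias = bias - learning_rate * error * 1  # the 1 is the bias's constant input
print(bias)  # about 0.32
```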