ML Python · ~10 mins

Gradient descent optimization in ML Python - Interactive Code Practice

Practice: 5 Tasks
Answer the questions below.

Task 1: fill in the blank (easy)

Complete the code to update the weight using gradient descent.

ML Python
weight = weight - [1] * gradient
A. epoch
B. momentum
C. learning_rate
D. loss
Common Mistakes
Using 'momentum' instead of 'learning_rate' for the step size.
Using 'loss', which is a measure of error, not a step size.
Using 'epoch', which counts iterations, not step size.
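The completed update rule can be watched in action on a tiny problem. A minimal sketch, assuming an identity model and illustrative starting values (weight = 5.0, target = 2.0 are not from the task):

```python
# Repeatedly apply the completed update rule: weight = weight - learning_rate * gradient
weight = 5.0
target = 2.0
learning_rate = 0.1

for _ in range(100):
    prediction = weight                    # identity model, for simplicity
    gradient = 2 * (prediction - target)   # d/dw of (weight - target)**2
    weight = weight - learning_rate * gradient  # the completed blank
```

Each step shrinks the distance to the target by a constant factor, so the weight converges toward 2.0.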
Task 2: fill in the blank (medium)

Complete the code to calculate the gradient of the loss with respect to the weight.

ML Python
gradient = 2 * (prediction - target) * [1]
A. learning_rate
B. weight
C. bias
D. epoch
Common Mistakes
Using 'learning_rate', which is unrelated to the gradient calculation.
Using 'bias', which is a separate parameter.
Using 'epoch', which counts iterations.
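Whatever fills the blank, the chain-rule pattern 2 * (prediction - target) * (d prediction / d weight) can be sanity-checked with finite differences. A minimal sketch, assuming a linear model prediction = weight * x (the values of x, target, and weight here are illustrative, not from the task):

```python
# Compare the analytic gradient of (prediction - target)**2 to a numeric estimate.
x, target = 3.0, 7.0
weight = 1.5

def loss(w):
    return (w * x - target) ** 2

prediction = weight * x
analytic = 2 * (prediction - target) * x   # for this model, d(prediction)/d(weight) = x

eps = 1e-6
numeric = (loss(weight + eps) - loss(weight - eps)) / (2 * eps)  # central difference
```

If the analytic formula is right, `analytic` and `numeric` agree to several decimal places.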
Task 3: fill in the blank (hard)

Fix the error in the code to correctly perform one step of gradient descent.

ML Python
weight = weight [1] learning_rate * gradient
A. -
B. +
C. *
D. /
Common Mistakes
Using '+', which moves the weight up the gradient, in the wrong direction.
Using '*' or '/', which scale the weight instead of taking a step along the gradient.
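The difference between '-' and '+' is easy to demonstrate on a simple quadratic loss. A minimal sketch with illustrative values (the helper `run` and its parameters are assumptions for this demo):

```python
import operator

def run(op, steps=20, lr=0.1, w=1.0, target=0.0):
    """Run gradient steps on loss (w - target)**2, combining w and the step with op."""
    for _ in range(steps):
        grad = 2 * (w - target)
        w = op(w, lr * grad)   # op is '-' (descent) or '+' (ascent)
    return w

good = run(operator.sub)   # weight = weight - learning_rate * gradient
bad = run(operator.add)    # weight = weight + learning_rate * gradient
```

With '-', the weight shrinks toward the target each step; with '+', every step moves it further away and the weight blows up.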
Task 4: fill in the blank (hard)

Fill both blanks to create a dictionary of squared errors for each data point whose absolute error is less than 1.

ML Python
squared_errors = {x: (y - prediction) [1] 2 for x, y in data.items() if abs(y - prediction) [2] 1}
A. **
B. <
C. >
D. -
Common Mistakes
Using '-' instead of '**' for squaring.
Using '>' instead of '<' for filtering.
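The completed comprehension can be run on a small sample. A minimal sketch; the `data` points and `prediction` value are illustrative assumptions:

```python
# Squared error per point, keeping only points with absolute error below 1.
prediction = 2.0
data = {0: 1.5, 1: 2.5, 2: 5.0}   # x -> y, assumed sample points

squared_errors = {x: (y - prediction) ** 2
                  for x, y in data.items()
                  if abs(y - prediction) < 1}
```

Here the point at x = 2 (error 3.0) is filtered out, leaving squared errors of 0.25 for the other two points.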
Task 5: fill in the blank (hard)

Fill all three blanks to create a dictionary of weights updated by gradient descent for each feature.

ML Python
updated_weights = {feature: weights[feature] [1] learning_rate * gradients[[2]] for feature in features if gradients[feature] [3] 0}
A. -
B. feature
C. >
D. +
Common Mistakes
Using '+' instead of '-' for the weight update.
Using '+' instead of '>' when filtering gradients.
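The completed comprehension can be exercised on a small feature set. A minimal sketch; the feature names, weights, and gradients below are illustrative assumptions (note the task's `> 0` filter only updates features with positive gradients):

```python
# Per-feature gradient-descent update, filtered to positive gradients as the task specifies.
learning_rate = 0.1
features = ["size", "age", "rooms"]
weights = {"size": 0.5, "age": -0.2, "rooms": 1.0}
gradients = {"size": 2.0, "age": 0.0, "rooms": -1.0}

updated_weights = {feature: weights[feature] - learning_rate * gradients[feature]
                   for feature in features
                   if gradients[feature] > 0}
```

Only "size" has a positive gradient, so the result holds a single entry: 0.5 - 0.1 * 2.0 = 0.3.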