TensorFlow · ~20 mins

L1 and L2 regularization in TensorFlow - Practice Problems & Coding Challenges

Challenge - 5 Problems
🧠 Conceptual
intermediate
Difference between L1 and L2 regularization

Which statement correctly describes the main difference between L1 and L2 regularization in machine learning models?

A. L2 regularization produces sparse models by setting weights to zero, while L1 regularization only shrinks weights without making them zero.
B. L1 regularization tends to produce sparse models by driving some weights exactly to zero, while L2 regularization shrinks weights but rarely makes them zero.
C. Both L1 and L2 regularization always set all weights to zero to prevent overfitting.
D. L1 regularization increases model complexity, while L2 regularization decreases it.
💡 Hint

Think about which regularization method helps with feature selection by removing some features completely.
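The hint can be made concrete with a small NumPy sketch (illustrative values only, not TensorFlow internals): the L1 penalty's proximal update soft-thresholds weights, zeroing small ones outright, while the L2 update only scales them toward zero.

```python
import numpy as np

# Hypothetical weight vector (illustrative values, not from a real model)
w = np.array([0.8, -0.05, 0.0, 0.3, -0.001])
lam = 0.1

# L1's soft-thresholding step drives small weights exactly to zero
l1_step = np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

# L2's shrinkage step only scales weights; nonzero stays nonzero
l2_step = w / (1.0 + lam)

print((l1_step == 0).sum())  # 3 weights zeroed by L1
print((l2_step == 0).sum())  # only the weight that was already zero
```

This is why L1 is associated with feature selection: features whose weights reach exactly zero are effectively removed.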

Predict Output
intermediate
Output shape after applying L2 regularization in TensorFlow

What will be the shape of the weights after creating a Dense layer with 5 units and L2 regularization in TensorFlow?

TensorFlow
import tensorflow as tf
layer = tf.keras.layers.Dense(5, kernel_regularizer=tf.keras.regularizers.l2(0.01), input_shape=(10,))
model = tf.keras.Sequential([layer])
model.build()
weights_shape = model.layers[0].kernel.shape
print(weights_shape)
A. (10, 5)
B. (5, 10)
C. (5,)
D. (10,)
💡 Hint

Remember: a Dense layer's kernel has shape (input_features, units).
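The shape convention follows from the math a Dense layer performs, y = xW + b, which can be checked with plain NumPy (a sketch of the computation, not the TensorFlow implementation):

```python
import numpy as np

# A Dense layer computes y = x @ W + b, so W must map
# input_features -> units, i.e. shape (input_features, units).
input_features, units = 10, 5
W = np.zeros((input_features, units))
b = np.zeros(units)
x = np.ones((1, input_features))  # one sample with 10 features
y = x @ W + b
print(W.shape, y.shape)  # (10, 5) (1, 5)
```

Note that the regularizer only adds a penalty to the loss; it does not change the kernel's shape.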

Hyperparameter
advanced
Choosing regularization strength

You train a neural network and notice it overfits the training data. Which change to the regularization parameter lambda (strength) will most likely reduce overfitting?

A. Increase lambda to add stronger regularization.
B. Decrease lambda to reduce regularization strength.
C. Set lambda to zero to remove regularization.
D. Keep lambda unchanged because it does not affect overfitting.
💡 Hint

Think about how regularization controls model complexity and overfitting.
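To see the mechanism, note that the penalty added to the loss grows with lambda, so the optimizer is pushed harder toward small weights. A minimal sketch with illustrative weight values:

```python
import numpy as np

# Fixed hypothetical weights; only lambda varies
w = np.array([1.0, -2.0, 0.5])

for lam in (0.0, 0.01, 0.1):
    penalty = lam * np.sum(w ** 2)  # L2 penalty term added to the loss
    print(lam, penalty)
```

A larger lambda makes the same weights more costly, so training trades some fit on the training data for smaller weights, reducing overfitting.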

Metrics
advanced
Effect of L1 regularization on model sparsity metric

After training a model with L1 regularization, which metric would best show that many weights are exactly zero?

A. Sum of all weights regardless of value.
B. Mean squared error on training data.
C. Count of weights equal to zero (sparsity count).
D. Accuracy on validation data.
💡 Hint

Sparsity means many weights are zero. Which metric directly measures that?
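A sparsity count is straightforward to compute. A sketch using a hypothetical weight array (illustrative values, not a trained model):

```python
import numpy as np

# Hypothetical weights after L1-regularized training
weights = np.array([0.0, 0.7, 0.0, -0.2, 0.0, 0.0])

sparsity_count = int(np.sum(weights == 0))   # how many weights are exactly zero
sparsity_ratio = sparsity_count / weights.size
print(sparsity_count, sparsity_ratio)  # 4 of 6 weights are zero
```

Loss and accuracy metrics say nothing about individual weights, which is why only the zero-count directly measures sparsity.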

🔧 Debug
expert
Identifying error in TensorFlow model with combined L1 and L2 regularization

Consider this TensorFlow code snippet that tries to apply both L1 and L2 regularization to a Dense layer. What error will it raise?

TensorFlow
import tensorflow as tf
layer = tf.keras.layers.Dense(4, kernel_regularizer=tf.keras.regularizers.l1_l2(l1=0.01, l2=0.01))
model = tf.keras.Sequential([layer])
model.build(input_shape=(None, 8))
print(model.layers[0].kernel_regularizer.l1)
A. SyntaxError due to incorrect function call.
B. TypeError because l1_l2 does not accept keyword arguments.
C. No error; prints 0.01 as expected.
D. AttributeError because 'kernel_regularizer' has no attribute 'l1'.
💡 Hint

Check the attributes available on the l1_l2 regularizer object.
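As a mental model, the combined regularizer can be sketched as a small penalty object in plain Python (an assumption-laden simplification of TensorFlow's regularizer, which does expose `l1` and `l2` factors; the class below is hypothetical, not the real implementation):

```python
import numpy as np

class L1L2:
    """Minimal elastic-net penalty sketch: l1*sum|w| + l2*sum(w^2)."""

    def __init__(self, l1=0.0, l2=0.0):
        # Store the factors as plain attributes, mirroring the
        # l1/l2 attributes the quiz snippet reads back
        self.l1 = float(l1)
        self.l2 = float(l2)

    def __call__(self, w):
        return self.l1 * np.sum(np.abs(w)) + self.l2 * np.sum(w ** 2)

reg = L1L2(l1=0.01, l2=0.01)
print(reg.l1)                      # 0.01, as in the quiz snippet
print(reg(np.array([1.0, -1.0])))  # 0.01*2 + 0.01*2 = 0.04
```

Because the factors are stored as readable attributes, accessing `.l1` on the regularizer object raises no error.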