Complete the code to add L2 regularization to a Dense layer in TensorFlow.
layer = tf.keras.layers.Dense(64, kernel_regularizer=tf.keras.regularizers.[1](0.01))
The l2 function applies L2 regularization, which adds a penalty proportional to the square of the weights.
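This penalty can be seen directly by calling the regularizer on a tensor; a minimal sketch (the weight values and the 0.01 strength are illustrative, not from the exercise):

```python
import tensorflow as tf

# Hypothetical weights; l2(0.01) evaluates to 0.01 * sum(w**2).
reg = tf.keras.regularizers.l2(0.01)
weights = tf.constant([1.0, -2.0, 3.0])
penalty = float(reg(weights))  # 0.01 * (1 + 4 + 9) = 0.14
```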
Complete the code to add L1 regularization with strength 0.005 to a Dense layer.
layer = tf.keras.layers.Dense(32, kernel_regularizer=tf.keras.regularizers.[1](0.005))
The l1 function applies L1 regularization, which adds a penalty proportional to the absolute value of the weights.
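The absolute-value penalty can be checked the same way; a minimal sketch with illustrative weights:

```python
import tensorflow as tf

# Hypothetical weights; l1(0.005) evaluates to 0.005 * sum(|w|).
reg = tf.keras.regularizers.l1(0.005)
weights = tf.constant([1.0, -2.0, 3.0])
penalty = float(reg(weights))  # 0.005 * (1 + 2 + 3) = 0.03
```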
Complete the code to apply both L1 and L2 regularization together.
layer = tf.keras.layers.Dense(128, kernel_regularizer=tf.keras.regularizers.[1](l1=0.001, l2=0.002))
The l1_l2 function allows applying both L1 and L2 regularization together by specifying both penalties.
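The combined regularizer simply sums the two terms; a minimal sketch with illustrative weights:

```python
import tensorflow as tf

# Hypothetical weights; the combined penalty is
# 0.001 * sum(|w|) + 0.002 * sum(w**2).
reg = tf.keras.regularizers.l1_l2(l1=0.001, l2=0.002)
weights = tf.constant([1.0, -2.0, 3.0])
penalty = float(reg(weights))  # 0.001*6 + 0.002*14 = 0.034
```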
Fill both blanks to create a model with L1 regularization on the first layer and L2 on the second.
model = tf.keras.Sequential([
tf.keras.layers.Dense(64, kernel_regularizer=tf.keras.regularizers.[1](0.01), activation='relu'),
tf.keras.layers.Dense(10, kernel_regularizer=tf.keras.regularizers.[2](0.02), activation='softmax')
])
The first layer uses l1 regularization to encourage sparsity, and the second uses l2 to penalize large weights.
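Once such a model is built, Keras collects one penalty tensor per regularized kernel in model.losses; a minimal sketch (the 20-feature input shape is illustrative):

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, kernel_regularizer=tf.keras.regularizers.l1(0.01),
                          activation='relu'),
    tf.keras.layers.Dense(10, kernel_regularizer=tf.keras.regularizers.l2(0.02),
                          activation='softmax'),
])
model.build(input_shape=(None, 20))  # 20 input features, illustrative

# One regularization-loss tensor per regularized kernel.
print(len(model.losses))
```

These penalties are added automatically to the training loss when the model is compiled and fit.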
Fill all three blanks to create a dictionary comprehension that maps layer names to their regularization type strings ('L1', 'L2', or 'L1_L2').
layers = {'dense1': tf.keras.regularizers.l1(0.01), 'dense2': tf.keras.regularizers.l2(0.02), 'dense3': tf.keras.regularizers.l1_l2(l1=0.001, l2=0.002)}
reg_types = {name: [1] for name, reg in layers.items() if isinstance(reg, [2])}
reg_types.update({name: 'L2' for name, reg in layers.items() if isinstance(reg, [3])})
This code maps layer names to strings describing their regularization type by checking the instance type of each regularizer.
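For reference, one way to perform this kind of type check is with the regularizer classes tf.keras.regularizers.L1, L2, and L1L2, which the factory functions l1, l2, and l1_l2 return instances of in recent TF/Keras versions (older versions returned L1L2 for all three, so this sketch checks the specific classes first and is version-dependent). The helper name reg_type is hypothetical:

```python
import tensorflow as tf

layers = {
    'dense1': tf.keras.regularizers.l1(0.01),
    'dense2': tf.keras.regularizers.l2(0.02),
    'dense3': tf.keras.regularizers.l1_l2(l1=0.001, l2=0.002),
}

def reg_type(reg):
    # Check the specific classes before the combined one, since the
    # class hierarchy has varied across Keras versions.
    if isinstance(reg, tf.keras.regularizers.L1):
        return 'L1'
    if isinstance(reg, tf.keras.regularizers.L2):
        return 'L2'
    if isinstance(reg, tf.keras.regularizers.L1L2):
        return 'L1_L2'
    return 'unknown'

reg_types = {name: reg_type(reg) for name, reg in layers.items()}
print(reg_types)
```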