Computer Vision · ~10 mins

ResNet and skip connections in Computer Vision - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1: Fill in the blank (Easy)

Complete the code to add a skip connection by summing the input tensor with the output tensor.

Computer Vision
def residual_block(x):
    out = some_layer(x)
    out = some_activation(out)
    return out [1] x
A. *
B. -
C. +
D. /
Common Mistakes
Using subtraction or multiplication instead of addition.
Forgetting to add the input tensor back.
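The idea being tested here can be sketched outside Keras. A minimal NumPy version, where the hypothetical `f` stands in for the convolution and activation layers:

```python
import numpy as np

def residual_block(x, f):
    # f is any shape-preserving transformation (e.g. conv + ReLU)
    out = f(x)
    return out + x  # addition: the input is carried around the block

x = np.ones(4)
y = residual_block(x, lambda t: 2 * t)
# y == f(x) + x == 2*x + x, so the input really is added back
```

Because the skip path is a sum, gradients flow through it unchanged, which is the point of the residual design.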
Task 2: Fill in the blank (Medium)

Complete the code to apply batch normalization after the convolution layer in the residual block.

def residual_block(x):
    out = Conv2D(64, (3, 3), padding='same')(x)
    out = [1](out)
    out = Activation('relu')(out)
    return out + x
A. BatchNormalization
B. Dropout
C. MaxPooling2D
D. Flatten
Common Mistakes
Using Dropout or pooling layers instead of normalization.
Placing BatchNormalization after activation instead of before.
Task 3: Fill in the blank (Hard)

Fix the error in the code to correctly implement a skip connection when the input and output shapes differ.

def residual_block(x):
    out = Conv2D(128, (3, 3), strides=2, padding='same')(x)
    out = BatchNormalization()(out)
    out = Activation('relu')(out)
    shortcut = [1](x)
    return out + shortcut
A. MaxPooling2D(pool_size=(2, 2))
B. Conv2D(128, (1, 1), strides=2, padding='same')
C. Dense(128)
D. Flatten()
Common Mistakes
Using pooling or dense layers, which reshape the tensor incorrectly.
Not adjusting the shortcut shape, causing a shape mismatch error at the addition.
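To see why the shortcut needs a projection here: with `strides=2` and 128 filters, the main path halves the spatial size and doubles the channel count, so the shortcut must do the same before the two tensors can be added. A NumPy sketch of what a 1×1, stride-2 convolution accomplishes (the random weights are stand-ins for learned ones):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8, 64))     # input feature map (H, W, C)
out = rng.standard_normal((4, 4, 128))  # main path output: H and W halved, C doubled

# A 1x1 conv with stride 2 amounts to spatial striding plus a
# per-pixel channel projection:
w = rng.standard_normal((64, 128)) * 0.1
shortcut = x[::2, ::2, :] @ w           # shape (4, 4, 128): now matches the main path

y = out + shortcut                      # the addition succeeds
```

Pooling alone would halve the spatial size but leave 64 channels, and `Dense`/`Flatten` would destroy the spatial layout entirely, which is why only the 1×1 convolution option works.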
Task 4: Fill in the blank (Hard)

Fill both blanks to create a residual block that applies two convolution layers with batch normalization and adds the skip connection.

def residual_block(x):
    out = Conv2D(64, (3, 3), padding='same')(x)
    out = [1](out)
    out = Activation('relu')(out)
    out = Conv2D(64, (3, 3), padding='same')(out)
    out = [2](out)
    return out + x
A. BatchNormalization
B. Dropout
C. MaxPooling2D
D. Flatten
Common Mistakes
Using Dropout or pooling layers instead of BatchNormalization.
Skipping normalization after the second convolution.
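The normalization step both blanks ask for can be sketched in NumPy. This is a simplified, inference-style batch norm without the learned scale and shift parameters that the real Keras layer carries:

```python
import numpy as np

def batch_norm(x, eps=1e-5):
    # Normalize each channel over the batch and spatial axes,
    # as BatchNormalization does (minus the learned gamma/beta).
    mean = x.mean(axis=(0, 1, 2), keepdims=True)
    var = x.var(axis=(0, 1, 2), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

x = np.random.default_rng(1).standard_normal((2, 8, 8, 64))
out = batch_norm(x)
# each channel of out is roughly zero-mean with unit variance
```

Note the placement in the task: normalization goes after each convolution and before the activation, matching the original ResNet ordering.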
Task 5: Fill in the blank (Hard)

Fill all three blanks to define a ResNet block that adjusts the shortcut path and applies two convolution layers with batch normalization.

def resnet_block(x):
    shortcut = [1](x)
    out = Conv2D(128, (3, 3), strides=2, padding='same')(x)
    out = [2](out)
    out = Activation('relu')(out)
    out = Conv2D(128, (3, 3), padding='same')(out)
    out = [3](out)
    return out + shortcut
A. Conv2D(128, (1, 1), strides=2, padding='same')
B. BatchNormalization
D. MaxPooling2D(pool_size=(2, 2))
Common Mistakes
Using pooling instead of a 1×1 convolution for the shortcut.
Skipping the batch normalization layers.
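Putting all three pieces together, here is a NumPy sketch of the full block. `conv_stub` is a hypothetical stand-in for `Conv2D` (spatial striding plus a per-pixel channel projection with random weights), and `batch_norm` is a simplified normalization without learned parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

def batch_norm(x, eps=1e-5):
    # per-channel normalization over the spatial axes (no learned gamma/beta)
    mu = x.mean(axis=(0, 1), keepdims=True)
    var = x.var(axis=(0, 1), keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def conv_stub(x, out_ch, stride=1):
    # stand-in for Conv2D: stride spatially, then project the channels
    w = rng.standard_normal((x.shape[-1], out_ch)) * 0.1
    return x[::stride, ::stride, :] @ w

def resnet_block(x):
    shortcut = conv_stub(x, 128, stride=2)   # blank [1]: 1x1 projection shortcut
    out = conv_stub(x, 128, stride=2)
    out = batch_norm(out)                    # blank [2]: normalize before activation
    out = np.maximum(out, 0)                 # ReLU
    out = conv_stub(out, 128)
    out = batch_norm(out)                    # blank [3]: normalize after second conv
    return out + shortcut

y = resnet_block(rng.standard_normal((8, 8, 64)))
# both paths end at shape (4, 4, 128), so the addition is valid
```

The key invariant: whatever striding and channel change the main path applies, the shortcut must apply too, or the final addition fails with a shape mismatch.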