Computer Vision · ~20 mins

ResNet and skip connections in Computer Vision - Practice Problems & Coding Challenges

Challenge - 5 Problems
🧠 Conceptual · Intermediate
Why do ResNet models use skip connections?

ResNet introduced skip connections to make very deep neural networks trainable. What is the main reason for using these skip connections?

A. To allow gradients to flow directly and avoid vanishing gradient problems in very deep networks.
B. To reduce the number of parameters by skipping some layers during training.
C. To increase the size of the input images automatically.
D. To replace convolutional layers with fully connected layers.
💡 Hint

Think about what happens to gradients when networks get very deep.
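The hint can be made concrete with a toy chain-rule calculation (plain Python, no frameworks). The layer gain of 0.5 is an arbitrary stand-in for a layer whose Jacobian shrinks the gradient; this is an illustration of the arithmetic, not ResNet itself:

```python
# Toy chain-rule illustration: each "layer" scales the gradient by a
# factor g; the assumed g = 0.5 stands in for a layer that shrinks it.
depth, g = 20, 0.5

plain_grad = g ** depth        # plain stack: per-layer factors multiply and vanish
resid_grad = (1 + g) ** depth  # skip connection adds an identity term (1 + g) per layer

print(plain_grad)   # 0.5**20 ≈ 9.5e-07: effectively vanished
print(resid_grad)   # 1.5**20 ≈ 3.3e+03: the identity path keeps gradient signal alive
```

The key point is the "1 +" in each residual factor: even if the layer itself contributes little, the product through the identity path never collapses toward zero.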

Predict Output · Intermediate
Output shape after a ResNet skip connection block

Consider a ResNet block where the input tensor has shape (batch_size=32, height=64, width=64, channels=64). The block applies two convolution layers with padding='same' and keeps the number of channels the same. What will be the output shape after adding the skip connection?

import tensorflow as tf
input_tensor = tf.random.normal([32, 64, 64, 64])
conv1 = tf.keras.layers.Conv2D(64, 3, padding='same', activation='relu')(input_tensor)
conv2 = tf.keras.layers.Conv2D(64, 3, padding='same')(conv1)
output = tf.keras.layers.Add()([input_tensor, conv2])
print(output.shape)
A. (32, 32, 32, 64)
B. (32, 62, 62, 64)
C. (32, 64, 64, 128)
D. (32, 64, 64, 64)
💡 Hint

Padding='same' keeps height and width unchanged. The skip connection adds tensors of the same shape.
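The hint follows from the standard convolution output-size formula. A small helper (the name `conv2d_out_size` is ours, not a library function) makes the 'same' vs. 'valid' arithmetic explicit:

```python
def conv2d_out_size(size, kernel, stride=1, padding=0):
    # Standard convolution arithmetic: floor((n + 2p - k) / s) + 1
    return (size + 2 * padding - kernel) // stride + 1

# 'same' padding for a 3x3 kernel at stride 1 corresponds to padding=1:
print(conv2d_out_size(64, kernel=3, padding=1))  # 64: height/width preserved
# Without padding ('valid'), each 3x3 convolution shrinks the map:
print(conv2d_out_size(64, kernel=3, padding=0))  # 62
```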

Model Choice · Advanced
Choosing the correct ResNet block for channel mismatch

In ResNet, when the input and output channels differ, a skip connection cannot be a simple addition. Which option correctly handles this channel mismatch?

A. Add zero padding channels to the input tensor to match output channels.
B. Use max pooling on the output to reduce channels before addition.
C. Use a 1x1 convolution on the input to match output channels before addition.
D. Skip the addition and concatenate input and output tensors instead.
💡 Hint

Think about how to change the input tensor shape to match the output tensor shape for addition.
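One useful background fact: a 1x1 convolution is just a per-pixel linear map over channels, so it can change the channel count while leaving height and width alone. A framework-free sketch (the `conv1x1` helper is hypothetical, bias omitted):

```python
# A 1x1 convolution applies the same channel-mixing matrix at every pixel:
# x is nested lists shaped [C_in][H][W]; w is shaped [C_out][C_in].
def conv1x1(x, w):
    c_in, h, wdt = len(x), len(x[0]), len(x[0][0])
    return [[[sum(w[o][i] * x[i][r][c] for i in range(c_in))
              for c in range(wdt)]
             for r in range(h)]
            for o in range(len(w))]

x = [[[1, 2], [3, 4]],          # channel 0 (2x2)
     [[5, 6], [7, 8]]]          # channel 1 (2x2)
w = [[1, 0], [0, 1], [1, 1]]    # maps 2 input channels -> 3 output channels
y = conv1x1(x, w)
print(len(y), len(y[0]), len(y[0][0]))  # 3 2 2: channels changed, H and W unchanged
```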

Metrics · Advanced
Effect of skip connections on training loss curves

When training a very deep ResNet with skip connections, how does the training loss curve typically compare to a similar deep network without skip connections?

A. The ResNet with skip connections shows slower loss decrease and higher final training loss.
B. The ResNet with skip connections usually shows faster loss decrease and lower final training loss.
C. Both networks show identical loss curves because skip connections do not affect training.
D. The network without skip connections converges faster due to its simpler architecture.
💡 Hint

Consider how skip connections help gradients during backpropagation.

🔧 Debug · Expert
Identifying the error in a ResNet skip connection implementation

Examine the following PyTorch code snippet for a ResNet block with a skip connection. What error will occur when running this code?

import torch
import torch.nn as nn

class ResNetBlock(nn.Module):
    def __init__(self, in_channels, out_channels):
        super().__init__()
        self.conv1 = nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(out_channels, out_channels, kernel_size=3, padding=1)
        self.relu = nn.ReLU()

    def forward(self, x):
        out = self.relu(self.conv1(x))
        out = self.conv2(out)
        out += x  # skip connection
        out = self.relu(out)
        return out

block = ResNetBlock(64, 128)
input_tensor = torch.randn(1, 64, 32, 32)
output = block(input_tensor)
A. RuntimeError due to shape mismatch when adding tensors of different channel sizes.
B. SyntaxError because of a missing colon in the class definition.
C. TypeError because ReLU cannot be applied to convolution outputs.
D. No error; the code runs correctly and outputs a tensor of shape (1, 128, 32, 32).
💡 Hint

Check the shapes of tensors before the addition in the forward method.
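For reference after you have answered: one common way to repair a block like this is a projection shortcut, sketched below with the same layer layout as the snippet. The `proj` attribute is our addition for illustration, not part of the original code:

```python
import torch
import torch.nn as nn

class ResNetBlock(nn.Module):
    def __init__(self, in_channels, out_channels):
        super().__init__()
        self.conv1 = nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(out_channels, out_channels, kernel_size=3, padding=1)
        self.relu = nn.ReLU()
        # Projection shortcut: a 1x1 convolution maps the input to the
        # output channel count so the residual addition is well-defined.
        self.proj = (nn.Conv2d(in_channels, out_channels, kernel_size=1)
                     if in_channels != out_channels else nn.Identity())

    def forward(self, x):
        out = self.relu(self.conv1(x))
        out = self.conv2(out)
        out = out + self.proj(x)  # shapes now agree on the channel axis
        return self.relu(out)

block = ResNetBlock(64, 128)
output = block(torch.randn(1, 64, 32, 32))
print(output.shape)  # torch.Size([1, 128, 32, 32])
```

With matching channel counts the projection collapses to `nn.Identity()`, recovering the plain additive skip from the original snippet.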