Computer Vision · ~5 mins

ResNet and skip connections in Computer Vision - Cheat Sheet & Quick Revision

Recall & Review
beginner
What is the main problem ResNet aims to solve in deep neural networks?
ResNet addresses the problem of vanishing gradients and degradation in very deep networks, where adding more layers makes training harder and accuracy worse.
beginner
What is a skip connection in ResNet?
A skip connection is a shortcut that bypasses one or more layers by directly adding the input of those layers to their output, helping gradients flow better during training.
intermediate
How does a skip connection help during backpropagation?
Skip connections allow gradients to flow directly through the shortcut, reducing the chance of gradients becoming too small and enabling easier training of deep networks.
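The gradient argument above can be checked numerically: for a residual block y = F(x) + x, the derivative is F'(x) + 1, so even when F'(x) is nearly zero the gradient through the block stays close to 1. A minimal sketch in plain Python (the toy function F and its tiny slope are illustrative, not from any real network):

```python
# Sketch: gradient through a residual block y = F(x) + x.
# F is a toy "layer" whose derivative is deliberately tiny,
# mimicking a layer that would cause vanishing gradients.

def F(x):
    return 1e-6 * x  # toy residual branch with near-zero slope

def dF(x):
    return 1e-6      # derivative of F

x = 2.0

# Plain (non-residual) block: gradient is just F'(x) -> vanishes.
plain_grad = dF(x)

# Residual block: d/dx [F(x) + x] = F'(x) + 1 -> stays near 1.
residual_grad = dF(x) + 1.0

print(plain_grad)     # 1e-06
print(residual_grad)  # stays close to 1
```

Stacking many such blocks multiplies gradients of the form (F'(x) + 1) rather than bare F'(x) terms, which is why depth stops being a training obstacle.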
intermediate
In ResNet, what is the typical operation performed when combining the input and output in a skip connection?
The input is added element-wise to the output of the stacked layers, forming a residual block output: output = F(input) + input.
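The residual-block formula output = F(input) + input can be sketched in a few lines of NumPy. This is a simplified sketch: dense layers stand in for the convolution + batch-norm stacks of a real ResNet, and all names and shapes are illustrative. Note the identity property: if the residual branch outputs zeros, the block simply passes its input through the activation.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def residual_block(x, W1, W2):
    """Minimal residual block: output = relu(F(x) + x),
    where F is two linear layers with a ReLU in between."""
    fx = W2 @ relu(W1 @ x)   # residual mapping F(x)
    return relu(fx + x)      # element-wise add via the shortcut

rng = np.random.default_rng(0)
d = 4
x = rng.standard_normal(d)
W1 = rng.standard_normal((d, d)) * 0.1
W2 = rng.standard_normal((d, d)) * 0.1

y = residual_block(x, W1, W2)
print(y.shape)  # (4,)
```

Because the shortcut is a plain element-wise addition, the input and the output of F must have the same shape; real ResNets insert a 1x1 convolution on the shortcut when the dimensions change.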
intermediate
Why are ResNet models often deeper than traditional CNNs?
Because skip connections help avoid training problems like vanishing gradients, ResNet can safely add many more layers, improving feature learning and accuracy.
What problem do skip connections in ResNet primarily address?
A. Batch normalization
B. Vanishing gradients
C. Data augmentation
D. Overfitting

In a ResNet residual block, how is the output computed?
A. Output = F(input) + input
B. Output = input * F(input)
C. Output = F(input) - input
D. Output = input / F(input)

Why can ResNet have hundreds of layers without training issues?
A. Because it uses dropout
B. Because it uses small batch sizes
C. Because of skip connections
D. Because it uses ReLU activation only

What does the 'F' represent in the ResNet formula output = F(input) + input?
A. The identity function
B. The input data
C. The loss function
D. The residual mapping learned by the layers

Which of these is NOT a benefit of skip connections?
A. Increasing model overfitting
B. Reduced training time
C. Avoiding vanishing gradients
D. Improved gradient flow
Explain in your own words what a skip connection is and why it helps deep neural networks.
Think about how information flows in a network and what happens when it can skip some layers.
Describe the structure of a ResNet residual block and how it differs from a traditional convolutional block.
Focus on the addition operation and the shortcut path.