Deep Learning: Conv Nets

Quiz • Mathematics • University • Hard • Standards-aligned

Josiah Wang
10 questions
1.
MULTIPLE CHOICE QUESTION
1 min • 1 pt
How is shift invariance achieved in ConvNets?
Through convolutional equivariance
Through convolutional equivariance and approximate translation invariance with pooling
Through convolutional equivariance and exact pooling invariance
They exist in a higher dimensional invariant space
Answer explanation
The convolutional layers are shift equivariant. If an input image is shifted a little bit, the convolutional filters will produce the same response at the shifted location. The pooling layers are approximately shift invariant. For example, if an input image is shifted a little bit under a max pooling layer, the maximum value will still be the same. Overall, given an input image x, a shift S, a shift equivariant convolutional layer f, and a shift invariant pooling layer g, the ConvNet g(f(x)) is shift invariant because g(f(Sx)) = g(Sf(x)) = g(f(x)). (see Note02)
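A minimal sketch (assuming PyTorch) of the argument above: the convolution f is shift-equivariant, global max pooling g is approximately shift-invariant, so g(f(x)) barely changes when x is shifted. The random image, 3x3 filter, and global max pool are illustrative choices, not part of the quiz.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

conv = nn.Conv2d(1, 8, kernel_size=3, padding=1, bias=False)  # f: shift-equivariant
pool = nn.AdaptiveMaxPool2d(1)                                 # g: global max pool, ~shift-invariant

x = torch.randn(1, 1, 32, 32)           # input image x
Sx = torch.roll(x, shifts=2, dims=-1)   # shifted input Sx (2 pixels to the right)

with torch.no_grad():
    feat, feat_shifted = conv(x), conv(Sx)
    # Equivariance: convolving the shifted image matches shifting the feature map
    # (interior columns compared to ignore boundary effects).
    diff = (torch.roll(feat, shifts=2, dims=-1) - feat_shifted)[..., 4:-4]
    print(diff.abs().max())                                # ~0
    # Approximate invariance: pooled responses are (nearly) identical.
    print((pool(feat) - pool(feat_shifted)).abs().max())   # small, often exactly 0
```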
2.
MULTIPLE SELECT QUESTION
45 sec • 1 pt
Why do we include dropout in the network architecture?
Offers regularization and helps build deeper networks
Can help with uncertainty estimation through Monte-Carlo use
Increases the capacity of the model
Prevents vanishing gradients
None of these
Answer explanation
Dropout randomly drops units (and their connections) during training. It offers regularization by preventing neurons from co-adapting, i.e. relying on specific combinations of other features, which helps reduce overfitting. In addition, Monte Carlo dropout can be used as a Bayesian approximation to estimate uncertainties in neural networks (see https://arxiv.org/abs/1506.02142).
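A minimal sketch (assuming PyTorch) of both uses mentioned in the explanation: dropout as a regularizer, and Monte Carlo dropout at test time, where dropout stays active and several stochastic forward passes are averaged to get a mean prediction plus an uncertainty estimate. The tiny network, dropout rate, and number of passes are illustrative.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

model = nn.Sequential(
    nn.Linear(16, 64), nn.ReLU(),
    nn.Dropout(p=0.5),           # randomly zeroes activations, discouraging co-adaptation
    nn.Linear(64, 1),
)

x = torch.randn(4, 16)

# Monte Carlo dropout: keep dropout "on" at inference by staying in train mode.
model.train()
with torch.no_grad():
    samples = torch.stack([model(x) for _ in range(100)])   # 100 stochastic passes
mean, std = samples.mean(dim=0), samples.std(dim=0)          # prediction and uncertainty
print(mean.squeeze(), std.squeeze())
```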
3.
MULTIPLE CHOICE QUESTION
1 min • 1 pt
Model Ensembling is:
Having multiple instances of the network(s) and averaging their responses together
Having a single instance of the network and passing the input through multiple times, each time altered in a small way
The perfect string quartet
None of the above
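A minimal sketch (assuming PyTorch) of the first option: several independent instances of the same architecture are evaluated and their softmax outputs averaged. The tiny untrained classifiers here merely stand in for fully trained networks.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_model():
    return nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 3))

ensemble = [make_model() for _ in range(5)]   # 5 instances, different initialisations
x = torch.randn(2, 8)

with torch.no_grad():
    probs = torch.stack([m(x).softmax(dim=-1) for m in ensemble])  # shape (5, 2, 3)
    avg_probs = probs.mean(dim=0)                                  # averaged ensemble prediction
print(avg_probs)
```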
4.
MULTIPLE SELECT QUESTION
45 sec • 1 pt
Which of the following activation functions helps with the vanishing gradients problem?
Sigmoid
Tanh
ReLU
SELU
Softmax
Answer explanation
Saturating activation functions such as sigmoid and tanh have gradients in the range 0-1, with large regions of their input space yielding very small gradients. As backpropagation applies the chain rule, repeatedly multiplying partial derivatives layer by layer, the gradients passed back become vanishingly small. This does not occur with ReLUs, for example, as the gradient is either 0 or 1.
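A minimal sketch (assuming PyTorch) of the chained-product effect described above: backpropagating through a stack of sigmoids multiplies many derivatives below 0.25 and the gradient collapses, whereas a ReLU chain passes the gradient through unchanged. The depth of 20 and the input value 0.5 are illustrative.

```python
import torch

def grad_through_chain(activation, depth=20):
    # Apply the same activation `depth` times and measure d(output)/d(input).
    x = torch.tensor(0.5, requires_grad=True)
    y = x
    for _ in range(depth):
        y = activation(y)
    y.backward()
    return x.grad.abs().item()

print("sigmoid:", grad_through_chain(torch.sigmoid))  # vanishingly small (~1e-13)
print("relu:   ", grad_through_chain(torch.relu))     # 1.0: the gradient is preserved
```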
5.
MULTIPLE CHOICE QUESTION
45 sec • 1 pt
True or False. Two 3x3 convolutional layers have the same receptive field as one 5x5 convolutional layer, result in more non-linearities, and require fewer weights.
True
False
Answer explanation
True. Stacking two 3x3 layers gives each output a 5x5 receptive field on the input, inserts two non-linearities instead of one, and needs fewer weights: for C input and output channels, 2 x (3 x 3 x C x C) = 18C^2 parameters versus 5 x 5 x C x C = 25C^2 for a single 5x5 layer.
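A minimal sketch (assuming PyTorch) counting the weights behind that claim; the channel count C = 64 is an illustrative choice.

```python
import torch.nn as nn

C = 64
two_3x3 = nn.Sequential(
    nn.Conv2d(C, C, 3, padding=1, bias=False), nn.ReLU(),   # the extra non-linearity
    nn.Conv2d(C, C, 3, padding=1, bias=False),
)
one_5x5 = nn.Conv2d(C, C, 5, padding=2, bias=False)

count = lambda m: sum(p.numel() for p in m.parameters())
print(count(two_3x3), count(one_5x5))   # 73728 (= 18 * 64^2) vs 102400 (= 25 * 64^2)
```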
6.
MULTIPLE CHOICE QUESTION
45 sec • 1 pt
What causes vanishing gradients?
The Wizard Merlin
Large changes in X cause small changes in Y
Large changes in Y cause small changes in X
ReLU activations 'dying'
Answer explanation
Vanishing gradients occur when the derivative of a function becomes very close to zero, meaning large changes in input (X) cause only small changes in output (Y). This is a problem as backpropagation is done by calculating the derivatives of the error with respect to the weights, so if the derivatives are very small, the parameters will barely change, and the error will remain.
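A minimal sketch (assuming PyTorch) of that statement for a single saturating unit: moving the input of a sigmoid from 2 to 6 is a large change in X but produces only a tiny change in Y, and the corresponding derivatives shrink towards zero. The input values are illustrative.

```python
import torch

x = torch.tensor([2.0, 6.0], requires_grad=True)   # a large change in X (2 -> 6)
y = torch.sigmoid(x)
print(y)             # ~0.881 -> ~0.998: only a tiny change in Y
y.sum().backward()
print(x.grad)        # derivatives ~0.105 and ~0.0025: shrinking towards zero
```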
7.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
True or False. SELUs are more likely to 'die' compared to ReLUs.
True
False
Answer explanation
ReLUs can 'die' because when inactive (input below 0) they yield a gradient of 0, so no learning signal propagates through the deactivated unit and the weights feeding it are not updated. SELUs combat this problem because their gradient is non-zero even for negative inputs, so they always pass back a learning signal.
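A minimal sketch (assuming PyTorch) of that contrast: for negative inputs a ReLU outputs 0 with a gradient of 0 (the unit is "dead"), while a SELU still produces a non-zero output and a non-zero gradient, so a learning signal can always flow through it. The input values are illustrative.

```python
import torch
import torch.nn.functional as F

x = torch.tensor([-3.0, -1.0], requires_grad=True)

relu_out = F.relu(x)
relu_out.sum().backward()
print(relu_out, x.grad)        # outputs 0, gradients 0: no learning signal

x.grad = None                  # reset accumulated gradients before the second pass
selu_out = F.selu(x)
selu_out.sum().backward()
print(selu_out, x.grad)        # non-zero outputs and gradients: signal still flows
```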