C1M4

Quiz • Information Technology (IT) • University • Medium
Abylai Aitzhanuly
1.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
What is the "cache" used for in our implementation of forward propagation and backward propagation?
It is used to cache the intermediate values of the cost function during training.
We use it to pass variables computed during forward propagation to the corresponding backward propagation step. It contains useful values for backward propagation to compute derivatives.
It is used to keep track of the hyperparameters that we are searching over, to speed up computation.
We use it to pass variables computed during backward propagation to the corresponding forward propagation step. It contains useful values for forward propagation to compute activations.
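To illustrate the second option (the correct one), here is a minimal sketch of a single layer's linear step: the forward function stashes its inputs in a cache, and the matching backward function reads them to compute derivatives. The names linear_forward and linear_backward are illustrative, not the course's graded code:

import numpy as np

def linear_forward(A_prev, W, b):
    # Compute Z and stash the inputs in a cache: backward propagation
    # needs A_prev and W to form dW, db, and dA_prev.
    Z = W @ A_prev + b
    cache = (A_prev, W, b)
    return Z, cache

def linear_backward(dZ, cache):
    # Unpack the values saved during the forward pass.
    A_prev, W, b = cache
    m = A_prev.shape[1]
    dW = (dZ @ A_prev.T) / m
    db = np.sum(dZ, axis=1, keepdims=True) / m
    dA_prev = W.T @ dZ
    return dA_prev, dW, db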
2.
MULTIPLE SELECT QUESTION
45 sec • 1 pt
Among the following, which ones are "hyperparameters"? (Check all that apply.)
learning rate α
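For context on why the learning rate is a hyperparameter: it is chosen before training and steers the updates, but is never itself learned, whereas W and b are parameters that the updates change. A hedged sketch (the function name and the "dW1"/"db1" gradient keys are assumptions, not the course's exact API):

def update_parameters(parameters, grads, learning_rate):
    # learning_rate is a hyperparameter: it controls the step size.
    # The entries of parameters (W and b) are what training updates.
    L = len(parameters) // 2
    for l in range(1, L + 1):
        parameters["W" + str(l)] -= learning_rate * grads["dW" + str(l)]
        parameters["b" + str(l)] -= learning_rate * grads["db" + str(l)]
    return parameters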
3.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
Which of the following statements is true?
The deeper layers of a neural network are typically computing more complex features of the input than the earlier layers.
The earlier layers of a neural network are typically computing more complex features of the input than the deeper layers.
4.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
Vectorization allows you to compute forward propagation in an L-layer neural network without an explicit for-loop (or any other explicit iterative loop) over the layers l = 1, 2, …, L. True/False?
TRUE
FALSE
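The statement is false: vectorization removes explicit loops over the training examples within a layer, but layer l's input is layer l−1's output, so the layers must still be visited in order with an explicit loop. A minimal sketch, assuming ReLU activations and a parameters dictionary shaped as in question 5:

import numpy as np

def L_model_forward(X, parameters):
    # Each matrix product is vectorized over all examples at once,
    # but the layers are computed sequentially, so the for-loop stays.
    A = X
    L = len(parameters) // 2
    for l in range(1, L + 1):
        Z = parameters["W" + str(l)] @ A + parameters["b" + str(l)]
        A = np.maximum(0, Z)  # ReLU (assumption; the output layer may differ)
    return A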
5.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
Assume we store the values for n^[l] in an array called layer_dims, as follows: layer_dims = [n_x, 4, 3, 2, 1]. So layer 1 has 4 hidden units, layer 2 has 3 hidden units, and so on. Which of the following for-loops will allow you to initialize the parameters for the model?
for i in range(1, len(layer_dims)):
    parameters['W' + str(i)] = np.random.randn(layer_dims[i], layer_dims[i - 1]) * 0.01
    parameters['b' + str(i)] = np.random.randn(layer_dims[i], 1)
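A self-contained version of that loop, with the import and dictionary it assumes, might look like this (a sketch, not the graded solution; initializing b to zeros is also common, but random values are kept here to match the option above):

import numpy as np

def initialize_parameters(layer_dims):
    # layer_dims[0] is n_x (the input size); the remaining entries are
    # the layer widths, e.g. [n_x, 4, 3, 2, 1].
    parameters = {}
    for i in range(1, len(layer_dims)):
        parameters['W' + str(i)] = np.random.randn(layer_dims[i], layer_dims[i - 1]) * 0.01
        parameters['b' + str(i)] = np.random.randn(layer_dims[i], 1)
    return parameters

params = initialize_parameters([5, 4, 3, 2, 1])
print(params['W1'].shape)  # (4, 5)
print(params['b3'].shape)  # (2, 1)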
6.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
The number of layers L is 4. The number of hidden layers is 2.
7.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
During forward propagation, in the forward function for a layer l you need to know what the activation function for that layer is (sigmoid, tanh, ReLU, etc.). During backpropagation, the corresponding backward function also needs to know what the activation function for layer l is, since the gradient depends on it. True/False?
TRUE
FALSE
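The statement is true: the backward function must apply the derivative of the same activation used in the forward pass, since dZ = dA * g'(Z). A hedged sketch, assuming an activation string is threaded through both passes (the function names are illustrative):

import numpy as np

def activation_forward(Z, activation):
    if activation == "relu":
        return np.maximum(0, Z)
    if activation == "sigmoid":
        return 1 / (1 + np.exp(-Z))
    raise ValueError("unknown activation: " + activation)

def activation_backward(dA, Z, activation):
    # The derivative g'(Z) depends on which activation g was applied
    # forward, so the backward function must be told which one it was.
    if activation == "relu":
        return dA * (Z > 0)
    if activation == "sigmoid":
        s = 1 / (1 + np.exp(-Z))
        return dA * s * (1 - s)
    raise ValueError("unknown activation: " + activation)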