
C1M4

Assessment • Quiz

Information Technology (IT) • University • Medium

Created by Abylai Aitzhanuly

10 questions

1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the "cache" used for in our implementation of forward propagation and backward propagation?

It is used to cache the intermediate values of the cost function during training.

We use it to pass variables computed during forward propagation to the corresponding backward propagation step. It contains useful values for backward propagation to compute derivatives.

It is used to keep track of the hyperparameters that we are searching over, to speed up computation.

We use it to pass variables computed during backward propagation to the corresponding forward propagation step. It contains useful values for forward propagation to compute activations.
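
For context, here is a minimal sketch of the cache pattern this question refers to (the helper names and layout are illustrative assumptions, not the assignment's actual code): the forward step stores the values its matching backward step will need to compute derivatives.

import numpy as np

def linear_forward(A_prev, W, b):
    # Compute the pre-activation Z and stash the inputs backprop will need later.
    Z = W @ A_prev + b
    cache = (A_prev, W, b)
    return Z, cache

def linear_backward(dZ, cache):
    # Reuse the cached forward values to compute this layer's derivatives.
    A_prev, W, b = cache
    m = A_prev.shape[1]
    dW = (dZ @ A_prev.T) / m
    db = np.sum(dZ, axis=1, keepdims=True) / m
    dA_prev = W.T @ dZ
    return dA_prev, dW, db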

2.

MULTIPLE SELECT QUESTION

45 sec • 1 pt

Among the following, which ones are "hyperparameters"? (Check all that apply.)

• learning rate α

[The remaining answer options were shown as images and are not recoverable as text.]

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which of the following statements is true?

The deeper layers of a neural network are typically computing more complex features of the input than the earlier layers.

The earlier layers of a neural network are typically computing more complex features of the input than the deeper layers.

4.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Vectorization allows you to compute forward propagation in an L-layer neural network without an explicit for-loop (or any other explicit iterative loop) over the layers l = 1, 2, …, L. True/False?

TRUE

FALSE
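
As a point of reference, here is a sketch of a typical vectorized forward pass (the parameters dictionary layout is an assumption for illustration): the matrix products handle all m training examples at once, while the layers themselves are still visited with an explicit loop.

import numpy as np

def forward_pass(X, parameters, L):
    # X has shape (n_x, m): no loop over examples is needed,
    # but the layers l = 1..L are still traversed one by one.
    A = X
    for l in range(1, L + 1):
        W = parameters['W' + str(l)]
        b = parameters['b' + str(l)]
        Z = W @ A + b
        # tanh for hidden layers, sigmoid for the output layer (an illustrative choice)
        A = np.tanh(Z) if l < L else 1 / (1 + np.exp(-Z))
    return A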

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Assume we store the values for n^[l] in an array called layer_dims, as follows: layer_dims = [n_x, 4, 3, 2, 1]. So layer 1 has 4 hidden units, layer 2 has 3 hidden units, and so on. Which of the following for-loops will allow you to initialize the parameters for the model?

[Three answer options were shown as images; the fourth option, in text, follows.]

for i in range(1, len(layer_dims)):
    parameter['W' + str(i)] = np.random.randn(layer_dims[i], layer_dims[i - 1]) * 0.01
    parameter['b' + str(i)] = np.random.randn(layer_dims[i], 1)
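
As a sanity check, a runnable version of that initialization pattern is shown below (n_x = 5 is an assumed input size used only for illustration), together with the shapes it produces.

import numpy as np

n_x = 5  # assumed number of input features, for illustration only
layer_dims = [n_x, 4, 3, 2, 1]
parameter = {}
for i in range(1, len(layer_dims)):
    parameter['W' + str(i)] = np.random.randn(layer_dims[i], layer_dims[i - 1]) * 0.01
    parameter['b' + str(i)] = np.random.randn(layer_dims[i], 1)

# Resulting shapes: W1 (4, 5), W2 (3, 4), W3 (2, 3), W4 (1, 2);
# each b^[l] has shape (layer_dims[l], 1).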

6.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

[The question stem and the other answer options were shown as images.]

• The number of layers L is 4. The number of hidden layers is 2.

7.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

During forward propagation, in the forward function for layer l, you need to know what the activation function in that layer is (Sigmoid, tanh, ReLU, etc.). During backpropagation, the corresponding backward function also needs to know what the activation function for layer l is, since the gradient depends on it. True/False?

TRUE

FALSE
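
A minimal sketch of why the backward step cares about the activation type (the helper name and string labels are assumptions, not the assignment's actual API): the gradient dZ = dA * g'(Z) depends on which activation g was applied in the forward pass.

import numpy as np

def activation_backward(dA, Z, activation):
    # dZ = dA * g'(Z); which derivative g' to apply depends on the forward activation.
    if activation == 'relu':
        return dA * (Z > 0)
    if activation == 'sigmoid':
        s = 1 / (1 + np.exp(-Z))
        return dA * s * (1 - s)
    if activation == 'tanh':
        return dA * (1 - np.tanh(Z) ** 2)
    raise ValueError('unknown activation: ' + activation)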
