
C5M1

Authored by Abylai Aitzhanuly

Information Technology (IT)

University



10 questions


1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Suppose your training examples are sentences (sequences of words). Which of the following refers to the j-th word of the i-th training example?

x^{(i)<j>}

x^{<i>(j)}

x^{(j)<i>}

x^{<j>(i)}
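The indexing convention above can be sketched in code. This is a hypothetical illustration (the example sentences and the helper name `word` are made up): the superscript (i) selects the training example and <j> selects the position within it, both 1-indexed in the course notation but 0-indexed in Python.

```python
# x^{(i)<j>} = the j-th word of the i-th training example.
training_examples = [
    ["the", "cat", "sat"],       # example i = 1
    ["dogs", "bark", "loudly"],  # example i = 2
]

def word(i, j):
    """Return x^{(i)<j>}: word j of example i (both 1-indexed)."""
    return training_examples[i - 1][j - 1]

print(word(2, 3))  # word 3 of example 2 -> "loudly"
```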

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Media Image

T_x = T_y

T_x < T_y

T_x > T_y

T_x = 1

3.

MULTIPLE SELECT QUESTION

45 sec • 1 pt

Media Image

Speech recognition (input an audio clip and output a transcript)

Sentiment classification (input a piece of text and output a 0/1 to denote positive or negative sentiment)

Image classification (input an image and output a label)

Gender recognition from speech (input an audio clip and output a label indicating the speaker’s gender)

4.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Media Image

Estimating P(y<1>, y<2>, ..., y<t-1>)

Estimating P(y<1>)

  • Estimating P(y<t> | y<1>, y<2>, ..., y<t-1>)

Estimating P(y<t> | y<1>, y<2>, ..., y<t>)

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Media Image

(i) Use the probabilities output by the RNN to pick the highest probability word for that time-step as y<t>. (ii) Then pass the ground-truth word from the training set to the next time-step.

(i) Use the probabilities output by the RNN to randomly sample a chosen word for that time-step as y<t>. (ii) Then pass the ground-truth word from the training set to the next time-step.

(i) Use the probabilities output by the RNN to pick the highest probability word for that time-step as y<t>. (ii) Then pass this selected word to the next time-step.

(i) Use the probabilities output by the RNN to randomly sample a chosen word for that time-step as y<t>. (ii) Then pass this selected word to the next time-step.
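The contrast between the options above (sampling vs. greedy selection, and what gets fed to the next time-step) can be sketched as follows. This is a minimal illustration with a made-up vocabulary and made-up softmax probabilities, not the course's implementation:

```python
import numpy as np

# One time-step of sequence generation from an RNN's softmax output.
rng = np.random.default_rng(0)
vocab = ["<eos>", "the", "cat", "sat"]
probs = np.array([0.1, 0.4, 0.3, 0.2])  # hypothetical softmax output

# Random sampling (gives varied, novel sequences):
sampled = vocab[rng.choice(len(vocab), p=probs)]

# Greedy pick, for contrast (always the highest-probability word):
greedy = vocab[int(np.argmax(probs))]

# In generation, the chosen word itself is passed to the next
# time-step as its input (there is no ground truth at test time).
print(sampled, greedy)
```

Passing the sampled word forward (rather than a ground-truth word, which only exists during training) is what lets the model generate a novel sequence.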

6.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

You are training an RNN, and find that your weights and activations are all taking on the value of NaN (“Not a Number”). Which of these is the most likely cause of this problem?

Vanishing gradient problem

  • Exploding gradient problem

  • ReLU activation function g(.) used to compute g(z), where z is too large

Sigmoid activation function g(.) used to compute g(z), where z is too large
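The standard remedy for exploding gradients is gradient clipping. Below is a minimal sketch (the gradient name `dWax`, the threshold, and the values are all made up): each gradient is rescaled so its L2 norm never exceeds a fixed maximum, which prevents a single huge update from blowing the weights up to NaN.

```python
import numpy as np

def clip_gradients(grads, max_norm=5.0):
    """Rescale each gradient so its L2 norm is at most max_norm."""
    clipped = {}
    for name, g in grads.items():
        norm = np.linalg.norm(g)
        clipped[name] = g * (max_norm / norm) if norm > max_norm else g
    return clipped

grads = {"dWax": np.full((3, 3), 100.0)}  # L2 norm = 300, far above 5
clipped = clip_gradients(grads)
print(np.linalg.norm(clipped["dWax"]))    # ~5.0
```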

7.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Suppose you are training an LSTM. You have a 10,000-word vocabulary, and are using an LSTM with 100-dimensional activations a<t>. What is the dimension of Γu at each time step?

1

100

300

10000
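The shape of the update gate can be checked directly. A minimal sketch, assuming the usual gate formula Γu = σ(Wu [a<t-1>, x<t>] + bu) with zero-initialized placeholders (the weight shapes are implied by the dimensions in the question, not given explicitly):

```python
import numpy as np

n_a, n_x = 100, 10000            # hidden units, vocabulary size
a_prev = np.zeros((n_a, 1))      # a<t-1>
x_t = np.zeros((n_x, 1))         # x<t> (one-hot word, here all zeros)
Wu = np.zeros((n_a, n_a + n_x))  # gate weights
bu = np.zeros((n_a, 1))          # gate bias

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
gamma_u = sigmoid(Wu @ np.vstack([a_prev, x_t]) + bu)
print(gamma_u.shape)  # (100, 1)
```

Each element of Γu gates one unit of the 100-dimensional memory cell, so the gate has the same dimension as the activation, not the vocabulary.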
