ML B2 CH7

University

10 Qs

Similar activities

Unit4: COMBITIONAL CIRCUIT · University · 8 Qs
CS #04 · University · 5 Qs
CONTROL ORGANIZATION AND INSTRUCTION CYCLE · University · 8 Qs
Machine Translation · University · 9 Qs
ILT-ML-05-AS Advanced Technique in Deeplearning with TensorFlow · University · 10 Qs
Flowol · KG - University · 10 Qs
Module 1 Type A · University · 10 Qs
Quiz 1: Intro, Revision and MUX · University · 12 Qs

ML B2 CH7

Assessment · Quiz · Computers · University · Easy

Created by

Jhonston Benjumea


10 questions

1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What does an LSTM-based language model generate?
Probability of word translations
Probability distribution of the next word
Fixed-length sentence embeddings
Correct grammar for sentences

Answer explanation

The LSTM model generates a probability distribution for the next word in a sentence.
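
As an illustrative sketch (not the chapter's actual model), the final step of such a model can be mimicked with a softmax over made-up output scores, turning them into a probability distribution over the vocabulary:

```python
import math

def softmax(logits):
    """Turn raw scores into a probability distribution that sums to 1."""
    m = max(logits)                                  # subtract max to stabilize exp()
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical scores for a 4-word vocabulary after reading "the cat"
vocab = ["sat", "ran", "mat", "on"]
probs = softmax([2.0, 1.0, 0.1, 0.5])

print({w: round(p, 3) for w, p in zip(vocab, probs)})
```

The key property is that the outputs are non-negative and sum to 1, so they can be read as "probability of each candidate next word".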

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is a deterministic method in sentence generation?
Randomly choosing any word
Sampling based on probability
Choosing the word with the highest probability
Shuffling the input words

Answer explanation

Deterministic generation selects the word with the highest probability at each step.
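
The contrast between deterministic and stochastic generation can be shown with toy probabilities and standard-library Python only:

```python
import random

vocab = ["sat", "ran", "mat", "on"]
probs = [0.6, 0.2, 0.15, 0.05]          # made-up next-word distribution

def greedy_pick(vocab, probs):
    """Deterministic: always return the highest-probability word."""
    return vocab[max(range(len(probs)), key=lambda i: probs[i])]

def sample_pick(vocab, probs, rng=random):
    """Stochastic: draw a word in proportion to its probability."""
    return rng.choices(vocab, weights=probs, k=1)[0]

print(greedy_pick(vocab, probs))                      # "sat" on every run
print([sample_pick(vocab, probs) for _ in range(5)])  # varies between runs
```

Greedy picking gives the same sentence every time; sampling trades that determinism for variety.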

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What does the seq2seq model consist of?
Multiple attention layers
A Transformer and Decoder
An Encoder and a Decoder
Two separate RNNs without communication

Answer explanation

Seq2seq models consist of an Encoder that processes input and a Decoder that generates output.
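
A deliberately tiny sketch of that Encoder/Decoder split (no real learning: the "RNN" is faked with a mean, and the vectors are invented for illustration):

```python
# Toy seq2seq: the encoder compresses a variable-length input into one
# fixed-length vector h; the decoder reads h and emits output tokens.
EMB = {"I": [1.0, 0.0], "am": [0.0, 1.0], "happy": [1.0, 1.0]}

def encode(tokens):
    """Stand-in for an encoder RNN: mean of word vectors -> fixed-length h."""
    vecs = [EMB[t] for t in tokens]
    return [sum(col) / len(vecs) for col in zip(*vecs)]

def decode(h, steps=2):
    """Stand-in for a decoder RNN: emits one token per step, conditioned on h."""
    return ["yes" if h[0] >= h[1] else "no" for _ in range(steps)]

h = encode(["I", "am", "happy"])
print(h, decode(h))   # h has length 2 regardless of how long the input is
```

The point of the sketch is the interface: the only thing the decoder ever sees from the input is h.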

4.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the purpose of the encoder in seq2seq?
Generate translations
Convert a sequence into a fixed-length vector
Decode output text
Create random data

Answer explanation

The encoder transforms a variable-length input sequence into a fixed-length context vector.

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What layer is used to convert text into vectors in the encoder?
Softmax
ReLU
Embedding Layer
Pooling Layer

Answer explanation

The embedding layer maps discrete word indices into continuous vector space representations.
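
In essence the layer is a lookup table with one trainable row per vocabulary id (toy vocabulary and dimension chosen here for illustration):

```python
import random

random.seed(0)
VOCAB = {"<pad>": 0, "the": 1, "cat": 2, "sat": 3}
DIM = 4  # embedding dimension (toy value)

# One trainable row of DIM floats per word id; randomly initialized here,
# whereas a real embedding layer would learn these values during training.
table = [[random.uniform(-0.1, 0.1) for _ in range(DIM)] for _ in VOCAB]

def embed(word_ids):
    """Map discrete ids to their continuous vectors by row lookup."""
    return [table[i] for i in word_ids]

ids = [VOCAB[w] for w in ["the", "cat", "sat"]]
vectors = embed(ids)
print(len(vectors), len(vectors[0]))  # 3 words, each mapped to a 4-dim vector
```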

6.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the function of the decoder in seq2seq?
Generate input text
Interpret vector h and generate output sequence
Evaluate gradients
Summarize input only

Answer explanation

The decoder uses the vector h from the encoder to generate the output sequence step by step.
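
The step-by-step nature can be shown with an autoregressive loop, where each emitted token is fed back in as the next step's input (the transition rules and tokens here are invented for illustration):

```python
def decoder_step(h, prev_token):
    """Stand-in for one decoder step: next token depends on h and prev_token."""
    rules = {"<bos>": "ich", "ich": "bin", "bin": "froh", "froh": "<eos>"}
    return rules[prev_token]

def generate(h, max_len=10):
    """Start from <bos>, feed each output back in, stop at <eos>."""
    out, token = [], "<bos>"
    while len(out) < max_len:
        token = decoder_step(h, token)   # previous output becomes next input
        if token == "<eos>":
            break
        out.append(token)
    return out

print(generate(h=[0.5, 0.5]))  # ['ich', 'bin', 'froh']
```

A real decoder would replace the rule table with an RNN cell plus a softmax, but the loop structure is the same.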

7.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the purpose of padding in seq2seq?
To compress data for memory
To add noise to training data
To equalize the length of input and output sequences
To remove rare words

Answer explanation

Padding ensures all sequences have the same length for batch processing.
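
One common scheme is right-padding with a reserved id so the whole batch becomes a rectangle (a sketch; real frameworks offer utilities such as Keras's `pad_sequences` for this):

```python
def pad_batch(sequences, pad_id=0):
    """Right-pad every sequence to the length of the longest one."""
    max_len = max(len(s) for s in sequences)
    return [s + [pad_id] * (max_len - len(s)) for s in sequences]

batch = pad_batch([[5, 3], [7, 2, 9, 1], [4]])
print(batch)  # [[5, 3, 0, 0], [7, 2, 9, 1], [4, 0, 0, 0]]
```

During training, positions holding the pad id are typically masked out so they do not contribute to the loss.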
