ML B2 CH3

University

10 Qs

Similar activities

Bitmap v Vector Images
KG - University • 10 Qs

Natural Language Processing Intro
University • 10 Qs

How Software Works
University • 10 Qs

NLP_Unit 1_Quiz
University • 10 Qs

HCI-U2-L1
University • 10 Qs

10 Questions of Machine Learning
University • 10 Qs

Language Model and Word Embeddings Quiz
University • 10 Qs

Understanding Words and Vectors
University • 10 Qs

ML B2 CH3

Assessment • Quiz • Computers • University • Medium

Created by Jhonston Benjumea • Used 1+ times

10 questions

1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is one key problem of statistics-based NLP techniques?
They require GPU for training
They use real-time prediction
They create huge matrices and require full-batch learning
They ignore text data

Answer explanation

Statistics-based methods create large co-occurrence matrices and need full-batch learning, which is computationally heavy.
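A minimal sketch of why this is heavy: a co-occurrence matrix over even a toy corpus is V x V, so memory grows quadratically with vocabulary size V. The corpus and helper below are made up for illustration.

```python
import numpy as np

def co_occurrence(corpus, window=1):
    """Build a V x V co-occurrence matrix (hypothetical helper).

    Statistics-based methods must hold this entire matrix in memory
    and process it in one full batch, which is what makes them costly.
    """
    vocab = sorted(set(corpus))
    idx = {w: i for i, w in enumerate(vocab)}
    M = np.zeros((len(vocab), len(vocab)), dtype=np.int32)
    for i, word in enumerate(corpus):
        lo, hi = max(0, i - window), min(len(corpus), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                M[idx[word], idx[corpus[j]]] += 1
    return M, idx

corpus = "you say goodbye and i say hello".split()
M, idx = co_occurrence(corpus)
print(M.shape)  # (6, 6) -- grows quadratically with vocabulary size
```

With a realistic vocabulary of, say, 100,000 words the same matrix would have 10 billion entries, which is the scaling problem the question points at.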

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

How do inference-based techniques differ from statistics-based techniques?
They predict scores for unseen programming code
They do not need context words
They use mini-batch learning and GPU acceleration
They require a thesaurus to function

Answer explanation

Inference-based techniques learn patterns using mini-batches, allowing GPU-based parallel training.
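A sketch of the mini-batch loop this refers to, with toy (context, target) id arrays invented for illustration: each update touches only a small slice of the data, which is what makes GPU-parallel training practical.

```python
import numpy as np

# 50 toy training samples: two context word ids per target word id.
contexts = np.arange(100).reshape(50, 2)
targets = np.arange(50)

batch_size = 10
rng = np.random.default_rng(0)
order = rng.permutation(len(targets))  # shuffle once per epoch

n_batches = 0
for start in range(0, len(targets), batch_size):
    batch = order[start:start + batch_size]
    ctx, tgt = contexts[batch], targets[batch]
    # ...one gradient step on (ctx, tgt) would go here...
    n_batches += 1

print(n_batches)  # 5 mini-batches per epoch
```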

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What does a one-hot vector represent in NLP?
A gradient value
A probability distribution
A vector with one element set to 1 and others to 0
A random binary pattern

Answer explanation

One-hot vectors represent words using binary vectors with only one active position.
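A minimal sketch of this encoding; the six-word vocabulary and its word-to-id mapping are made up for illustration.

```python
import numpy as np

vocab = {"you": 0, "say": 1, "goodbye": 2, "and": 3, "i": 4, "hello": 5}

def one_hot(word, vocab):
    v = np.zeros(len(vocab), dtype=np.int32)
    v[vocab[word]] = 1          # exactly one active position
    return v

print(one_hot("say", vocab))    # [0 1 0 0 0 0]
```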

4.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the purpose of the CBOW model in word2vec?
To convert text into audio
To predict the next sentence
To predict a target word from context words
To translate between languages

Answer explanation

CBOW (Continuous Bag of Words) predicts the center (target) word using the surrounding context.
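A sketch of the CBOW forward pass under toy assumptions (random weights, vocabulary size 6, hidden size 3 — all invented here): average the context word vectors, then score every vocabulary word as a candidate target.

```python
import numpy as np

rng = np.random.default_rng(0)
V, H = 6, 3                           # toy vocabulary and hidden sizes
W_in = rng.standard_normal((V, H))    # input-side weights (word vectors)
W_out = rng.standard_normal((H, V))   # output-side weights

def cbow_forward(context_ids):
    # Hidden layer: mean of the context words' input vectors.
    h = W_in[context_ids].mean(axis=0)
    # Output layer: one raw score per vocabulary word.
    return h @ W_out

scores = cbow_forward([0, 2])   # e.g. predict the word between ids 0 and 2
print(scores.shape)             # (6,) -- one score per vocabulary word
```

Training would then push the score of the true target word up relative to the others.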

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What does the output layer in CBOW represent?
Loss values
Weights only
Scores for each word in the vocabulary
Sentence embeddings

Answer explanation

The output layer produces scores for each vocabulary word; softmax is applied to get probabilities.

6.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which function converts output scores into probabilities in CBOW?
ReLU
Sigmoid
Tanh
Softmax

Answer explanation

Softmax converts the output scores into a probability distribution over all vocabulary words.
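The standard softmax can be sketched in a few lines; subtracting the maximum score before exponentiating is the usual numerical-stability trick.

```python
import numpy as np

def softmax(scores):
    e = np.exp(scores - scores.max())  # shift for numerical stability
    return e / e.sum()                 # normalizes so the result sums to 1

p = softmax(np.array([2.0, 1.0, 0.1]))
print(p.sum())  # 1.0 -- a valid probability distribution over the vocabulary
```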

7.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Why are only input-side weights often used in word2vec representation?
They require less memory
They are easier to normalize
They provide a meaningful vector representation of word semantics
They include grammatical rules

Answer explanation

Input-side weights in word2vec reflect word meanings and are commonly used in vector space models.
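A sketch of this convention, with random stand-in weights (a trained model would supply real values): row i of the input-side matrix W_in is taken as word i's embedding, and similarity between words is measured between those rows.

```python
import numpy as np

rng = np.random.default_rng(0)
V, H = 6, 3                          # toy vocabulary and hidden sizes
W_in = rng.standard_normal((V, H))   # stand-in for trained input-side weights

vocab = {"you": 0, "say": 1, "goodbye": 2, "and": 3, "i": 4, "hello": 5}
# Row i of W_in serves as the distributed representation of word i.
word_vec = {w: W_in[i] for w, i in vocab.items()}

def cosine(a, b):
    # Cosine similarity between two word vectors.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

sim = cosine(word_vec["say"], word_vec["hello"])
print(round(sim, 3))  # arbitrary here, since the weights are random
```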
