NLP_6_7



Assessment

Quiz

Medium

Created by

Hazem Abdelazim

Used 14+ times

11 questions

1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which of the following is NOT a type of word embedding technique?

a) Word2Vec
b) GloVe
c) Bag of Words
d) Skip-gram

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the purpose of the Continuous Bag of Words (CBoW) model?

a) To predict the context words given a target word
b) To predict the target word given a set of context words
c) To generate new text based on a given set of words
d) To predict the next word given the previous words
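The CBoW/Skip-Gram distinction above can be made concrete by looking at how each model builds its training pairs from running text. A minimal sketch (toy sentence and window size are assumptions, not from the quiz):

```python
# Contrast how CBoW and Skip-Gram construct training pairs from raw text.
# CBoW predicts the target word from its context; Skip-Gram predicts each
# context word from the target. Window size of 1 is an assumed toy value.
tokens = ["the", "cat", "sat", "on", "the", "mat"]
window = 1

cbow_pairs = []       # (context words, target word)
skipgram_pairs = []   # (target word, context word)

for i, target in enumerate(tokens):
    # Collect the words within `window` positions of the target.
    context = [tokens[j]
               for j in range(max(0, i - window), min(len(tokens), i + window + 1))
               if j != i]
    cbow_pairs.append((context, target))       # one CBoW example per target
    for c in context:
        skipgram_pairs.append((target, c))     # one Skip-Gram example per context word

print(cbow_pairs[1])       # (['the', 'sat'], 'cat')
print(skipgram_pairs[0])   # ('the', 'cat')
```

The same window produces one CBoW example per position but several Skip-Gram examples, which is one reason Skip-Gram tends to do better on rare words.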

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the difference between static embeddings and contextualized embeddings?

a) Static embeddings are trained on a specific task, while contextualized embeddings are trained on a general task
b) Static embeddings are the same as a binary bag of words
c) Static embeddings are fixed for all instances of a word, while contextualized embeddings vary depending on the context of the word
d) Static embeddings are generated using CBoW, while contextualized embeddings are based on skip-grams
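The defining property of static embeddings can be shown with a toy lookup table (the vectors below are made-up illustration values): the same word type maps to the same vector no matter what surrounds it, whereas a contextualized model such as BERT would produce different vectors for the two occurrences.

```python
# A static embedding is a fixed per-word lookup: "bank" gets the identical
# vector in "river bank" and "bank account". Toy vectors are assumptions.
static_table = {
    "bank":    (0.10, 0.90),
    "river":   (0.30, 0.20),
    "account": (0.80, 0.15),
}

def embed(sentence):
    # Static lookup: the surrounding words play no role at all.
    return [static_table[w] for w in sentence if w in static_table]

v_river_bank = embed(["river", "bank"])[1]     # "bank" near "river"
v_bank_account = embed(["bank", "account"])[0] # "bank" near "account"
print(v_river_bank == v_bank_account)          # True: context had no effect
```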

4.

MULTIPLE SELECT QUESTION

45 sec • 1 pt

What is the purpose of word embeddings?

a) To capture the meaning of words in a numeric format
b) To create hand-crafted features for machine learning models
c) To collect specially-designed data for machine learning models
d) To compress the high-dimensional sparse representation of words into a compact form

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the advantage of language models over most other machine learning models?

a) They need hand-crafted features and specially-collected data
b) They can be trained on running text in a self-supervised manner
c) They require a small corpus of text data
d) They are much smarter

6.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What type of loss function is used in both the CBoW and Skip-Gram models?

a) A sigmoid loss
b) They exclusively use a mean squared error loss function
c) The loss functions prioritize syntactic accuracy over semantic meaning
d) They employ softmax functions
e) Loss functions in these models are irrelevant as long as the embeddings are accurate

7.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the dimension of the softmax layer in the CBoW model?

a) One, as this is a binary classification
b) The vocabulary size
c) Equal to the embedding dimension used
d) The same as the dimension of the weight matrix
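The softmax-dimension question can be checked with a minimal CBoW forward pass. This is a sketch with assumed toy sizes and random weights, not a trained model: the output layer must score every vocabulary word as a candidate target, so its dimension is the vocabulary size.

```python
# Minimal CBoW forward pass: average the context embeddings, project them
# through the output weight matrix, and apply softmax over the vocabulary.
# vocab_size, embed_dim, and context_ids are assumed toy values.
import math
import random

vocab_size, embed_dim = 10, 4
random.seed(0)

# Input embeddings E (vocab_size x embed_dim) and output weights W
# (embed_dim x vocab_size), filled with random values for illustration.
E = [[random.gauss(0, 1) for _ in range(embed_dim)] for _ in range(vocab_size)]
W = [[random.gauss(0, 1) for _ in range(vocab_size)] for _ in range(embed_dim)]

context_ids = [2, 5, 7]  # indices of the context words

# Hidden layer: the average of the context word embeddings.
h = [sum(E[i][d] for i in context_ids) / len(context_ids) for d in range(embed_dim)]

# Output layer: one logit per vocabulary word, then softmax.
logits = [sum(h[d] * W[d][v] for d in range(embed_dim)) for v in range(vocab_size)]
z = sum(math.exp(x) for x in logits)
probs = [math.exp(x) / z for x in logits]

print(len(probs))            # 10: the softmax dimension equals |V|
print(round(sum(probs), 6))  # 1.0: a valid probability distribution
```

Because this layer grows with |V|, practical Word2Vec training replaces the full softmax with approximations such as negative sampling or hierarchical softmax.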
