NLP_6_7

11 Qs

Similar activities

THURSDAY: Language Review • 1st - 5th Grade • 10 Qs

9. Paraphrasing 2 • KG - University • 13 Qs

NLP8 • KG - University • 8 Qs

Final Exam Review 2 • 6th Grade • 10 Qs

Poetry Terms • KG - University • 10 Qs

STAAR Smart Week 2 • 7th Grade • 16 Qs

G5 U2 W1 Spelling Test • KG - University • 8 Qs

GREEK & LATIN ROOTS, PREFIXES & SUFFIXES LESSON 5 Quiz • KG - University • 14 Qs

NLP_6_7

Assessment • Quiz • others • Medium

Created by Hazem Abdelazim

Used 14+ times

FREE Resource

11 questions


1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which of the following is NOT a type of word embedding technique?
a) Word2Vec
b) GloVe
c) Bag of Words
d) Skip-gram
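
For reference, Bag of Words is the odd one out here: it produces sparse per-document count vectors, while Word2Vec, GloVe, and Skip-gram learn a dense vector per word. A minimal sketch with made-up toy data (the random weights below stand in for learned embeddings):

```python
import numpy as np

# Bag of Words: one count vector per *document*, dimension = vocab size.
corpus = ["the cat sat", "the dog sat"]
vocab = sorted({w for doc in corpus for w in doc.split()})  # ['cat','dog','sat','the']
bow = np.array([[doc.split().count(w) for w in vocab] for doc in corpus])
print(bow)  # [[1 0 1 1], [0 1 1 1]] -- sparse counts, no learned semantics

# Word embedding: one dense vector per *word*, dimension chosen freely
# (random stand-ins here for what Word2Vec/GloVe would actually learn).
rng = np.random.default_rng(0)
embeddings = {w: rng.normal(size=8) for w in vocab}
print(embeddings["cat"].shape)  # (8,) -- dense and low-dimensional
```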

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the purpose of the Continuous Bag of Words (CBoW) model?
a) To predict the context words given a target word
b) To predict the target word given a set of context words
c) To generate new text based on a given set of words
d) To predict the next word given the previous words
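
CBoW predicts the target word from its surrounding context (option b). A minimal NumPy sketch of the forward pass, assuming a toy vocabulary and randomly initialized weights:

```python
import numpy as np

rng = np.random.default_rng(0)
V, d = 10, 4                      # toy vocab size and embedding dimension
E = rng.normal(size=(V, d))       # input embedding matrix (V x d)
W = rng.normal(size=(d, V))       # output projection (d x V)

def cbow_forward(context_ids):
    """Score every vocabulary word as the target for this context."""
    h = E[context_ids].mean(axis=0)        # average the context embeddings
    logits = h @ W                         # project to vocabulary space
    exp = np.exp(logits - logits.max())    # numerically stable softmax
    return exp / exp.sum()                 # P(target word | context)

probs = cbow_forward([2, 5, 7, 1])         # ids of the surrounding context words
print(probs.argmax(), probs.sum())         # predicted target id; probs sum to 1
```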

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the difference between static embeddings and contextualized embeddings?
a) Static embeddings are trained on a specific task, while contextualized embeddings are trained on a general task
b) Static embeddings are the same as binary bag of words
c) Static embeddings are fixed for all instances of a word, while contextualized embeddings vary depending on the context of the word
d) Static embeddings are generated using CBoW, while contextualized embeddings are based on skip-grams
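
A toy illustration of the distinction in option c). The `contextual` function below is a crude hypothetical stand-in for a BERT-style encoder, not a real one; the point is only that a static table ignores the sentence, while a contextualizer does not:

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = ["the", "river", "bank", "deposit", "money", "at"]
static = {w: rng.normal(size=4) for w in vocab}   # one fixed vector per word type

s1 = "the river bank".split()
s2 = "deposit money at the bank".split()

# Static lookup never sees the sentence, so "bank" maps to the same
# vector in both s1 and s2.
v1 = v2 = static["bank"]

def contextual(sentence, word):
    """Toy contextualizer: blend the word's static vector with the sentence
    average (a crude stand-in for self-attention in BERT-style models)."""
    sent_avg = np.mean([static[w] for w in sentence], axis=0)
    return 0.5 * static[word] + 0.5 * sent_avg

# Contextualized: the same word type gets a different vector per context.
print(np.allclose(contextual(s1, "bank"), contextual(s2, "bank")))  # False
```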

4.

MULTIPLE SELECT QUESTION

45 sec • 1 pt

What is the purpose of word embeddings?
a) To capture the meaning of words in a numeric format
b) To create hand-crafted features for machine learning models
c) To collect specially-designed data for machine learning models
d) To compress the high-dimensional sparse representation of words into a compact form
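
A small sketch of the compression idea in option d), with assumed toy sizes: a one-hot vector of vocabulary length collapses to a compact dense vector, because multiplying by a one-hot is just a row lookup in the embedding matrix (random weights here as a stand-in for learned ones):

```python
import numpy as np

V, d = 5_000, 100                  # assumed vocab size and embedding dimension
rng = np.random.default_rng(0)
E = rng.normal(size=(V, d))        # embedding matrix (learned, in practice)

word_id = 1234
one_hot = np.zeros(V)
one_hot[word_id] = 1.0             # sparse: 5,000 dims, a single nonzero entry

dense = one_hot @ E                # multiplying by a one-hot selects a row...
assert np.allclose(dense, E[word_id])    # ...so an embedding lookup is equivalent
print(one_hot.shape, "->", dense.shape)  # (5000,) -> (100,)
```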

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the advantage of language models over most other machine learning models?
a) They need hand-crafted features and specially-collected data
b) They can be trained on running text in a self-supervised manner
c) They require a small corpus of text data
d) They are much smarter
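
What "self-supervised on running text" (option b) means in practice: the training pairs fall out of raw word order, with no manual labels. A sketch, using a hypothetical `skipgram_pairs` helper for illustration:

```python
def skipgram_pairs(text, window=2):
    """Derive (center, context) training pairs from raw running text --
    the 'labels' come for free from word order, i.e. self-supervision."""
    words = text.split()
    pairs = []
    for i, center in enumerate(words):
        for j in range(max(0, i - window), min(len(words), i + window + 1)):
            if j != i:
                pairs.append((center, words[j]))
    return pairs

print(skipgram_pairs("the quick brown fox jumps")[:4])
# [('the', 'quick'), ('the', 'brown'), ('quick', 'the'), ('quick', 'brown')]
```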

6.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What type of loss function is used in both the CBOW and Skip-Gram models?
a) A sigmoid loss
b) They exclusively use a mean squared error loss function
c) The loss functions prioritize syntactic accuracy over semantic meaning
d) They employ softmax functions
e) Loss functions in these models are irrelevant as long as the embeddings are accurate
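
A minimal sketch of the softmax cross-entropy loss both models use in their basic formulation (production toolkits often approximate it with negative sampling or hierarchical softmax for speed):

```python
import numpy as np

def softmax_cross_entropy(logits, target_id):
    """Softmax over the vocabulary, then negative log-likelihood of the
    true target word -- the basic CBOW/Skip-Gram training objective."""
    logits = logits - logits.max()            # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum())
    return -log_probs[target_id]

logits = np.array([2.0, 0.5, -1.0, 0.1])      # scores over a 4-word toy vocab
print(softmax_cross_entropy(logits, target_id=0))  # low loss: target scored highest
```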

7.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the dimension of the softmax layer in the CBoW model?
a) One, as this is a binary classification
b) The vocabulary size
c) It equals the embedding dimension used
d) The same as the dimension of the weight matrix
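
A shape check, with assumed toy sizes, showing why the softmax layer has one logit per vocabulary word (option b): the output projection maps the averaged context vector from the embedding dimension back up to the vocabulary size.

```python
import numpy as np

V, d = 10_000, 128                 # assumed vocab size and embedding dimension
rng = np.random.default_rng(0)
E = rng.normal(size=(V, d))        # input embeddings
W_out = rng.normal(size=(d, V))    # output (softmax) layer weights

h = E[[3, 17, 42]].mean(axis=0)    # averaged context vector, shape (d,)
logits = h @ W_out                 # one score per vocabulary word
print(logits.shape)                # (10000,) -- softmax dimension = V
```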
