
NLP_6_7

Authored by Hazem Abdelazim


Used 16+ times


11 questions


1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which of the following is NOT a type of word embedding technique?

a) Word2Vec
b) GloVe
c) Bag of Words
d) Skip-gram
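To illustrate why Bag of Words is the odd one out here: it is a count-based representation, not a learned embedding. A toy sketch (corpus and words are made up for illustration):

```python
# Bag of Words: each document becomes a sparse vector of word counts over
# the vocabulary. No vectors are *learned*, unlike Word2Vec/GloVe/Skip-gram.
corpus = ["the cat sat", "the dog sat"]
vocab = sorted({w for doc in corpus for w in doc.split()})

def bow_vector(doc):
    words = doc.split()
    return [words.count(w) for w in vocab]

print(vocab)                       # ['cat', 'dog', 'sat', 'the']
print(bow_vector("the cat sat"))   # [1, 0, 1, 1]
```

The count vector says nothing about word similarity; embedding techniques instead train dense vectors so that related words end up close together.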

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the purpose of the Continuous Bag of Words (CBoW) model?

a) To predict the context words given a target word
b) To predict the target word given a set of context words
c) To generate new text based on a given set of words
d) To predict the next word given previous words
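The CBoW/Skip-gram distinction comes down to how training pairs are cut from text. A minimal sketch of pair extraction with a window of 1 (the sentence is an arbitrary example):

```python
# CBoW predicts the target word from its surrounding context words;
# Skip-gram does the reverse: predicts each context word from the target.
sentence = "the quick brown fox".split()
window = 1

cbow_pairs = []       # (context words, target)
skipgram_pairs = []   # (target, one context word)
for i, target in enumerate(sentence):
    lo, hi = max(0, i - window), min(len(sentence), i + window + 1)
    context = [sentence[j] for j in range(lo, hi) if j != i]
    cbow_pairs.append((context, target))
    for c in context:
        skipgram_pairs.append((target, c))

print(cbow_pairs[1])   # (['the', 'brown'], 'quick')
```

The same window yields both kinds of pairs; only the direction of prediction changes.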

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the difference between static embeddings and contextualized embeddings?

a) Static embeddings are trained on a specific task, while contextualized embeddings are trained on a general task
b) Static embeddings are the same as a binary bag of words
c) Static embeddings are fixed for all instances of a word, while contextualized embeddings vary depending on the context of the word
d) Static embeddings are generated using CBoW, while contextualized embeddings are based on skip-grams
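The "fixed for all instances" point can be seen directly: a static embedding is just a table lookup, so a polysemous word like "bank" gets one vector no matter the sentence. A toy demo with made-up 2-d vectors:

```python
# A static embedding table maps each word to one fixed vector. A contextual
# model (e.g. BERT) would instead produce different vectors for the two
# occurrences of "bank" below. Vectors here are arbitrary illustrations.
static_emb = {"bank": [0.1, 0.7], "river": [0.3, 0.2], "money": [0.9, 0.4]}

v1 = static_emb["bank"]   # "we sat on the river bank"
v2 = static_emb["bank"]   # "the bank approved the loan"
print(v1 == v2)           # True: static embeddings ignore context
```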

4.

MULTIPLE SELECT QUESTION

45 sec • 1 pt

What is the purpose of word embeddings?

a) To capture the meaning of words in a numeric format
b) To create hand-crafted features for machine learning models
c) To collect specially-designed data for machine learning models
d) To compress a high-dimensional sparse representation of words into a compact form
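The compression aspect is easy to quantify: a one-hot word vector is vocabulary-sized and almost entirely zeros, while an embedding replaces it with a short dense vector. A sketch with illustrative sizes (50,000-word vocabulary, 300-d embeddings — assumptions, not figures from the quiz):

```python
# One-hot: vocabulary-sized, sparse (a single 1). An embedding layer maps
# that index to a dense embed_dim-sized vector instead.
vocab_size, embed_dim = 50_000, 300

def one_hot(index, size):
    v = [0] * size
    v[index] = 1
    return v

sparse = one_hot(42, vocab_size)
print(len(sparse), sum(sparse))        # 50000 1
print(vocab_size / embed_dim)          # ~167x fewer dimensions when dense
```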

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the advantage of language models over most other machine learning models?

a) They need hand-crafted features and specially-collected data
b) They can be trained on running text in a self-supervised manner
c) They require a small corpus of text data
d) They are much smarter
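"Self-supervised" means the labels come for free from the text itself: each prefix of a sentence predicts the next word, so no human annotation is needed. A minimal sketch (the sample sentence is arbitrary):

```python
# Turn raw running text into (input, label) training pairs with no
# hand-labelling: every prefix predicts the word that follows it.
text = "to be or not to be".split()
pairs = [(text[:i], text[i]) for i in range(1, len(text))]

print(pairs[1])   # (['to', 'be'], 'or')
```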

6.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What type of loss function is used in both the CBoW and Skip-gram models?

a) Sigmoid
b) They exclusively use a mean squared error loss function.
c) The loss functions prioritize syntactic accuracy over semantic meaning.
d) They employ softmax functions
e) Loss functions in these models are irrelevant as long as the embeddings are accurate.
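Both models score every vocabulary word and normalize the scores into a probability distribution with a softmax, training against the true word with cross-entropy. A small sketch of the softmax itself (scores are arbitrary):

```python
import math

# Softmax turns raw scores into probabilities that sum to 1; subtracting
# the max first is the standard trick for numerical stability.
def softmax(scores):
    shifted = [s - max(scores) for s in scores]
    exps = [math.exp(s) for s in shifted]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])
print(round(sum(probs), 6))   # 1.0: a valid probability distribution
```

In practice, full-vocabulary softmax is expensive, which is why word2vec implementations often approximate it (e.g. negative sampling or hierarchical softmax).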

7.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the dimension of the softmax layer in the CBoW model?

a) One, as this is a binary classification
b) The vocabulary size
c) Equal to the embedding dimension used
d) The same as the dimension of the weight matrix
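To make the shapes concrete: the hidden (embedding) layer has `embed_dim` units, but the softmax output must assign a probability to every word, so it has one unit per vocabulary entry. A shape-only sketch with illustrative sizes:

```python
# CBoW weight shapes (sizes are illustrative assumptions):
# input embedding matrix is V x d; the output projection is d x V,
# so the softmax layer's dimension equals the vocabulary size V.
vocab_size, embed_dim = 10_000, 100

input_weights = (vocab_size, embed_dim)    # word index -> dense vector
output_weights = (embed_dim, vocab_size)   # dense vector -> vocab scores
softmax_dim = output_weights[1]

print(softmax_dim == vocab_size)   # True
```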
