C5M2

University

10 Qs

Similar activities

Math Unit 1 Test · University · 13 Qs

Quiz 1 IoT · University · 10 Qs

Typing · 3rd Grade - University · 15 Qs

As Diagnosis Sistem Komputer · 9th Grade - University · 10 Qs

Quick Quiz Simpeda · University · 10 Qs

Microsoft Word · 4th Grade - University · 15 Qs

Borders and Shades in Word 2010 Quiz · 10th Grade - University · 12 Qs

Gò Nổi HK 2-Tin Học 5 · 5th Grade - University · 14 Qs

C5M2

Assessment

Quiz

Information Technology (IT)

University

Practice Problem

Medium

Created by Abylai Aitzhanuly



10 questions


1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Suppose you learn a word embedding for a vocabulary of 10000 words. Then the embedding vectors should be 10000 dimensional, so as to capture the full range of variation and meaning in those words.

TRUE

FALSE
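The dimensionality claim is easiest to see with shapes. A minimal sketch in Python (the sizes are illustrative assumptions, not from the quiz): embeddings are dense vectors far smaller than the vocabulary, so the embedding matrix is 10000 × 300 rather than 10000 × 10000.

    import numpy as np

    # Illustrative sizes: the embedding dimension is a design choice,
    # typically around 50-1000, and is much smaller than the vocabulary.
    vocab_size = 10000
    embedding_dim = 300

    # One dense row per word in the vocabulary.
    E = np.random.randn(vocab_size, embedding_dim) * 0.01
    print(E.shape)  # (10000, 300), not (10000, 10000)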

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is t-SNE?

A linear transformation that allows us to solve analogies on word vectors

A non-linear dimensionality reduction technique

A supervised learning algorithm for learning word embeddings

An open-source sequence modeling library
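As a sketch of how t-SNE is used in this setting (assuming scikit-learn is available; the embeddings below are random stand-ins for learned vectors): it performs a non-linear reduction from the high-dimensional embedding space down to 2-D points for visualization.

    import numpy as np
    from sklearn.manifold import TSNE

    # Random stand-ins for 1000 learned 300-d word embeddings.
    E = np.random.randn(1000, 300)

    # Non-linear dimensionality reduction: each 300-d vector becomes
    # a 2-d point suitable for a scatter plot of the vocabulary.
    coords = TSNE(n_components=2, perplexity=30, init="pca",
                  random_state=0).fit_transform(E)
    print(coords.shape)  # (1000, 2)

Because the mapping is non-linear, distances in the 2-D plot are only suggestive; analogy arithmetic should be done in the original embedding space.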

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

[Media image: the question statement is an image and was not captured in this export.]

TRUE

FALSE

4.

MULTIPLE SELECT QUESTION

45 sec • 1 pt

Which of these equations do you think should hold for a good word embedding? (Check all that apply)

e_boy - e_girl ≈ e_brother - e_sister

e_boy - e_girl ≈ e_sister - e_brother

e_boy - e_brother ≈ e_girl - e_sister

e_boy - e_brother ≈ e_sister - e_girl
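These relations can be checked numerically via cosine similarity between difference vectors. A sketch (the vectors below are random placeholders, so the printed similarities will be near zero; with trained embeddings the first and third options above would score close to 1):

    import numpy as np

    def cosine(u, v):
        return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

    # Placeholder vectors standing in for learned embeddings.
    e = {w: np.random.randn(50) for w in ["boy", "girl", "brother", "sister"]}

    # In a good embedding, analogous difference vectors are nearly parallel:
    # boy - girl and brother - sister both capture the gender direction;
    # boy - brother and girl - sister both capture the sibling-role direction.
    print(cosine(e["boy"] - e["girl"], e["brother"] - e["sister"]))
    print(cosine(e["boy"] - e["brother"], e["girl"] - e["sister"]))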

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Let E be an embedding matrix, and let o_1234 be the one-hot vector corresponding to word 1234. Then to get the embedding of word 1234, why don’t we call E * o_1234 in Python?

It is computationally wasteful.

The correct formula is E^T * o_1234.

This doesn’t handle unknown words (<UNK>).

None of the above: calling the Python snippet as described above is fine.
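The waste is easy to demonstrate. A sketch following the question's convention that the columns of E are the word vectors (sizes are illustrative): the matrix-vector product performs roughly three million multiply-adds, nearly all of them against zeros, while slicing a column just reads 300 numbers.

    import numpy as np

    embedding_dim, vocab_size = 300, 10000
    E = np.random.randn(embedding_dim, vocab_size)  # columns = word vectors

    # One-hot route: a full matrix-vector product, ~3M multiply-adds,
    # almost all of them multiplying by zero.
    o_1234 = np.zeros(vocab_size)
    o_1234[1234] = 1.0
    slow = E @ o_1234

    # Direct route: just read column 1234.
    fast = E[:, 1234]
    print(np.allclose(slow, fast))  # True: same result, far cheaper

In practice, frameworks expose this as an embedding-lookup layer rather than a literal matrix multiply.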

6.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

When learning word embeddings, we create an artificial task of estimating P(target∣context). It is okay if we do poorly on this artificial prediction task; the more important by-product of this task is that we learn a useful set of word embeddings.

TRUE

FALSE
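A sketch of this idea using gensim's word2vec implementation (an assumption: gensim is installed, and the corpus here is a toy stand-in): the model is trained on the artificial nearby-word prediction task, but the artifact we keep afterwards is model.wv, the learned vectors; the prediction head itself is discarded.

    from gensim.models import Word2Vec

    # Toy corpus; a real run would use millions of sentences.
    sentences = [
        ["the", "glass", "of", "orange", "juice"],
        ["the", "glass", "of", "apple", "juice"],
    ]

    # Train on the artificial task of predicting nearby words
    # (sg=1 selects skip-gram). How well it predicts is not the point.
    model = Word2Vec(sentences, vector_size=50, window=2,
                     min_count=1, sg=1, epochs=100)

    # The useful by-product: the learned embedding for each word.
    print(model.wv["orange"].shape)  # (50,)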

7.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

In the word2vec algorithm, you estimate P(t∣c), where t is the target word and c is a context word. How are t and c chosen from the training set? Pick the best answer.

c is a sequence of several words immediately before t.

c is the one word that comes immediately before t.

c and t are chosen to be nearby words.

c is the sequence of all the words in the sentence before t.
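A minimal sketch of the sampling rule the correct answer describes (illustrative code, not the reference word2vec implementation): for each target word t, the context word c is drawn uniformly from the words within a fixed window around it.

    import random

    def sample_pairs(tokens, window=5):
        """For each target word t, draw a context word c from within
        +/- window positions of t, word2vec-style."""
        pairs = []
        for t_idx in range(len(tokens)):
            lo = max(0, t_idx - window)
            hi = min(len(tokens), t_idx + window + 1)
            candidates = [i for i in range(lo, hi) if i != t_idx]
            c_idx = random.choice(candidates)
            pairs.append((tokens[c_idx], tokens[t_idx]))
        return pairs

    tokens = "the quick brown fox jumps over the lazy dog".split()
    print(sample_pairs(tokens, window=2))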
