Transformer Quiz

12th Grade

10 Qs

Similar activities

10 ARTIFICIAL INTELLIGENCE (10th Grade - University, 10 Qs)

Quiz Video (11th Grade - University, 10 Qs)

Java Diagnostic Assessment (12th Grade, 10 Qs)

Learning Activity 5: Quizzes (10th Grade - University, 10 Qs)

Dasar Koding dan Kecerdasan Artifisial [Coding and Artificial Intelligence Basics] (11th Grade - University, 10 Qs)

Ulangan Tengah Semester Informatika by Jenifer [Informatics Midterm Exam by Jenifer] (12th Grade, 15 Qs)

Reinforcement Learning Quiz (12th Grade, 10 Qs)

Quiz 1 (12th Grade, 11 Qs)

Transformer Quiz

Assessment

Quiz

Information Technology (IT)

12th Grade

Hard

Created by Elakkiya E

10 questions

1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the main problem with Recurrent Neural Networks (RNNs)?

Vanishing or exploding gradients

Easy access to information from long ago

Fast computation for long sequences

Efficient for short sequences
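As background for this question, the vanishing-gradient problem can be shown in a toy sketch (not part of the quiz; the recurrent weight of 0.5 and the 50 time steps are assumed purely for illustration). Backpropagation through an RNN multiplies the gradient by the recurrent weight at every time step, so a weight below 1 shrinks it exponentially:

```python
# Toy illustration of vanishing gradients in an RNN (assumed values).
# Backpropagating through T time steps multiplies the gradient by the
# recurrent weight (times the activation derivative) T times.
w_recurrent = 0.5          # assumed recurrent weight magnitude (< 1)
grad = 1.0                 # gradient arriving at the final time step
for _ in range(50):        # walk 50 time steps back through the sequence
    grad *= w_recurrent    # gradient shrinks at every step
print(grad)                # vanishingly small: early steps barely learn
```

With a weight above 1 the same loop would instead grow the gradient exponentially, which is the exploding-gradient side of the problem.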

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the purpose of positional encoding in the Transformer model?

To capture the position of words in a sentence

To introduce fluctuations in the data

To represent the size of the embedding vector

To relate words to each other
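For context, the sinusoidal positional encoding from the original Transformer paper can be sketched in a few lines (a minimal, illustrative version; the function name and the toy inputs are assumed):

```python
import math

def positional_encoding(pos, d_model):
    """Sinusoidal positional encoding: even dimensions use sin, odd use
    cos, with wavelengths in a geometric progression up to 10000 * 2*pi.
    Each position gets a unique vector that is added to the word
    embedding, so the model can tell word positions apart."""
    pe = []
    for i in range(d_model):
        angle = pos / (10000 ** (2 * (i // 2) / d_model))
        pe.append(math.sin(angle) if i % 2 == 0 else math.cos(angle))
    return pe

# The encoding for position 3 in a toy 8-dimensional embedding space.
vec = positional_encoding(pos=3, d_model=8)
```

Because the encoding depends only on the position, the same vector is added at that position for any sentence, letting the model capture word order.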

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What does Self-Attention allow the model to do?

Relate words to each other

Access information from long time ago

Compute fast for long sequences

Avoid vanishing or exploding gradients
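The mechanism this question refers to, scaled dot-product self-attention, can be sketched in plain Python (a minimal illustration with toy 2-dimensional "word" vectors; the function name and inputs are assumed):

```python
import math

def self_attention(Q, K, V):
    """Scaled dot-product attention: every query scores every key, the
    scores are softmax-normalized, and the result is a weighted mix of
    the values -- so each word relates directly to every other word."""
    d_k = len(K[0])
    # Attention scores: Q @ K^T / sqrt(d_k)
    scores = [[sum(q * k for q, k in zip(qr, kr)) / math.sqrt(d_k)
               for kr in K] for qr in Q]
    # Softmax over each row of scores
    weights = []
    for row in scores:
        m = max(row)
        exps = [math.exp(s - m) for s in row]
        total = sum(exps)
        weights.append([e / total for e in exps])
    # Weighted sum of the value vectors
    return [[sum(w * v[j] for w, v in zip(wr, V))
             for j in range(len(V[0]))] for wr in weights]

# Three toy word vectors attending to each other (Q = K = V).
x = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
out = self_attention(x, x, x)
```

Unlike an RNN, every pair of positions interacts in one step, regardless of how far apart the words are.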

4.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the purpose of layer normalization in the Transformer model?

To relate words to each other

To introduce fluctuations in the data

To make the model causal

To normalize the output of each layer
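Layer normalization itself is a one-formula operation, sketched below for a single output vector (a toy stdlib version; the input values are assumed, and the learned scale/shift parameters of a real implementation are omitted):

```python
import math

def layer_norm(x, eps=1e-5):
    """Layer normalization: rescale one layer's output vector to zero
    mean and unit variance, which stabilizes Transformer training."""
    mean = sum(x) / len(x)
    var = sum((v - mean) ** 2 for v in x) / len(x)
    return [(v - mean) / math.sqrt(var + eps) for v in x]

normed = layer_norm([2.0, 4.0, 6.0, 8.0])
```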

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the goal of Masked Multi-Head Attention in the Transformer model?

To make the model causal

To relate words to each other

To compute fast for long sequences

To introduce fluctuations in the data
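The causal mask behind Masked Multi-Head Attention can be sketched directly (an illustrative helper; the function name is assumed). Future positions get a score of negative infinity so that softmax assigns them zero attention weight:

```python
def causal_mask(seq_len):
    """Causal (look-ahead) mask for masked attention: position i may
    only attend to positions j <= i, so the model cannot peek at
    future tokens. Masked entries are -inf, which softmax turns into
    a weight of exactly zero."""
    neg_inf = float("-inf")
    return [[0.0 if j <= i else neg_inf for j in range(seq_len)]
            for i in range(seq_len)]

mask = causal_mask(4)   # lower-triangular: 0.0 allowed, -inf blocked
```

Adding this mask to the attention scores before the softmax is what makes the decoder causal, i.e. able to generate text one token at a time.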

6.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which task is typically performed by an Encoder-Only Transformer?

Machine translation

Text summarization

Question-answering

Sentiment analysis

7.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the primary use case for a Decoder-Only Transformer?

Anomaly detection

Text classification

Text generation

Chatbots
