Transformer Quiz

12th Grade • 10 Qs

Similar activities

TCP/IP and OSI Layer • 12th Grade • 10 Qs

Roblox 1 • 4th Grade - University • 11 Qs

ChatGPT Quiz • 12th Grade • 15 Qs

OSI and TCP/IP Model (2) • 10th Grade - University • 10 Qs

Quizizz: Understanding Unsupervised Learning • 12th Grade • 10 Qs

Y12 AS Level Computational Thinking Skills Quiz • 12th Grade • 12 Qs

Intro to JS: Functions, Scope & Objects • 11th Grade - University • 8 Qs

The Role of IETF & Network Protocols • 10th Grade - University • 15 Qs

Transformer Quiz

Assessment • Quiz • Information Technology (IT) • 12th Grade • Hard

Created by Elakkiya E

10 questions


1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the main problem with Recurrent Neural Networks (RNNs)?

Vanishing or exploding gradients

Easy access to information from long ago

Fast computation for long sequences

Efficient for short sequences
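The gradient problem named in the first option is easy to demonstrate: backpropagation through time multiplies the gradient by the recurrent weight matrix once per time step, so its norm shrinks or grows geometrically with sequence length. A minimal sketch in Python with made-up toy matrices, not a trained model:

    import numpy as np

    # Backpropagation through time multiplies the gradient by the
    # recurrent weight matrix once per step, so its norm changes
    # geometrically: spectral radius < 1 vanishes, > 1 explodes.
    for scale, label in [(0.9, "vanishing"), (1.1, "exploding")]:
        W = scale * np.eye(4)      # toy recurrent weight matrix
        grad = np.ones(4)
        for _ in range(100):       # 100 time steps
            grad = W.T @ grad
        print(f"{label}: |grad| after 100 steps = {np.linalg.norm(grad):.2e}")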

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the purpose of positional encoding in the Transformer model?

To capture the position of words in a sentence

To introduce fluctuations in the data

To represent the size of the embedding vector

To relate words to each other
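The sinusoidal scheme from the original Transformer paper ("Attention Is All You Need") gives each position a unique pattern of sines and cosines that is added to the word embeddings, so the otherwise order-blind attention layers can recover word order. A minimal sketch, with sizes chosen arbitrarily for illustration:

    import numpy as np

    def positional_encoding(seq_len, d_model):
        # Each position gets a distinct sine/cosine pattern; adding it
        # to the embeddings injects word-order information.
        pos = np.arange(seq_len)[:, None]            # (seq_len, 1)
        i = np.arange(0, d_model, 2)[None, :]        # even dimensions
        angles = pos / np.power(10000.0, i / d_model)
        pe = np.zeros((seq_len, d_model))
        pe[:, 0::2] = np.sin(angles)
        pe[:, 1::2] = np.cos(angles)
        return pe

    print(positional_encoding(seq_len=4, d_model=8).round(2))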

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What does Self-Attention allow the model to do?

Relate words to each other

Access information from long ago

Compute quickly for long sequences

Avoid vanishing or exploding gradients
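Self-attention scores every token's query against every token's key, so each word can weigh (relate to) all the others in a single step, regardless of distance. A minimal scaled dot-product sketch with random toy weights:

    import numpy as np

    def self_attention(X, Wq, Wk, Wv):
        # Each token's query is scored against every token's key;
        # the softmax weights then mix the value vectors, letting
        # every word attend to every other word.
        Q, K, V = X @ Wq, X @ Wk, X @ Wv
        scores = Q @ K.T / np.sqrt(K.shape[-1])
        w = np.exp(scores - scores.max(axis=-1, keepdims=True))
        w /= w.sum(axis=-1, keepdims=True)           # softmax over keys
        return w @ V

    rng = np.random.default_rng(0)
    X = rng.normal(size=(5, 8))                      # 5 tokens, toy embeddings
    Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
    print(self_attention(X, Wq, Wk, Wv).shape)       # (5, 8)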

4.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the purpose of layer normalization in the Transformer model?

To relate words to each other

To introduce fluctuations in the data

To make the model causal

To normalize the output of each layer
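Layer normalization rescales each token's feature vector to zero mean and unit variance before passing it on, keeping activations in a stable range through the stack. A minimal sketch; the learned gain and bias parameters are omitted for brevity:

    import numpy as np

    def layer_norm(x, eps=1e-5):
        # Normalize each row (token) to zero mean and unit variance,
        # stabilizing what each layer hands to the next.
        mean = x.mean(axis=-1, keepdims=True)
        var = x.var(axis=-1, keepdims=True)
        return (x - mean) / np.sqrt(var + eps)

    x = np.array([[1.0, 2.0, 300.0], [0.1, 0.2, 0.3]])
    print(layer_norm(x).round(3))   # each row: mean ~0, variance ~1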

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the goal of Masked Multi-Head Attention in the Transformer model?

To make the model causal

To relate words to each other

To compute quickly for long sequences

To introduce fluctuations in the data
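Masking is what makes the decoder causal: attention scores for future positions are set to negative infinity, so after the softmax a token can attend only to itself and to earlier tokens. A minimal sketch:

    import numpy as np

    # Set scores above the diagonal (future positions) to -inf; the
    # softmax then assigns them zero weight, making the model causal.
    seq_len = 4
    scores = np.zeros((seq_len, seq_len))            # toy attention scores
    future = np.triu(np.ones((seq_len, seq_len), dtype=bool), k=1)
    scores[future] = -np.inf
    weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)
    print(weights.round(2))   # lower-triangular: no attention to the future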

6.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which task is typically performed by an Encoder Only Transformer?

Machine translation

Text summarization

Question-answering

Sentiment analysis
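Encoder-only models such as BERT read the whole input bidirectionally, which suits whole-sequence labeling tasks like sentiment analysis. A sketch using the Hugging Face transformers library, assuming it is installed; the pipeline's default model is downloaded on first use:

    from transformers import pipeline

    # Encoder-only models (BERT-style) see the full sentence at once,
    # which fits classification tasks such as sentiment analysis.
    classifier = pipeline("sentiment-analysis")
    print(classifier("Transformers made long-range context tractable."))
    # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]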

7.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the primary use case for a Decoder Only Transformer?

Anomaly detection

Text classification

Text generation

Chatbots
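Decoder-only models such as GPT-2 predict each next token from the tokens generated so far, the autoregressive setting behind text generation (and, by extension, chatbots). A sketch, again assuming the Hugging Face transformers library is available:

    from transformers import pipeline

    # Decoder-only models (GPT-style) generate text autoregressively:
    # each new token is predicted from all the tokens before it.
    generator = pipeline("text-generation", model="gpt2")
    out = generator("The Transformer architecture", max_new_tokens=20)
    print(out[0]["generated_text"])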
