Recurrent Neural Networks and Transformer Models

Flashcard · Computers · 9th Grade · Easy

Created by Abhishek Sharma

11 questions

1.
Front: What are recurrent neural networks commonly used for?
Back: Sequence modeling and transduction problems such as language modeling and machine translation.

2.
Front: What are the two types of recurrent neural networks mentioned?
Back: Long short-term memory (LSTM) and gated recurrent neural networks.

3.
Front: What is a key limitation of recurrent models in training?
Back: Their inherently sequential nature precludes parallelization within training examples.
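To make the limitation above concrete, here is a minimal NumPy sketch of a plain (Elman-style) RNN forward pass. All names (`rnn_forward`, `W_h`, `W_x`) are illustrative, not from any particular library; the point is that each hidden state depends on the previous one, so the time steps inside one example must run one after another.

```python
import numpy as np

def rnn_forward(x_seq, h0, W_h, W_x):
    """Plain RNN forward pass. The loop is strictly sequential:
    h_t cannot be computed until h_{t-1} is known, which is why
    recurrent models cannot parallelize across time steps."""
    h = h0
    states = []
    for x_t in x_seq:                          # one step at a time
        h = np.tanh(W_h @ h + W_x @ x_t)       # h_t depends on h_{t-1}
        states.append(h)
    return states

# Toy example: 4 time steps, hidden size 3, input size 2.
rng = np.random.default_rng(0)
x_seq = [rng.standard_normal(2) for _ in range(4)]
states = rnn_forward(x_seq, np.zeros(3),
                     rng.standard_normal((3, 3)) * 0.1,
                     rng.standard_normal((3, 2)) * 0.1)
print(len(states))  # one hidden state per time step
```

Because the loop carries `h` forward, longer sequences take proportionally longer to process, no matter how many processors are available.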

4.
Front: What recent advancements have improved computational efficiency in recurrent models?
Back: Factorization tricks and conditional computation.

5.
Front: What role do attention mechanisms play in sequence modeling?
Back: They allow modeling of dependencies without regard to their distance in the input or output sequences.

6.
Front: What is the main innovation of the Transformer model?
Back: It relies entirely on an attention mechanism to draw global dependencies between input and output, eschewing recurrence.
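The attention mechanism the Transformer relies on can be sketched as scaled dot-product attention. This NumPy version is a simplified single-head illustration (no masking, no learned projections): every output position attends to every input position in a single matrix multiply, which is why all positions can be computed in parallel, unlike the recurrent loop.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Single-head scaled dot-product attention sketch.
    Q, K, V: (num_positions, d_k) arrays. One matmul scores all
    query/key pairs at once, so positions are processed in parallel
    and distance between positions plays no special role."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)            # (n_q, n_k) similarities
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V, weights                # weighted sum of values

# Toy example: 5 positions, dimension 4.
rng = np.random.default_rng(1)
Q = rng.standard_normal((5, 4))
K = rng.standard_normal((5, 4))
V = rng.standard_normal((5, 4))
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape)       # (5, 4)
print(w.sum(axis=-1))  # each attention row sums to 1
```

Note that the first and last positions interact through exactly the same dot product as adjacent ones, matching card 5: dependencies are modeled without regard to distance.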

7.
Front: How does the Transformer model improve parallelization?
Back: By not relying on recurrence, it allows for significantly more parallelization.
