

Recurrent Neural Networks and Transformer Models
Flashcard • Computers • 9th Grade • Practice Problem • Easy
Abhishek Sharma
11 questions
1.
FLASHCARD QUESTION
Front
What are recurrent neural networks commonly used for?
Back
Sequence modeling and transduction problems such as language modeling and machine translation.
2.
FLASHCARD QUESTION
Front
What are the two types of recurrent neural networks mentioned?
Back
Long short-term memory (LSTM) networks and gated recurrent neural networks (GRUs).
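To make the LSTM card concrete, here is a minimal NumPy sketch of a single LSTM time step (illustrative only — the weight names and toy sizes are assumptions, not a production implementation). The gates decide what the cell state forgets, stores, and exposes.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, params):
    """One LSTM time step. `params` holds weight matrices W*, U* and biases b*."""
    f = sigmoid(params["Wf"] @ x + params["Uf"] @ h + params["bf"])  # forget gate
    i = sigmoid(params["Wi"] @ x + params["Ui"] @ h + params["bi"])  # input gate
    o = sigmoid(params["Wo"] @ x + params["Uo"] @ h + params["bo"])  # output gate
    g = np.tanh(params["Wg"] @ x + params["Ug"] @ h + params["bg"])  # candidate
    c_new = f * c + i * g          # keep some old state, add some new
    h_new = o * np.tanh(c_new)     # expose part of the cell state
    return h_new, c_new

# toy sizes: 3-dim input, 4-dim hidden state (hypothetical values)
rng = np.random.default_rng(0)
d_in, d_h = 3, 4
params = {}
for gate in "figo":
    params["W" + gate] = rng.standard_normal((d_h, d_in))
    params["U" + gate] = rng.standard_normal((d_h, d_h))
    params["b" + gate] = np.zeros(d_h)

h, c = np.zeros(d_h), np.zeros(d_h)
x = rng.standard_normal(d_in)
h, c = lstm_step(x, h, c, params)
```

Note how the step function needs the previous `h` and `c` as inputs — this dependency is exactly why recurrent models must process a sequence one step at a time.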
3.
FLASHCARD QUESTION
Front
What is a key limitation of recurrent models in training?
Back
The inherently sequential nature precludes parallelization within training examples.
4.
FLASHCARD QUESTION
Front
What recent advancements have improved computational efficiency in recurrent models?
Back
Factorization tricks and conditional computation.
5.
FLASHCARD QUESTION
Front
What role do attention mechanisms play in sequence modeling?
Back
They allow modeling of dependencies without regard to their distance in the input or output sequences.
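A minimal dot-product attention sketch (a simplification, not the full multi-head module) shows why distance does not matter: the weight between two positions depends only on how similar their vectors are, never on how far apart they sit in the sequence.

```python
import numpy as np

def attention(Q, K, V):
    """Dot-product attention: weights come from vector similarity,
    not from positional distance."""
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)   # softmax: each row sums to 1
    return w @ V, w

# a query at one end of a 100-token sequence attends directly
# to a matching key at the other end (toy random data)
rng = np.random.default_rng(1)
seq, d = 100, 64
K = V = rng.standard_normal((seq, d))
Q = K[99:100]                  # query identical to the last key
out, w = attention(Q, K, V)    # largest weight lands on position 99
```

In a recurrent model, information from position 99 would have to survive 99 sequential state updates to reach position 0; here it is one step away.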
6.
FLASHCARD QUESTION
Front
What is the main innovation of the Transformer model?
Back
It relies entirely on an attention mechanism to draw global dependencies between input and output, eschewing recurrence.
7.
FLASHCARD QUESTION
Front
How does the Transformer model improve parallelization?
Back
Because it has no recurrent connections, all positions in a sequence can be processed simultaneously rather than step by step, allowing significantly more parallelization.
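The contrast can be sketched in a few lines of NumPy (illustrative toy code, with assumed shapes): the RNN must loop over time steps because each hidden state depends on the previous one, while self-attention computes every position in the same matrix products.

```python
import numpy as np

rng = np.random.default_rng(0)
T, d = 6, 4   # sequence length, model width (toy sizes)

def rnn_forward(X, Wx, Wh):
    """Sequential RNN: step t needs h from step t-1, so the loop is serial."""
    h = np.zeros(Wh.shape[0])
    out = []
    for x in X:                       # inherently one-step-at-a-time
        h = np.tanh(Wx @ x + Wh @ h)
        out.append(h)
    return np.stack(out)

def self_attention(X, Wq, Wk, Wv):
    """Self-attention: all T positions computed at once, no serial dependency."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    s = Q @ K.T / np.sqrt(K.shape[-1])
    w = np.exp(s - s.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ V

X = rng.standard_normal((T, d))
H = rnn_forward(X, rng.standard_normal((d, d)), rng.standard_normal((d, d)))
A = self_attention(X, *(rng.standard_normal((d, d)) for _ in range(3)))
```

Both produce one vector per position, but only the attention version maps onto parallel hardware without a time-step loop.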