
Training Data Exam
Quiz • Instructional Technology • Professional Development • Practice Problem • Hard
Stefy MZ
99 questions
1.
MULTIPLE CHOICE QUESTION
30 sec • 5 pts
Your company built a TensorFlow neural-network model with a large number of neurons and layers. The model fits the training data well, but it performs poorly when tested against new data. What method can you employ to address this?
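The scenario describes overfitting: the model fits the training data well but generalizes poorly. A commonly cited remedy is dropout regularization; a minimal Keras sketch follows (the layer sizes, input shape, and 0.5 dropout rate are illustrative assumptions, not values given in the question):

```python
import tensorflow as tf

# Hypothetical architecture: sizes and rates are illustrative only.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(512, activation="relu", input_shape=(100,)),
    tf.keras.layers.Dropout(0.5),   # randomly zero 50% of activations during training
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```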
2.
MULTIPLE CHOICE QUESTION
30 sec • 5 pts
An external customer provides you with a daily dump of data from their database. The data flows into Google Cloud Storage (GCS) as comma-separated values (CSV) files. You want to analyze this data in Google BigQuery, but the data could contain rows that are formatted incorrectly or corrupted. How should you build this pipeline?
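The pipeline question hinges on tolerating malformed rows. One possible approach is a BigQuery load job that allows a bounded number of bad records; another is a Dataflow step that validates rows before writing to BigQuery. A minimal sketch of the load-job option, with hypothetical bucket and table names:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical GCS path and destination table.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
    max_bad_records=500,   # tolerate up to 500 malformed rows per load
)

load_job = client.load_table_from_uri(
    "gs://customer-dumps/daily/*.csv",
    "my_project.analytics.daily_dump",
    job_config=job_config,
)
load_job.result()  # wait for the load to complete
```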
3.
MULTIPLE CHOICE QUESTION
30 sec • 5 pts
Your weather app queries a database every 15 minutes to get the current temperature. The frontend is powered by Google App Engine and serves millions of users. How should you design the frontend to respond to a database failure?
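The design question is how the frontend should behave when the database is unavailable. A commonly discussed pattern is to serve the last cached reading and retry with exponential backoff; a minimal sketch (the fetch_from_db callable is a hypothetical stand-in for the real query):

```python
import time

_cached_temp = None   # last successfully fetched reading
_cached_at = 0.0


def get_temperature(fetch_from_db, max_retries=3):
    """Return a fresh reading if possible, otherwise the cached one."""
    global _cached_temp, _cached_at
    delay = 1.0
    for _attempt in range(max_retries):
        try:
            _cached_temp = fetch_from_db()
            _cached_at = time.time()
            return _cached_temp
        except Exception:
            time.sleep(delay)   # exponential backoff between retries
            delay *= 2
    return _cached_temp         # fall back to the last known value
```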
4.
MULTIPLE CHOICE QUESTION
30 sec • 5 pts
You are building a new real-time data warehouse for your company and will use Google BigQuery streaming inserts. There is no guarantee that data will be sent only once, but you do have a unique ID for each row of data and an event timestamp. You want to ensure that duplicates are not included while interactively querying data. Which query type should you use?
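Because each row carries a unique ID and an event timestamp, duplicates can be filtered out at query time. A minimal sketch using ROW_NUMBER() to keep one row per ID, run through the BigQuery Python client (table and column names are hypothetical):

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical table and columns: unique_id, event_timestamp.
query = """
SELECT * EXCEPT(row_num)
FROM (
  SELECT
    *,
    ROW_NUMBER() OVER (PARTITION BY unique_id
                       ORDER BY event_timestamp DESC) AS row_num
  FROM `my_project.warehouse.events`
)
WHERE row_num = 1
"""

for row in client.query(query).result():
    print(row)
```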
5.
MULTIPLE CHOICE QUESTION
30 sec • 5 pts
You are designing a basket abandonment system for an ecommerce company. The system will send a message to a user based on these rules:
• No interaction by the user on the site for 1 hour
• Has added more than $30 worth of products to the basket
• Has not completed a transaction
You use Google Cloud Dataflow to process the data and decide if a message should be sent. How should you design the pipeline?
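The one-hour inactivity rule maps naturally onto session windowing in Dataflow. A minimal Apache Beam sketch, assuming events arrive on a Pub/Sub topic (the topic name, event fields, and rule check are illustrative assumptions):

```python
import json

import apache_beam as beam
from apache_beam import window
from apache_beam.options.pipeline_options import PipelineOptions


def has_abandoned_basket(user_events):
    """Hypothetical rule check: basket over $30 and no completed transaction."""
    _user_id, events = user_events
    events = list(events)
    total = sum(e.get("basket_value", 0) for e in events)
    purchased = any(e.get("type") == "transaction" for e in events)
    return total > 30 and not purchased


with beam.Pipeline(options=PipelineOptions(streaming=True)) as p:
    _ = (
        p
        | "ReadEvents" >> beam.io.ReadFromPubSub(topic="projects/my-project/topics/site-events")
        | "Parse" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "KeyByUser" >> beam.Map(lambda e: (e["user_id"], e))
        # The "no interaction for 1 hour" rule becomes the session gap.
        | "SessionWindow" >> beam.WindowInto(window.Sessions(gap_size=60 * 60))
        | "GroupPerUser" >> beam.GroupByKey()
        | "Abandoned" >> beam.Filter(has_abandoned_basket)
    )
```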
6.
MULTIPLE CHOICE QUESTION
30 sec • 5 pts
Your company is migrating their 30-node Apache Hadoop cluster to the cloud. They want to reuse Hadoop jobs they have already created and minimize management of the cluster as much as possible. They also want to be able to persist data beyond the life of the cluster. What should you do?
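Reusing existing Hadoop jobs while keeping data beyond the cluster's life points toward submitting the existing job JAR to Dataproc with input and output kept on Cloud Storage. A minimal sketch with hypothetical project, cluster, bucket, and JAR names:

```python
from google.cloud import dataproc_v1

# Keeping the job's input and output on Cloud Storage (gs:// paths)
# lets the data outlive the Dataproc cluster itself.
region = "us-central1"
client = dataproc_v1.JobControllerClient(
    client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
)

job = {
    "placement": {"cluster_name": "migrated-hadoop-cluster"},
    "hadoop_job": {
        "main_jar_file_uri": "gs://my-bucket/jobs/existing-hadoop-job.jar",
        "args": ["gs://my-bucket/input/", "gs://my-bucket/output/"],
    },
}

result = client.submit_job(
    request={"project_id": "my-project", "region": region, "job": job}
)
print(f"Submitted job {result.reference.job_id}")
```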
7.
MULTIPLE CHOICE QUESTION
30 sec • 5 pts
Your company's on-premises Apache Hadoop servers are approaching end-of-life, and IT has decided to migrate the cluster to Google Cloud Dataproc. A like-for-like migration of the cluster would require 50 TB of Google Persistent Disk per node. The CIO is concerned about the cost of using that much block storage. You want to minimize the storage cost of the migration. What should you do?