
PMLE M4
Authored by Mateusz Utracki
Professional Development
9 questions
1.
MULTIPLE CHOICE QUESTION
1 min • 1 pt
You are developing a process for training and running your custom model in production. You need to be able to show lineage for your model and predictions. What should you do?
1. Create a Vertex AI managed dataset. 2. Use a Vertex AI training pipeline to train your model. 3. Generate batch predictions in Vertex AI.
1. Use a Vertex AI Pipelines custom training job component to train your model. 2. Generate predictions by using a Vertex AI Pipelines model batch predict component.
1. Upload your dataset to BigQuery. 2. Use a Vertex AI custom training job to train your model. 3. Generate predictions by using Vertex AI SDK custom prediction routines.
1. Use Vertex AI Experiments to train your model. 2. Register your model in Vertex AI Model Registry. 3. Generate batch predictions in Vertex AI.
2.
MULTIPLE CHOICE QUESTION
1 min • 1 pt
You want to migrate a scikit-learn classifier model to TensorFlow. You plan to train the TensorFlow classifier model using the same training set that was used to train the scikit-learn model, and then compare the performances using a common test set. You want to use the Vertex AI Python SDK to manually log the evaluation metrics of each model and compare them based on their F1 scores and confusion matrices. How should you log the metrics?
Use the aiplatform.log_classification_metrics function to log the F1 score, and use the aiplatform.log_metrics function to log the confusion matrix.
Use the aiplatform.log_classification_metrics function to log the F1 score and the confusion matrix.
Use the aiplatform.log_metrics function to log the F1 score and the confusion matrix.
Use the aiplatform.log_metrics function to log the F1 score, and use the aiplatform.log_classification_metrics function to log the confusion matrix.
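The scenario above can be sketched locally. This minimal pure-Python example (no cloud calls, no scikit-learn dependency; the label names and values are made up for illustration) computes an F1 score and a confusion matrix, the two artifacts the question asks you to log. The Vertex AI SDK calls that would log them in a live experiment run are shown as comments: aiplatform.log_metrics for scalar metrics such as F1, and aiplatform.log_classification_metrics for the confusion matrix.

```python
# Pure-Python sketch of the two metrics to be logged to a Vertex AI
# experiment run. No Google Cloud calls are made here.

def confusion_matrix(y_true, y_pred, labels):
    """Nested list: rows = true label, columns = predicted label."""
    index = {label: i for i, label in enumerate(labels)}
    matrix = [[0] * len(labels) for _ in labels]
    for t, p in zip(y_true, y_pred):
        matrix[index[t]][index[p]] += 1
    return matrix

def f1_score(y_true, y_pred, positive):
    """Binary F1 for the given positive class."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

# Hypothetical labels, for illustration only.
y_true = ["cat", "cat", "dog", "dog", "cat"]
y_pred = ["cat", "dog", "dog", "dog", "cat"]

f1 = f1_score(y_true, y_pred, positive="cat")
matrix = confusion_matrix(y_true, y_pred, labels=["cat", "dog"])

# With an active experiment run, the corresponding Vertex AI SDK calls
# (assuming google-cloud-aiplatform is installed and a run is started) would be:
#   aiplatform.log_metrics({"f1_score": f1})
#   aiplatform.log_classification_metrics(
#       labels=["cat", "dog"], matrix=matrix, display_name="confusion-matrix")
```

The split matters because log_metrics accepts only scalar key-value pairs, while log_classification_metrics accepts structured classification artifacts such as a confusion matrix.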
3.
MULTIPLE CHOICE QUESTION
1 min • 1 pt
You have created a Vertex AI pipeline that automates custom model training. You want to add a pipeline component that enables your team to most easily collaborate when running different executions and comparing metrics both visually and programmatically. What should you do?
Add a component to the Vertex AI pipeline that logs metrics to a BigQuery table. Query the table to compare different executions of the pipeline. Connect BigQuery to Looker Studio to visualize metrics.
Add a component to the Vertex AI pipeline that logs metrics to a BigQuery table. Load the table into a pandas DataFrame to compare different executions of the pipeline. Use Matplotlib to visualize metrics.
Add a component to the Vertex AI pipeline that logs metrics to Vertex ML Metadata. Use Vertex AI Experiments to compare different executions of the pipeline. Use Vertex AI TensorBoard to visualize metrics.
Add a component to the Vertex AI pipeline that logs metrics to Vertex ML Metadata. Load the Vertex ML Metadata into a pandas DataFrame to compare different executions of the pipeline. Use Matplotlib to visualize metrics.
4.
MULTIPLE CHOICE QUESTION
2 mins • 1 pt
You are investigating the root cause of a misclassification error made by one of your models. You used Vertex AI Pipelines to train and deploy the model. The pipeline reads data from BigQuery, creates a copy of the data in Cloud Storage in TFRecord format, trains the model in Vertex AI Training on that copy, and deploys the model to a Vertex AI endpoint. You have identified the specific version of that model that misclassified, and you need to recover the data this model was trained on. How should you find that copy of the data?
Use Vertex AI Feature Store. Modify the pipeline to use the feature store, and ensure that all training data is stored in it. Search the feature store for the data used for the training.
Use the lineage feature of Vertex AI Metadata to find the model artifact. Determine the version of the model, identify the step that creates the data copy, and search the metadata for its location.
Use the logging features in the Vertex AI endpoint to determine the timestamp of the model’s deployment. Find the pipeline run at that timestamp. Identify the step that creates the data copy, and search in the logs for its location.
Find the job ID in Vertex AI Training corresponding to the training for the model. Search in the logs of that job for the data used for the training.
5.
MULTIPLE CHOICE QUESTION
1 min • 1 pt
You are developing a recommendation engine for an online clothing store. The historical customer transaction data is stored in BigQuery and Cloud Storage. You need to perform exploratory data analysis (EDA), preprocessing, and model training. You plan to rerun these EDA, preprocessing, and training steps as you experiment with different types of algorithms. You want to minimize the cost and development effort of running these steps as you experiment. How should you configure the environment?
Create a Vertex AI Workbench user-managed notebook using the default VM instance, and use the %%bigquery magic commands in Jupyter to query the tables.
Create a Vertex AI Workbench managed notebook to browse and query the tables directly from the JupyterLab interface.
Create a Vertex AI Workbench user-managed notebook on a Dataproc Hub, and use the %%bigquery magic commands in Jupyter to query the tables.
Create a Vertex AI Workbench managed notebook on a Dataproc cluster, and use the spark-bigquery-connector to access the data.
6.
MULTIPLE CHOICE QUESTION
1 min • 1 pt
You work for a rapidly growing social media company. Your team builds TensorFlow recommender models in an on-premises CPU cluster. The data contains billions of historical user events and 100,000 categorical features. You notice that as the data increases, the model training time increases. You plan to move the models to Google Cloud. You want to use the most scalable approach that also minimizes training time. What should you do?
Deploy the training jobs by using TPU VMs with TPUv3 Pod slices, and use the TPUEmbedding API
Deploy the training jobs in an autoscaling Google Kubernetes Engine cluster with CPUs
Deploy a matrix factorization model training job by using BigQuery ML
Deploy the training jobs by using Compute Engine instances with A100 GPUs, and use the tf.nn.embedding_lookup API
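To make the embedding options above concrete, here is a minimal pure-Python sketch (no TensorFlow required; the table values are made up) of what an embedding lookup does: each categorical feature ID selects one row of a dense embedding table. tf.nn.embedding_lookup performs exactly this row selection on a tensor, and the TPUEmbedding API does it at scale by sharding very large tables, such as the 100,000-feature tables in the question, across TPU memory.

```python
# Pure-Python illustration of embedding lookup semantics.
# A real recommender would hold a trainable table with millions of rows.

embedding_table = [
    [0.1, 0.2],  # embedding vector for category ID 0
    [0.3, 0.4],  # embedding vector for category ID 1
    [0.5, 0.6],  # embedding vector for category ID 2
]

def embedding_lookup(table, ids):
    """Select one table row per categorical ID, preserving order."""
    return [table[i] for i in ids]

vectors = embedding_lookup(embedding_table, [2, 0, 2])
```

The lookup itself is cheap; the scaling problem is that the table must fit in (or be sharded across) accelerator memory, which is why TPU Pod slices with the TPUEmbedding API appear as an option for very large categorical feature spaces.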
7.
MULTIPLE CHOICE QUESTION
1 min • 1 pt
You need to train an XGBoost model on a small dataset. Your training code requires custom dependencies. You want to minimize the startup time of your training job. How should you set up your Vertex AI custom training job?
Store the data in a Cloud Storage bucket, and create a custom container with your training application. In your training application, read the data from Cloud Storage and train the model.
Use the XGBoost prebuilt custom container. Create a Python source distribution that includes the data and installs the dependencies at runtime. In your training application, load the data into a pandas DataFrame and train the model.
Create a custom container that includes the data. In your training application, load the data into a pandas DataFrame and train the model.
Store the data in a Cloud Storage bucket, and use the XGBoost prebuilt custom container to run your training application. Create a Python source distribution that installs the dependencies at runtime. In your training application, read the data from Cloud Storage and train the model.