Data Engineer 288-297

12th Grade

10 Qs

Similar activities

Vertex AI Pipelines V1 • 12th Grade • 10 Qs

SQL • 12th Grade • 15 Qs

BTEC unit 2 databases - key terms • 12th Grade • 15 Qs

RDBMS & SQL QUERIES • 12th Grade • 15 Qs

Data 268-277 • 12th Grade • 10 Qs

Data Engineering y BigQuery V1 • 12th Grade • 10 Qs

MLOps V1 • 12th Grade • 10 Qs

Data 221-230 • 12th Grade • 10 Qs

Data Engineer 288-297

Assessment • Quiz • Computers • 12th Grade • Medium

Created by Academia Google

Used 3+ times

10 questions


1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

[Scenario shown as an image]

What should you do?

[All four answer options are shown as images]

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

[Scenario shown as an image]

What should you do?

Set up a Kafka Connect bridge between Kafka and Pub/Sub. Use a Google-provided Dataflow template to read the data from Pub/Sub, and write the data to BigQuery.

Use a proxy host in the VPC in Google Cloud connecting to Kafka. Write a Dataflow pipeline, read data from the proxy host, and write the data to BigQuery.

Use Dataflow, write a pipeline that reads the data from Kafka, and writes the data to BigQuery.

Set up a Kafka Connect bridge between Kafka and Pub/Sub. Write a Dataflow pipeline, read the data from Pub/Sub, and write the data to BigQuery.
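For context on the pipeline pattern these options describe (not the answer key), below is a minimal Apache Beam (Python) sketch of a Dataflow job that reads directly from Kafka and writes to BigQuery. The broker address, topic, payload schema, and table name are hypothetical, and ReadFromKafka is a cross-language transform that needs a Java expansion service available at launch time.

```python
import json

import apache_beam as beam
from apache_beam.io.kafka import ReadFromKafka
from apache_beam.options.pipeline_options import PipelineOptions

# Streaming pipeline: Kafka -> decode -> BigQuery.
options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        # Hypothetical broker and topic; network reachability from the
        # Dataflow workers to the Kafka cluster is assumed.
        | "ReadKafka" >> ReadFromKafka(
            consumer_config={"bootstrap.servers": "kafka-broker:9092"},
            topics=["events"],
        )
        # ReadFromKafka emits (key, value) byte pairs; assume JSON values.
        | "Decode" >> beam.Map(lambda kv: json.loads(kv[1].decode("utf-8")))
        | "WriteBQ" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",  # hypothetical table
            schema="event_id:STRING,ts:TIMESTAMP",
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```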

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

You are designing the architecture to process your data from Cloud Storage to BigQuery by using Dataflow. The network team provided you with the Shared VPC network and subnetwork to be used by your pipelines. You need to enable the deployment of the pipeline on the Shared VPC network. What should you do?

Assign the compute.networkUser role to the Dataflow service agent.

Assign the compute.networkUser role to the service account that executes the Dataflow pipeline.

Assign the dataflow.admin role to the Dataflow service agent.

Assign the dataflow.admin role to the service account that executes the Dataflow pipeline.
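Whichever role assignment is correct, the deployment mechanics look roughly like the sketch below: a Beam pipeline launched on Dataflow with the full URL of the Shared VPC subnetwork, which only succeeds once the compute.networkUser grant is in place on that subnetwork. Project, bucket, subnetwork, and table names are hypothetical.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    runner="DataflowRunner",
    project="service-project-id",         # hypothetical service project
    region="us-central1",
    temp_location="gs://my-bucket/temp",  # hypothetical staging bucket
    # Full URL of the Shared VPC subnetwork owned by the host project. The
    # identity running the workers needs roles/compute.networkUser on this
    # subnetwork (or on the host project) for the launch to succeed.
    subnetwork=(
        "https://www.googleapis.com/compute/v1/projects/host-project-id/"
        "regions/us-central1/subnetworks/shared-subnet"
    ),
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/input/*.csv")
        | "ToRow" >> beam.Map(lambda line: {"raw": line})
        | "Write" >> beam.io.WriteToBigQuery(
            "service-project-id:analytics.raw_sales",  # hypothetical table
            schema="raw:STRING",
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```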

4.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

[Scenario shown as an image]

What should you do?

Create a new topic, and publish the last 30 days of data each time a new subscriber connects to an existing topic.

Set the topic retention policy to 30 days.

Set the subscriber retention policy to 30 days.

Ask the source system to re-push the data to Pub/Sub, and subscribe to it.
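To make the retention options concrete, here is a hedged sketch using the google-cloud-pubsub client: a topic created with 30-day message retention, and a subscription seeked back in time to replay history. All resource names are hypothetical.

```python
import datetime

from google.cloud import pubsub_v1
from google.protobuf import duration_pb2, timestamp_pb2

publisher = pubsub_v1.PublisherClient()
subscriber = pubsub_v1.SubscriberClient()

topic_path = publisher.topic_path("my-project", "orders")  # hypothetical
sub_path = subscriber.subscription_path("my-project", "orders-replay")

# Topic-level retention keeps messages, acknowledged or not, for the window.
publisher.create_topic(
    request={
        "name": topic_path,
        "message_retention_duration": duration_pb2.Duration(
            seconds=30 * 24 * 60 * 60  # 30 days
        ),
    }
)

subscriber.create_subscription(request={"name": sub_path, "topic": topic_path})

# A new subscriber replays history by seeking to a timestamp in the window.
ts = timestamp_pb2.Timestamp()
ts.FromDatetime(
    datetime.datetime.now(datetime.timezone.utc) - datetime.timedelta(days=30)
)
subscriber.seek(request={"subscription": sub_path, "time": ts})
```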

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Your organization is modernizing their IT services and migrating to Google Cloud. You need to organize the data that will be stored in Cloud Storage and BigQuery. You need to enable a data mesh approach to share the data between sales, product design, and marketing departments. What should you do?

[All four answer options are shown as images]
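Since the answer options are images, only a generic sketch of one common data mesh building block on Google Cloud is offered here, assuming the google-cloud-dataplex client library: a Dataplex lake per department, under which each department's Cloud Storage buckets and BigQuery datasets can later be attached as assets. Project, location, and lake names are hypothetical.

```python
from google.cloud import dataplex_v1

client = dataplex_v1.DataplexServiceClient()
parent = "projects/my-project/locations/us-central1"  # hypothetical

# One lake per data domain (department); zones and assets for that
# department's Cloud Storage and BigQuery data attach under its lake.
for dept in ("sales", "product-design", "marketing"):
    operation = client.create_lake(
        parent=parent,
        lake_id=dept,
        lake=dataplex_v1.Lake(display_name=dept.replace("-", " ").title()),
    )
    print("Created lake:", operation.result().name)
```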

6.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

[Scenario shown as an image]

What should you do?

Create BigQuery connections to both Cloud SQL databases. Use BigQuery federated queries on the two databases and the Google Analytics data on BigQuery to run these queries.

Create a job on Apache Spark with Dataproc Serverless to query both Cloud SQL databases and the Google Analytics data on BigQuery for these queries.

Create streams in Datastream to replicate the required tables from both Cloud SQL databases to BigQuery for these queries.

Create a Dataproc cluster with Trino to establish connections to both Cloud SQL databases and BigQuery, and execute the queries there.
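For context on the federated-query option: BigQuery can push a query down to Cloud SQL through a pre-created connection using the EXTERNAL_QUERY function, and join the live result with native tables. A minimal sketch with the google-cloud-bigquery client follows; the connection ID, datasets, and columns are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project

# Joins Google Analytics data already in BigQuery with a query that
# EXTERNAL_QUERY pushes down to Cloud SQL over a BigQuery connection.
sql = """
SELECT ga.user_id, o.order_total
FROM `my-project.analytics.ga_sessions` AS ga
JOIN EXTERNAL_QUERY(
  'my-project.us.orders-mysql-conn',
  'SELECT user_id, order_total FROM orders'
) AS o
ON ga.user_id = o.user_id
"""

for row in client.query(sql).result():
    print(row.user_id, row.order_total)
```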

7.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

You designed a data warehouse in BigQuery to analyze sales data. You want a self-serving, low-maintenance, and cost-effective solution to share the sales dataset with other business units in your organization. What should you do?

Create an Analytics Hub private exchange, and publish the sales dataset.

Enable the other business units’ projects to access the authorized views of the sales dataset.

Create and share views with the users in the other business units.

Use the BigQuery Data Transfer Service to create a schedule that copies the sales dataset to the other business units’ projects.
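Several of these options lean on BigQuery's sharing primitives. As one concrete illustration (not the answer key), here is a sketch of the authorized-view mechanism using the google-cloud-bigquery client; project, dataset, and view names are hypothetical. Analytics Hub exchanges and listings are managed through a separate Analytics Hub API.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project

# Authorize a view in a shared dataset to read the underlying sales tables,
# so consumers can query the view without access to the source dataset.
source_dataset = client.get_dataset("my-project.sales")

view_entry = bigquery.AccessEntry(
    role=None,  # view entries carry no role, only the view reference
    entity_type="view",
    entity_id={
        "projectId": "my-project",
        "datasetId": "shared_views",
        "tableId": "sales_summary",
    },
)
source_dataset.access_entries = (
    list(source_dataset.access_entries) + [view_entry]
)
client.update_dataset(source_dataset, ["access_entries"])
```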
