GCP Quest

Professional Development

10 Qs


Assessment • Quiz • Architecture • Easy

Created by Nampu Nampu

10 questions

1.

MULTIPLE CHOICE QUESTION

2 mins • 1 pt

Your company has hundreds of user identities in Microsoft Active Directory. Your company needs to retain Active Directory as the source of truth for user identities and authorization, while keeping full control over employees’ Google accounts for all Google services as well as your Google Cloud Platform (GCP) organization. What should you do?

Export the company’s users from the Microsoft Active Directory as a CSV file. Import them into Google Cloud Identity via the Admin Console.

Utilize Google Cloud Directory Sync (GCDS) to synchronize users into Google Cloud Identity.

Write a custom script using the Cloud Identity APIs to synchronize users to Cloud Identity.

Require each employee to set up a Google account using the self-signup process. Mandate that each employee use their corporate email address and password.

2.

MULTIPLE SELECT QUESTION

2 mins • 1 pt

You are developing an application that stores and processes files from thousands of producers. Data security and expiration of obsolete data are your top priorities in building the application. Moreover, the application has to:

1. Provide producers write permissions to data for 30 minutes only.
2. Delete files that are stored for over 45 days.
3. Restrict producers from reading files they don’t own.

The development timeline for the application is short, and you need to ensure that the solution has a low maintenance overhead. What should you do?
Set up an SFTP server on a Compute Engine instance and create user accounts for each producer.
Generate signed URLs to give limited-time access for producers to store objects.
Deploy a Cloud function that triggers a countdown timer of 45 days and deletes the expired objects.
Create a script written in Python that loops through all objects inside a Cloud Storage bucket and deletes objects that are 45 days old.
Create an object lifecycle configuration to delete Cloud Storage objects after 45 days of storage.
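The two Cloud Storage mechanisms the options above lean on can be sketched briefly. Below is a minimal Python sketch: the lifecycle configuration follows the JSON schema that `gsutil lifecycle set` and the Cloud Storage JSON API accept, and the 30-minute write window corresponds to the expiration you would pass when generating a signed URL. Bucket names and the exact signing call are omitted; this only illustrates the shapes involved, not a production setup.

```python
import json
from datetime import timedelta

# Lifecycle configuration deleting objects 45 days after creation, in the
# JSON shape Cloud Storage expects ("rule" list of action/condition pairs).
lifecycle_config = {
    "rule": [
        {
            "action": {"type": "Delete"},
            "condition": {"age": 45},  # age is measured in days
        }
    ]
}

# Signed URLs are inherently time-limited; a 30-minute write window is just
# this expiration passed to the signing call (e.g. blob.generate_signed_url).
SIGNED_URL_TTL = timedelta(minutes=30)

print(json.dumps(lifecycle_config))
print(int(SIGNED_URL_TTL.total_seconds()))
```

Because expiration is enforced server-side by the lifecycle rule and access is scoped per-object by the signed URL, neither requirement needs custom code running anywhere, which is what keeps maintenance overhead low.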

3.

MULTIPLE CHOICE QUESTION

2 mins • 1 pt

Your company strictly observes the best practice of granting least-privilege access to control GCP projects and other resources. Your Site Reliability Engineering (SRE) team recently opened a support case with Google Cloud Support. While working through the case, the SREs need to be able to approve access requests from the Google Cloud Support team. You want to follow Google-recommended practices. What should you do?
Use the predefined roles/iam.roleAdmin role and assign it to the accounts of your SREs.
Use the predefined roles/iam.organizationRoleAdmin role and assign it to the accounts of your SREs.
Create a Google group named sre-group. Use the predefined roles/iam.roleAdmin role and assign it to the newly created group.
Create a Google group named sre-group. Use the predefined roles/accessapproval role and assign it to the newly created group.
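Granting a role to a group rather than to individual SREs boils down to a single IAM policy binding. A minimal sketch of that binding follows; the group address is illustrative, and note that the predefined role that actually approves Access Approval requests is `roles/accessapproval.approver` (the quiz option abbreviates it).

```python
# Hypothetical IAM policy binding in the shape the Resource Manager API uses:
# one role bound to a list of members, here a Google group (address made up).
sre_binding = {
    "role": "roles/accessapproval.approver",
    "members": ["group:sre-group@example.com"],
}

print(sre_binding["role"])
```

Binding the role to a group keeps the least-privilege posture manageable: membership changes in the group, not in the IAM policy itself.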

4.

MULTIPLE CHOICE QUESTION

2 mins • 1 pt

Your company runs hundreds of projects on the Google Cloud Platform. You are tasked with storing the company’s audit log files for three years for compliance purposes. You need to implement a solution to store these audit logs in a cost-effective manner. What should you do?
Develop a custom script written in Python that utilizes the Logging API to duplicate the logs generated by Operations Suite to BigQuery.
On the Logs Router, create a sink with Cloud BigQuery as a destination to save audit logs.
Create a Cloud Storage bucket using a Coldline storage class. Then on the Logs Router, create a sink. Choose Cloud Storage as a sink service and select the bucket you previously created.
Configure all resources to be a publisher on a Cloud Pub/Sub topic and publish all the message logs received from the topic to Cloud SQL to store the logs.
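Routing logs out of Cloud Logging is configured through a sink resource. The sketch below shows the LogSink fields involved when sending audit logs to a Cloud Storage bucket; the sink name, bucket name, and the exact filter string are illustrative assumptions, not taken from the quiz.

```python
# Hypothetical LogSink resource, as the Cloud Logging API (and
# `gcloud logging sinks create`) would see it. Destination for a Cloud
# Storage bucket uses the storage.googleapis.com/<bucket> form.
audit_sink = {
    "name": "audit-logs-to-gcs",
    "destination": "storage.googleapis.com/example-audit-archive",
    # Audit log entries have logName values containing cloudaudit.googleapis.com.
    "filter": 'logName:"cloudaudit.googleapis.com"',
}

print(audit_sink["destination"])
```

Pairing such a sink with a Coldline bucket fits the scenario: writes are frequent but reads are rare, so the cheaper storage class drives the three-year retention cost down.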

5.

MULTIPLE CHOICE QUESTION

2 mins • 1 pt

Your team is planning to move a mission-critical application to GCP. Your team decided on a lift-and-shift migration strategy and will host the application on Google Compute Engine. The application is monolithic and requires a custom number of vCPUs and memory to run efficiently. What should you do?
Launch the VM instance using default settings. Add 2 vCPUs at a time until the application runs smoothly.
Launch two VM instances in separate zones. Enable the Rightsizing Recommendations to resize the virtual machines to the desired number of vCPU and memory.
Utilize the Tau T2D VM to host the application and optimize the workloads.
Select Custom as machine type during instance creation. Configure the desired number of vCPUs and memory.
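Custom machine types follow a simple naming scheme, `custom-<vCPUs>-<memory in MB>`, which is what ends up in `--machine-type` when creating the instance. A small sketch, assuming the general-purpose (N1-style) constraints that memory must be a multiple of 256 MB and the vCPU count must be 1 or an even number; the example sizes are made up.

```python
def custom_machine_type(vcpus: int, memory_mb: int) -> str:
    """Build a Compute Engine custom machine type name (N1-style rules assumed)."""
    if memory_mb % 256 != 0:
        raise ValueError("memory_mb must be a multiple of 256")
    if vcpus != 1 and vcpus % 2 != 0:
        raise ValueError("vcpus must be 1 or an even number")
    return f"custom-{vcpus}-{memory_mb}"

# Illustrative sizing: 6 vCPUs with 20 GB of memory.
print(custom_machine_type(6, 20480))  # custom-6-20480
```

This is why the Custom machine type option fits the scenario: the monolith gets exactly the vCPU/memory shape it needs instead of rounding up to the next predefined type.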

6.

MULTIPLE CHOICE QUESTION

2 mins • 1 pt

You are assigned to set up a solution that stores a large amount of financial data in a cost-effective manner and archives it after 30 days. The data will only be accessed once a year for auditing purposes. As part of compliance objectives, you also have to ensure that the data is stored in a single geographic location. What should you do?
Create a Cloud Storage bucket and set its location to Regional. Configure an object lifecycle rule that transitions the bucket into Nearline Storage after 30 days.
Create a Cloud Storage bucket and set its location to Regional. Configure an object lifecycle rule that transitions the bucket into Coldline Storage after 30 days.
Create a Cloud Storage bucket and set its location to Dual-Region. Configure an object lifecycle rule that transitions the bucket into Nearline Storage after 30 days.
Create a Cloud Storage bucket and set its location to Multi-Regional. Configure an object lifecycle rule that transitions the bucket into Cloud Storage after 30 days.
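The storage-class transition the options describe is again a lifecycle rule, this time with a SetStorageClass action instead of Delete. A minimal sketch in the Cloud Storage lifecycle JSON shape, assuming Coldline as the target class for data read about once a year:

```python
# Lifecycle rule moving objects to Coldline 30 days after creation, in the
# JSON schema Cloud Storage lifecycle configurations use.
coldline_after_30 = {
    "rule": [
        {
            "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
            "condition": {"age": 30},
        }
    ]
}

print(coldline_after_30["rule"][0]["action"]["storageClass"])
```

The single-geographic-location requirement is what the bucket's Regional location satisfies; the lifecycle rule only handles the cost side.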

7.

MULTIPLE CHOICE QUESTION

2 mins • 1 pt

You are asked to deploy a Node.js application in your company’s GCP environment. The application must run every time an object is deleted on a specific Cloud Storage bucket. You want to follow Google-recommended best practices. What should you do?
Deploy your application to Google Kubernetes Engine (GKE). Configure a cron job to trigger the application using Cloud Pub/Sub.
Deploy your code to Google Cloud Functions. Set a Cloud Storage trigger when an object is deleted from your bucket.
Create a batch job with your code by using Cloud Dataflow. Configure the bucket as a data source.
Utilize App Engine and configure Cloud Scheduler to trigger the application using a Pub/Sub subscription.
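An event-driven function wired to the bucket's object-delete event is the piece of plumbing the scenario calls for. The quiz specifies Node.js, but for consistency with the other sketches here is the equivalent shape as a 1st-gen Python background function: the runtime invokes the handler with an event payload carrying the deleted object's bucket and name. The function name and payload values are illustrative.

```python
# Minimal sketch of a background Cloud Function for the
# google.storage.object.delete trigger (1st-gen Python contract: the event
# dict carries the object's "bucket" and "name"; context holds event metadata).
def on_object_deleted(event, context):
    bucket = event.get("bucket")
    name = event.get("name")
    # Real handling of the deletion would go here.
    return f"deleted gs://{bucket}/{name}"

# Local smoke test with a fabricated event payload:
print(on_object_deleted({"bucket": "example-bucket", "name": "report.csv"}, None))
```

The key contrast with the cron-based options is that nothing polls: Cloud Storage pushes the delete event, so the code runs exactly when an object is removed.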
