crackyourinterview.com


Google Professional Cloud Architect Exam Page 2 (Dumps)


Question No:-11

Your customer is moving an existing corporate application to Google Cloud Platform from an on-premises data center. The business owners require minimal user disruption. There are strict security team requirements for storing passwords.

What authentication strategy should they use?


  1. Use G Suite Password Sync to replicate passwords into Google
  2. Federate authentication via SAML 2.0 to the existing Identity Provider
  3. Provision users in Google using the Google Cloud Directory Sync tool
  4. Ask users to set their Google password to match their corporate password

 






Question No:-12

Your company has successfully migrated to the cloud and wants to analyze their data stream to optimize operations. They do not have any existing code for this analysis, so they are exploring all their options. These options include a mix of batch and stream processing, as they are running some hourly jobs and live-processing some data as it comes in.

Which technology should they use for this?


  1. Google Cloud Dataproc
  2. Google Cloud Dataflow
  3. Google Container Engine with Bigtable
  4. Google Compute Engine with Google BigQuery

 





Question No:-13

Your customer is receiving reports that their recently updated Google App Engine application is taking approximately 30 seconds to load for some of their users.

This behavior was not reported before the update.

What strategy should you take?


  1. Work with your ISP to diagnose the problem
  2. Open a support ticket to ask for network capture and flow data to diagnose the problem, then roll back your application
  3. Roll back to an earlier known good release initially, then use Stackdriver Trace and Logging to diagnose the problem in a development/test/staging environment
  4. Roll back to an earlier known good release, then push the release again at a quieter period to investigate. Then use Stackdriver Trace and Logging to diagnose the problem
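
If the rollback in option 3 is taken, it can be done by shifting traffic back to the last known good version of the App Engine service. A minimal sketch, assuming a service named default with a previous good version named v1 (both names are placeholders):

```shell
# List deployed versions to identify the last known good release
gcloud app versions list --service=default

# Route 100% of traffic back to the known good version (placeholder name "v1")
gcloud app services set-traffic default --splits=v1=1
```

Diagnosis with Stackdriver Trace and Logging then proceeds against a staging deployment of the problematic version, without further user impact.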

 









Question No:-14

A production database virtual machine on Google Compute Engine has an ext4-formatted persistent disk for data files. The database is about to run out of storage space.

How can you remediate the problem with the least amount of downtime?


  1. In the Cloud Platform Console, increase the size of the persistent disk and use the resize2fs command in Linux.
  2. Shut down the virtual machine, use the Cloud Platform Console to increase the persistent disk size, then restart the virtual machine
  3. In the Cloud Platform Console, increase the size of the persistent disk and verify the new space is ready to use with the fdisk command in Linux
  4. In the Cloud Platform Console, create a new persistent disk attached to the virtual machine, format and mount it, and configure the database service to move the files to the new disk
  5. In the Cloud Platform Console, create a snapshot of the persistent disk, restore the snapshot to a new, larger disk, unmount the old disk, mount the new disk, and restart the database service
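
Option 1 can be performed with no downtime on modern Linux guests: a persistent disk can be resized while it is attached, and resize2fs grows the ext4 filesystem online. A minimal sketch, assuming a disk named data-disk attached as /dev/sdb with no partition table (the disk name, zone, size, and device path are placeholders):

```shell
# Grow the persistent disk while the VM keeps running
gcloud compute disks resize data-disk --size=500GB --zone=us-central1-a

# On the VM: grow the ext4 filesystem to fill the enlarged disk, online
sudo resize2fs /dev/sdb
```

If the filesystem sits on a partition rather than the raw disk, the partition must be grown first (for example with growpart) before running resize2fs.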


 





Question No:-15

Your application needs to process credit card transactions. You want the smallest scope of Payment Card Industry (PCI) compliance without compromising the ability to analyze transactional data and trends relating to which payment methods are used.

How should you design your architecture?


  1. Create a tokenizer service and store only tokenized data
  2. Create separate projects that only process credit card data
  3. Create separate subnetworks and isolate the components that process credit card data
  4. Streamline the audit discovery phase by labeling all of the virtual machines (VMs) that process PCI data
  5. Enable Logging export to Google BigQuery and use ACLs and views to scope the data shared with the auditor

 





Question No:-16

You have been asked to select the storage system for the click-data of your company's large portfolio of websites. This data is streamed in from a custom website analytics package at a typical rate of 6,000 clicks per minute, with bursts of up to 8,500 clicks per second. It must be stored for future analysis by your data science and user experience teams.

Which storage infrastructure should you choose?


  1. Google Cloud SQL
  2. Google Cloud Bigtable
  3. Google Cloud Storage
  4. Google Cloud Datastore

 





Question No:-17

You are creating a solution to remove backup files older than 90 days from your backup Cloud Storage bucket. You want to optimize ongoing Cloud Storage spend.

What should you do?


  1. Write a lifecycle management rule in XML and push it to the bucket with gsutil
  2. Write a lifecycle management rule in JSON and push it to the bucket with gsutil
  3. Schedule a cron script using gsutil ls -lr gs://backups/** to find and remove items older than 90 days
  4. Schedule a cron script using gsutil ls -l gs://backups/** to find and remove items older than 90 days and schedule it with cron
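
For option 2, a lifecycle rule that deletes objects older than 90 days is a small JSON document applied with gsutil. A minimal sketch (the bucket name is a placeholder):

```shell
# lifecycle.json: delete any object 90 days after its creation
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {"action": {"type": "Delete"}, "condition": {"age": 90}}
  ]
}
EOF

# Apply the rule to the backup bucket
gsutil lifecycle set lifecycle.json gs://backups
```

Cloud Storage then enforces the rule server-side, with no cron host to build or maintain.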

 





Question No:-18

Your company is forecasting a sharp increase in the number and size of Apache Spark and Hadoop jobs being run in your local data center. You want to utilize the cloud to help you scale to this upcoming demand with the least amount of operations work and code change.

Which product should you use?


  1. Google Cloud Dataflow
  2. Google Cloud Dataproc
  3. Google Compute Engine
  4. Google Kubernetes Engine
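
With option 2, existing Spark and Hadoop jobs run largely unchanged: Dataproc provisions the managed cluster, and the existing job jar is submitted as-is. A minimal sketch, assuming the Spark jar has been uploaded to Cloud Storage (the cluster, region, class, and jar names are placeholders):

```shell
# Create a managed Spark/Hadoop cluster
gcloud dataproc clusters create spark-cluster --region=us-central1 --num-workers=2

# Submit an existing Spark job with no code changes
gcloud dataproc jobs submit spark \
  --cluster=spark-cluster --region=us-central1 \
  --class=org.example.ClickAnalysis \
  --jars=gs://my-bucket/click-analysis.jar
```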

 





Question No:-19

The database administration team has asked you to help them improve the performance of their new database server running on Google Compute Engine. The database is for importing and normalizing their performance statistics and is built with MySQL running on Debian Linux. They have an n1-standard-8 virtual machine with 80 GB of SSD persistent disk.

What should they change to get better performance from this system?


  1. Increase the virtual machine's memory to 64 GB
  2. Create a new virtual machine running PostgreSQL
  3. Dynamically resize the SSD persistent disk to 500 GB
  4. Migrate their performance metrics warehouse to BigQuery
  5. Modify all of their batch jobs to use bulk inserts into the database
 





Question No:-20

You want to optimize the performance of an accurate, real-time weather-charting application. The data comes from 50,000 sensors, each sending 10 readings a second, in the format of a timestamp and sensor reading.

Where should you store the data?


  1. Google BigQuery
  2. Google Cloud SQL
  3. Google Cloud Bigtable
  4. Google Cloud Storage
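
Bigtable handles this write rate (roughly 500,000 readings per second) well provided the row key distributes load across nodes. A minimal sketch using the cbt tool, assuming a row key of the form sensor_id#reversed_timestamp to keep each sensor's readings contiguous while avoiding hotspotting on the timestamp (the project, instance, table, and key values are all placeholders):

```shell
# Create the table with one column family for readings
cbt -project=my-project -instance=weather createtable readings "families=r"

# Write one reading; row key = sensor id + reversed timestamp
cbt -project=my-project -instance=weather set readings sensor042#8674323999 r:value=21.7
```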

 


















