Google Professional Data Engineer Certificate EXAMTOPIC DUMPS Q76-Q80

Q 76.

Government regulations in your industry mandate that you maintain an auditable record of access to certain types of data. Assuming that all expiring logs will be archived correctly, where should you store data that is subject to that mandate?

  • ❌ A. Encrypted on Cloud Storage with user-supplied encryption keys. A separate decryption key will be given to each authorized user.
  • B. In a BigQuery dataset that is viewable only by authorized personnel, with the Data Access log used to provide the auditability.
  • ❌ C. In Cloud SQL, with separate database user names to each user. The Cloud SQL Admin activity logs will be used to provide the auditability.
  • ❌ D. In a bucket on Cloud Storage that is accessible only by an AppEngine service that collects user information and logs the access before providing a link to the bucket.

Correct answer: B. BigQuery Data Access audit logs record every query and read against the dataset, which provides the auditable access record the regulation requires (see the sketch below).
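As a quick illustration, here is a minimal Python sketch of listing those Data Access audit log entries with the google-cloud-logging client library. The project ID is a placeholder, and it assumes the caller has log-viewing permission on the project.

```python
# Minimal sketch: list BigQuery Data Access audit log entries.
# "my-project" is a placeholder project ID, not from the question.
from google.cloud import logging

client = logging.Client(project="my-project")

# BigQuery Data Access audit logs are written under this log name.
log_filter = (
    'logName="projects/my-project/logs/'
    'cloudaudit.googleapis.com%2Fdata_access" '
    'AND protoPayload.serviceName="bigquery.googleapis.com"'
)

for entry in client.list_entries(filter_=log_filter, page_size=20):
    payload = entry.payload or {}  # protoPayload, deserialized to a dict
    who = payload.get("authenticationInfo", {}).get("principalEmail")
    print(entry.timestamp, who, payload.get("methodName"))
```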

Q 77.

Your neural network model is taking days to train. You want to increase the training speed. What can you do?

  • ❌ A. Subsample your test dataset.
    Subsampling the test set only changes evaluation; it does nothing to speed up training.
  • B. Subsample your training dataset.
    Fewer training examples mean less computation per epoch, so training finishes sooner (see the sketch after this list).
  • ❌ C. Increase the number of input features to your model.
  • ❌ D. Increase the number of layers in your neural network.
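A minimal NumPy sketch of subsampling a training set; X, y, and the 10% fraction are illustrative placeholders, not part of the question.

```python
# Minimal sketch: randomly subsample a training set to cut training time.
# X and y are placeholder arrays standing in for a large training set.
import numpy as np

rng = np.random.default_rng(seed=42)

def subsample(X, y, fraction=0.1):
    """Return a random fraction of the examples, keeping X and y aligned."""
    n = int(len(X) * fraction)
    idx = rng.choice(len(X), size=n, replace=False)
    return X[idx], y[idx]

X = rng.normal(size=(1_000_000, 20))
y = rng.integers(0, 2, size=1_000_000)

X_small, y_small = subsample(X, y)
print(X_small.shape)  # (100000, 20): ~10x fewer examples per epoch
```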

Q 79.

Your company maintains a hybrid deployment with GCP, where analytics are performed on your anonymized customer data. The data are imported to Cloud Storage from your data center through parallel uploads to a data transfer server running on GCP. Management informs you that the daily transfers take too long and has asked you to fix the problem. You want to maximize transfer speeds. Which action should you take?

  • ❌ A. Increase the CPU size on your server.
  • ❌ B. Increase the size of the Google Persistent Disk on your server.
  • C. Increase your network bandwidth from your datacenter to GCP.
  • ❌ D. Increase your network bandwidth from Compute Engine to Cloud Storage.

Speed of data transfer depends on network bandwidth.

Few things in computing highlight the hardware limitations of networks as much as transferring large amounts of data.

  • How long a transfer takes depends on the size of the dataset you're moving and the bandwidth available for the transfer.
    • For instance, moving a 100 PB dataset over a 1 Gbps link would take roughly 25 years; the higher the bandwidth, the shorter the transfer time (see the sketch below).
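The arithmetic behind that estimate is simple; here is a small Python sketch (the dataset size and link speeds are illustrative):

```python
# Minimal sketch: transfer time = bytes * 8 / bandwidth (bits per second).
def transfer_days(dataset_bytes: float, bandwidth_bps: float) -> float:
    """Days needed to move dataset_bytes over a link of the given bandwidth."""
    return dataset_bytes * 8 / bandwidth_bps / 86_400

PB = 10**15
for gbps in (1, 10, 100):
    days = transfer_days(100 * PB, gbps * 10**9)
    print(f"100 PB over {gbps:>3} Gbps: ~{days:,.0f} days")
# 100 PB over   1 Gbps: ~9,259 days (about 25 years)
# 100 PB over  10 Gbps: ~926 days
# 100 PB over 100 Gbps: ~93 days
```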

Q 80.

MJTelco Case Study (Continued)
  • Full scenario: [PDE CERTIFICATE - EXAMTOPIC] DUMPS Q36-Q40
  • MJTelco is building a custom interface to share data. They have these requirements:
    1. They need to do aggregations over their petabyte-scale datasets.
    2. They need to scan specific time range rows with a very fast response time (milliseconds).

Which combination of Google Cloud Platform products should you recommend?

  • ❌ A. Cloud Datastore and Cloud Bigtable
  • ❌ B. Cloud Bigtable and Cloud SQL
  • C. BigQuery and Cloud Bigtable
    BigQuery handles aggregations over petabyte-scale datasets.
    Cloud Bigtable scans the rows for a specific time range (encoded in the row key) with millisecond latency; see the sketch after this list.
  • ❌ D. BigQuery and Cloud Storage
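A minimal sketch of the Bigtable side of that design, using the google-cloud-bigtable client; the project, instance, table, and timestamp-prefixed row-key format are all hypothetical, not from the case study:

```python
# Minimal sketch: bounded row-range scan on Cloud Bigtable.
# Project, instance, table, and key format below are hypothetical.
from google.cloud import bigtable

client = bigtable.Client(project="mjtelco-project", admin=False)
instance = client.instance("telemetry-instance")
table = instance.table("metrics")

# With timestamps encoded in the row key, a time-range query becomes a
# bounded range scan, which is what gives Bigtable millisecond reads.
rows = table.read_rows(
    start_key=b"20240101T000000",
    end_key=b"20240101T010000",
)
for row in rows:
    print(row.row_key)
```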