Google Professional Data Engineer Certificate EXAMTOPIC DUMPS Q76-Q80
Q 76.
Government regulations in your industry mandate that you have to maintain an auditable record of access to certain types of data. Assuming that all expiring logs will be archived correctly, where should you store data that is subject to that mandate?
- ❌ A. Encrypted on Cloud Storage with user-supplied encryption keys. A separate decryption key will be given to each authorized user.
- ⭕ B. In a BigQuery dataset that is viewable only by authorized personnel, with the Data Access log used to provide the auditability.
- ❌ C. In Cloud SQL, with separate database user names for each user. The Cloud SQL Admin activity logs will be used to provide the auditability.
- ❌ D. In a bucket on Cloud Storage that is accessible only by an AppEngine service that collects user information and logs the access before providing a link to the bucket.
Use BigQuery Audit Logs to review data access.
Reference: Best practices for working with Google Cloud Audit Logs | Google Cloud Blog
- Cloud Audit Logs are organized into the following 3 streams:
  - Admin activity
  - System event
  - Data access
- The data_access stream contains entries about jobs (via the JobInsertion and JobChange events) and about table data modifications and reads (via the TableDataChange and TableDataRead events). A sketch of listing these entries is shown below.
- See also: Scenarios for exporting logging data: Security and access analytics
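A minimal sketch of listing recent data_access entries, assuming the google-cloud-logging client library and a hypothetical project ID `my-project` (the filter fields are the standard Cloud Audit Logs / BigQueryAuditMetadata names, but verify them against your own logs):

```python
# Minimal sketch: list recent BigQuery data-access audit log entries.
# "my-project" is a hypothetical project ID; adjust the filter as needed.
from google.cloud import logging

client = logging.Client(project="my-project")

# Match the data_access log and entries that record a table read.
log_filter = (
    'logName="projects/my-project/logs/cloudaudit.googleapis.com%2Fdata_access" '
    'AND protoPayload.metadata.tableDataRead:*'
)

for entry in client.list_entries(filter_=log_filter, order_by=logging.DESCENDING):
    # Each entry records who read which table, and when.
    print(entry.timestamp, entry.payload)
```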
Q 77.
Your neural network model is taking days to train. You want to increase the training speed. What can you do?
- ❌ A. Subsample your test dataset.
  → The test dataset is not used during training, so modifying it does not affect training speed.
- ⭕ B. Subsample your training dataset.
  → Fewer training examples per epoch means less computation per epoch and faster training (see the sketch below).
- ❌ C. Increase the number of input features to your model.
- ❌ D. Increase the number of layers in your neural network.
  → Options C and D add parameters and computation, which slows training down rather than speeding it up.
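A minimal sketch of subsampling a training set, assuming NumPy arrays `X` and `y` (names chosen here for illustration):

```python
import numpy as np

def subsample_training_set(X, y, fraction=0.1, seed=42):
    """Return a random subsample of the training data.

    Training on a fraction of the examples shortens each epoch,
    at the cost of some model quality.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    idx = rng.choice(n, size=int(n * fraction), replace=False)
    return X[idx], y[idx]

# Example: keep 10% of a (synthetic) training set.
X = np.random.rand(100_000, 20)
y = np.random.randint(0, 2, size=100_000)
X_small, y_small = subsample_training_set(X, y, fraction=0.1)
print(X_small.shape)  # (10000, 20)
```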
Q 79.
Your company maintains a hybrid deployment with GCP, where analytics are performed on your anonymized customer data. The data are imported to Cloud Storage from your data center through parallel uploads to a data transfer server running on GCP. Management informs you that the daily transfers take too long and has asked you to fix the problem. You want to maximize transfer speeds. Which action should you take?
- ❌ A. Increase the CPU size on your server.
- ❌ B. Increase the size of the Google Persistent Disk on your server.
- ⭕ C. Increase your network bandwidth from your datacenter to GCP.
- ❌ D. Increase your network bandwidth from Compute Engine to Cloud Storage.
  → The bottleneck is the link from the data center to GCP, not the network inside GCP.
The speed of a data transfer depends on network bandwidth.
- How long a transfer takes is determined by the size of the dataset you're moving and the bandwidth available for the transfer.
- For instance, for a 100 PB dataset, lower bandwidth means a dramatically longer transfer time; a rough estimate is sketched below.
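A rough back-of-the-envelope estimate (the 100 PB figure comes from the note above; the link speeds are illustrative assumptions):

```python
# Rough transfer-time estimate: time = data size / bandwidth.
# The 100 PB dataset size is from the note above; the link speeds
# below are illustrative assumptions.

dataset_bits = 100 * 10**15 * 8  # 100 PB expressed in bits

for gbps in (1, 10, 100):
    bandwidth_bps = gbps * 10**9
    seconds = dataset_bits / bandwidth_bps
    days = seconds / 86_400
    print(f"{gbps:>3} Gbps -> about {days:,.0f} days")

# Output (approximate):
#   1 Gbps -> about 9,259 days  (~25 years)
#  10 Gbps -> about 926 days    (~2.5 years)
# 100 Gbps -> about 93 days
```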
Q 80.
MJTelco Case Study (Continued)
- Full Scenario : [PDE CERTIFICATE - EXAMTOPIC] DUMPS Q36-Q40
- MJTelco is building a custom interface to share data. They have these requirements:
  - They need to do aggregations over their petabyte-scale datasets.
  - They need to scan specific time range rows with a very fast response time (milliseconds).
Which combination of Google Cloud Platform products should you recommend?
- ❌ A. Cloud Datastore and Cloud Bigtable
- ❌ B. Cloud Bigtable and Cloud SQL
- ⭕ C. BigQuery and Cloud Bigtable
  → BigQuery handles aggregations over petabyte-scale datasets.
  → Bigtable handles scans over rows in a specific time range with very low latency (milliseconds); see the sketch below.
- ❌ D. BigQuery and Cloud Storage
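A minimal sketch of a time-range row scan in Cloud Bigtable, assuming the google-cloud-bigtable client library and hypothetical project/instance/table names, with row keys that start with a sortable date such as `20220216#<device_id>`:

```python
# Minimal sketch: scan a specific time range of rows in Cloud Bigtable.
# "my-project", "mjtelco-instance", and "telemetry" are hypothetical names;
# row keys are assumed to begin with a sortable timestamp prefix.
from google.cloud import bigtable

client = bigtable.Client(project="my-project", admin=False)
instance = client.instance("mjtelco-instance")
table = instance.table("telemetry")

# A row-range scan keyed by timestamp returns results with millisecond latency.
rows = table.read_rows(start_key=b"20220216#", end_key=b"20220217#")
for row in rows:
    # Print the row key and its cells (keyed by column family and qualifier).
    print(row.row_key, row.cells)
```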