[PMLE CERTIFICATE - EXAMTOPIC] DUMPS Q41-Q44

ExamTopics Dumps Q41-Q44: Firebase vs. Pub/Sub, BigQuery with pandas DataFrame, Feature Crosses, AI Platform vs. AutoML vs. ML API vs. Custom Model

Q 41.

Your team is building an application for a global bank that will be used by millions of customers. You built a forecasting model that predicts customers' account balances 3 days in the future. Your team will use the results in a new feature that will notify users when their account balance is likely to drop below $25. How should you serve your predictions?

Firebase vs. Pub/Sub
  • ❌ A. 1. Create a Pub/Sub topic for each user. 2. Deploy a Cloud Function that sends a notification when your model predicts that a user's account balance will drop below the $25 threshold.
  • ❌ B. 1. Create a Pub/Sub topic for each user. 2. Deploy an application on the App Engine standard environment that sends a notification when your model predicts that a user's account balance will drop below the $25 threshold.
  • ❌ C. 1. Build a notification system on Firebase. 2. Register each user with a user ID on the Firebase Cloud Messaging server, which sends a notification when the average of all account balance predictions drops below the $25 threshold.
  • D. 1. Build a notification system on Firebase. 2. Register each user with a user ID on the Firebase Cloud Messaging server, which sends a notification when your model predicts that a user's account balance will drop below the $25 threshold.
  • A Pub/Sub topic per user (A, B) is overkill; Firebase Cloud Messaging is designed to push per-user notifications to millions of devices (see the sketch below).
  • C fails because it triggers on the average of all users' predictions rather than on each individual user's forecast.
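
A minimal sketch of the winning pattern (D), assuming the 3-day forecast is already computed for each user and each user's device is registered with FCM; the function name, token variable, and message text are illustrative, not part of the exam answer:

    # Hypothetical sketch: push a low-balance alert through Firebase Cloud Messaging
    # using the firebase-admin SDK.
    import firebase_admin
    from firebase_admin import messaging

    firebase_admin.initialize_app()  # uses Application Default Credentials

    THRESHOLD = 25.0  # dollars

    def notify_if_low(registration_token: str, predicted_balance: float) -> None:
        # Send an FCM notification only when the 3-day forecast dips below $25.
        if predicted_balance >= THRESHOLD:
            return
        message = messaging.Message(
            notification=messaging.Notification(
                title="Low balance alert",
                body=f"Your balance may drop to ${predicted_balance:.2f} within 3 days.",
            ),
            token=registration_token,  # the user's device token from FCM registration
        )
        messaging.send(message)  # returns a message ID string on success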

Q 42.

You work for an advertising company and want to understand the effectiveness of your company's latest advertising campaign. You have streamed 500 MB of campaign data into BigQuery. You want to query the table, and then manipulate the results of that query with a pandas dataframe in an AI Platform notebook. What should you do?

BigQuery with pandas DataFrame
  • A. Use AI Platform Notebooks' BigQuery cell magic to query the data, and ingest the results as a pandas DataFrame.
  • ❌ B. Export your table as a CSV file from BigQuery to Google Drive, and use the Google Drive API to ingest the file into your notebook instance.
  • ❌ C. Download your table from BigQuery as a local CSV file, and upload it to your AI Platform notebook instance. Use pandas.read_csv to ingest the file as a pandas DataFrame.
  • ❌ D. From a bash cell in your AI Platform notebook, use the bq extract command to export the table as a CSV file to Cloud Storage, and then use gsutil cp to copy the data into the notebook. Use pandas.read_csv to ingest the file as a pandas DataFrame.
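
Why A: the query runs in BigQuery and the result lands directly in the notebook as a DataFrame, while B, C, and D all add an export-and-copy round trip. A minimal sketch of the cell magic; the table and column names are hypothetical:

    # Cell 1: load the BigQuery cell magic (ships with google-cloud-bigquery,
    # preinstalled on AI Platform Notebooks)
    %load_ext google.cloud.bigquery

Then, in a separate cell (the magic must be the cell's first line), %%bigquery runs the SQL and binds the result to a pandas DataFrame named df:

    %%bigquery df
    SELECT campaign_id, SUM(clicks) AS total_clicks
    FROM `my_project.campaign.events`
    GROUP BY campaign_id

After the cell runs, df behaves like any pandas DataFrame (df.head(), df.describe(), ...).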

Q 43.

You are an ML engineer at a global car manufacturer. You need to build an ML model to predict car sales in different cities around the world. Which features or feature crosses should you use to train city-specific relationships between car type and number of sales?

Feature Crosses
  • ❌ A. Three individual features: binned latitude, binned longitude, and one-hot encoded car type.
    → City-specific relationships require the conjunction of latitude and longitude; kept as three separate features, the model cannot learn a weight tied to one particular city.
  • ❌ B. One feature obtained as an element-wise product between latitude, longitude, and car type.
    → Crossing raw (unbinned) continuous values yields a single meaningless number; the coordinates must be binned first.
  • C. One feature obtained as an element-wise product between binned latitude, binned longitude, and one-hot encoded car type.
  • ❌ D. Two feature crosses as an element-wise product: the first between binned latitude and one-hot encoded car type, and the second between binned longitude and one-hot encoded car type.
    → Latitude and longitude are split across separate crosses, so no single feature identifies a city.
    → City-specific relationships require the conjunction of latitude and longitude in one cross.

Binning Features & Crossing Features

  • Feature Engineering - Feature Crosses
  • Binning features enables the model to learn nonlinear relationships within a single feature.
  • By the same logic, crossing binned latitude with binned longitude (and with one-hot encoded car type) enables the model to learn city-specific effects of car type (see the sketch below).
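
A minimal sketch of option C in pandas, assuming a hypothetical sales DataFrame with lat, lon, and car_type columns; the bucket count and sample values are illustrative:

    import pandas as pd

    # Hypothetical input: one row per sale.
    df = pd.DataFrame({
        "lat": [48.85, 40.71, 48.86],
        "lon": [2.35, -74.01, 2.36],
        "car_type": ["suv", "sedan", "suv"],
    })

    # Bin latitude and longitude so nearby points share a bucket.
    df["lat_bin"] = pd.cut(df["lat"], bins=100, labels=False)
    df["lon_bin"] = pd.cut(df["lon"], bins=100, labels=False)

    # Feature cross: the conjunction of lat bucket x lon bucket x car type.
    # Each distinct combination becomes its own one-hot column, so the model
    # can learn a city-specific weight per car type.
    df["cross"] = (
        df["lat_bin"].astype(str) + "_" + df["lon_bin"].astype(str) + "_" + df["car_type"]
    )
    crossed = pd.get_dummies(df["cross"], prefix="city_car")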

Q 44.

You work for a large technology company that wants to modernize their contact center. You have been asked to develop a solution to classify incoming calls by product so that requests can be more quickly routed to the correct support team. You have already transcribed the calls using the Speech-to-Text API. You want to minimize data preprocessing and development time. How should you build the model?

AI Platform vs. AutoML vs. ML API vs Custom Model
  • Goal: minimize preprocessing and development time for a classifier over the transcripts produced by the Speech-to-Text API.
    • Compare development time: AI Platform vs. AutoML vs. ML API vs. custom model.
    • Task: text classification by product.
  • ❌ A. Use the AI Platform Training built-in algorithms to create a custom model.
    → Built-in algorithms still require you to prepare the training data and manage training jobs yourself; more development time than AutoML.
  • B. Use AutoML Natural Language to extract custom entities for classification.
    → AutoML trains a custom model that classifies content into categories defined for your specific needs.
    • Working with text → AutoML Natural Language.
    • Custom entities/categories = your products.
  • ❌ C. Use the Cloud Natural Language API to extract custom entities for classification.
    → The Natural Language API reveals the structure and meaning of text using thousands of pretrained categories; it cannot be taught your custom product labels.
  • ❌ D. Build a custom model to identify the product keywords from the transcribed calls, and then run the keywords through a classification algorithm.
    → The most development-heavy choice: a hand-built keyword extractor plus a separate classifier, with all preprocessing done manually.
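
A minimal sketch of serving option B once a model is trained, using the google-cloud-automl client; the project ID, model ID, and transcript are hypothetical placeholders:

    from google.cloud import automl

    # Hypothetical identifiers for a trained AutoML Natural Language classifier.
    PROJECT_ID = "my-project"
    MODEL_ID = "TCN1234567890"

    client = automl.PredictionServiceClient()
    model_name = automl.AutoMlClient.model_path(PROJECT_ID, "us-central1", MODEL_ID)

    # One transcribed call from the Speech-to-Text pipeline.
    transcript = "Hi, my smart thermostat keeps disconnecting from the app..."

    payload = automl.ExamplePayload(
        text_snippet=automl.TextSnippet(content=transcript, mime_type="text/plain")
    )
    response = client.predict(name=model_name, payload=payload)

    # Each result is a product category with a confidence score; route the call
    # to the support team that owns the top-scoring category.
    for result in response.payload:
        print(result.display_name, result.classification.score)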