AI Platform for Hyperparameter Tuning
Cloud AI Platform: the serverless platform for training and hosting ML models
AI Platform abstracts away the process of hyperparameter tuning.
1. Express each hyperparameter in need of tuning as a command-line argument.
e.g., with two hyperparameters: the number of buckets to discretize latitude and longitude, and the number of hidden units in the deep neural network. (Steps 1 and 2 are illustrated in the sketch after this list.)
2. Make sure that outputs from different iterations of training don't clobber each other.
e.g., append a suffix such as the trial number to the output path, i.e. employ a good naming convention for the output folders.
3. Supply the hyperparameters to the training job.
→ Create a YAML file → supply the path to the YAML file via command-line parameters to the gcloud ai-platform command.
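A minimal sketch of steps 1 and 2, assuming a Python trainer. The flag names `--nbuckets` and `--nnsize` are illustrative; reading the trial number from the `TF_CONFIG` environment variable is the documented behavior of the AI Platform training service during hyperparameter tuning, but verify it against your runtime version.

```python
# Sketch only: expose tunable hyperparameters as command-line flags (step 1)
# and give every trial its own output folder so concurrent trials don't
# clobber each other's checkpoints (step 2).
import argparse
import json
import os


def parse_args():
    parser = argparse.ArgumentParser()
    # Step 1: each hyperparameter to tune becomes a command-line argument.
    parser.add_argument('--nbuckets', type=int, default=10,
                        help='Buckets to discretize latitude/longitude')
    parser.add_argument('--nnsize', type=int, default=32,
                        help='Hidden units in the deep neural network')
    parser.add_argument('--output_dir', required=True,
                        help='Base GCS path for checkpoints and exports')
    return parser.parse_args()


def trial_output_dir(base_dir):
    # Step 2: suffix the output path with the trial number (exposed through
    # the TF_CONFIG environment variable during hyperparameter tuning).
    trial_id = json.loads(os.environ.get('TF_CONFIG', '{}')) \
                   .get('task', {}).get('trial', '')
    return os.path.join(base_dir, str(trial_id)) if trial_id else base_dir


if __name__ == '__main__':
    args = parse_args()
    output_dir = trial_output_dir(args.output_dir)
    print(f'nbuckets={args.nbuckets}, nnsize={args.nnsize}, '
          f'output_dir={output_dir}')
    # ... build and train the model, writing checkpoints to output_dir ...
```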
ExamTopics Q40. How to speed up hyperparameter tuning on AI Platform without compromising its effectiveness
You have a functioning end-to-end ML pipeline that involves tuning the hyperparameters of your ML model using AI Platform, and then using the best-tuned parameters for training. Hypertuning is taking longer than expected and is delaying the downstream processes. You want to speed up the tuning job without significantly compromising its effectiveness. Which actions should you take? (Choose two.)
- A. Decrease the number of parallel trials. ❌
- B. Decrease the range of floating-point values. ⭕
- C. Set the early stopping parameter to TRUE. ⭕
- D. Change the search algorithm from Bayesian search to random search. ❌
- E. Decrease the maximum number of trials during subsequent training phases.
Hyperparameter tuning - parallel trials
AI Platform > using hyperparameter tuning > Running parallel trials
Running parallel trials has the benefit of reducing the time the training job takes (real time—the total processing time required is not typically changed). However, running in parallel can reduce the effectiveness of the tuning job overall. That is because hyperparameter tuning uses the results of previous trials to inform the values to assign to the hyperparameters of subsequent trials. When running in parallel, some trials start without having the benefit of the results of any trials still running.
Faster tuning with early stopping
Hyperparameter tuning on Google Cloud Platform is now faster and smarter
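To show how the two settings above (parallel trials and early stopping) map onto the tuning job from step 3, here is a sketch of the `trainingInput.hyperparameters` block of the YAML config passed to gcloud, written as a Python dict. The field names follow the AI Platform HyperparameterSpec, but the metric name, trial counts, and parameter ranges are illustrative assumptions.

```python
# Sketch of the hyperparameter tuning spec; the same keys go under
# trainingInput.hyperparameters in the YAML file supplied to
# `gcloud ai-platform jobs submit training --config=...`.
hyperparameters = {
    'goal': 'MINIMIZE',
    'hyperparameterMetricTag': 'rmse',   # metric reported by the trainer (illustrative)
    'maxTrials': 30,
    'maxParallelTrials': 5,              # more parallelism = less wall-clock time,
                                         # but fewer finished trials to inform
                                         # the Bayesian search
    'enableTrialEarlyStopping': True,    # stop unpromising trials early (answer C)
    # 'algorithm' left unspecified -> Bayesian optimization (the default);
    # GRID_SEARCH and RANDOM_SEARCH are the other options.
    'params': [
        {'parameterName': 'nbuckets', 'type': 'INTEGER',
         'minValue': 10, 'maxValue': 30, 'scaleType': 'UNIT_LINEAR_SCALE'},
        {'parameterName': 'nnsize', 'type': 'INTEGER',
         'minValue': 16, 'maxValue': 128, 'scaleType': 'UNIT_LOG_SCALE'},
    ],
}
```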
Hyperparameter tuning - Bayesian optimization
Hyperparameter tuning in Cloud Machine Learning Engine using Bayesian Optimization
Grid search exhaustively searches through the hyperparameters and is not feasible in high-dimensional space. By contrast, random search, which simply samples the search space randomly, does not have this problem, and is widely used in practice. The downside of random search, however, is that it doesn't use information from prior experiments to select the next setting. This is particularly undesirable when the cost of running experiments is high and you want to make an educated decision on what experiment to run next. It's this problem which Bayesian optimization will help solve.
- Bayesian search works better and faster than random search since it is selective about which points to evaluate and uses knowledge of previously evaluated points.
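To make the grid-vs-random contrast concrete, here is a small illustrative sketch (not AI Platform code): grid search must try every combination, so its cost grows exponentially with the number of hyperparameters, while random search spends a fixed budget regardless of dimensionality; Bayesian optimization then additionally reuses earlier results to pick the next point.

```python
# Illustrative only: trial counts for grid search vs. a fixed random-search budget.
import random


def grid_search_size(values_per_param, n_params):
    # Grid search evaluates every combination: cost is values_per_param ** n_params.
    return values_per_param ** n_params


def random_search(space, budget, seed=0):
    # Random search samples `budget` points from the space, no matter how many
    # hyperparameters there are (but it ignores the results of past trials).
    rng = random.Random(seed)
    return [{name: rng.choice(list(vals)) for name, vals in space.items()}
            for _ in range(budget)]


space = {'nbuckets': range(10, 31),
         'nnsize': [16, 32, 64, 128],
         'learning_rate': [0.001, 0.01, 0.1]}

print(grid_search_size(5, 2))          # 25 trials for 2 params with 5 values each
print(grid_search_size(5, 10))         # 9,765,625 trials for 10 params
print(random_search(space, budget=5))  # always 5 trials, whatever the dimension
```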