Introduction to Auto Search Jobs
Based on the ModelArts platform, the auto search service integrates multiple AutoML technologies, such as automatic data augmentation, automatic hyperparameter search, and neural architecture search, helping you obtain AutoML capabilities at low cost and optimize and accelerate models in real services.
If you have service model code that defines a complete training, evaluation, and model export process, but you are not satisfied with the model's precision, inference speed, or training time, you can slightly modify the code according to Code Compilation Specifications so that the auto search service can run as an upper layer over your code. The service then calls your code multiple times, varying the model architecture and parameters and training and evaluating each candidate, to finally obtain a model with higher precision and faster speed.
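Conceptually, the search service needs your code to expose a repeatable train-and-evaluate step it can call with different candidate settings. The sketch below is purely illustrative: the function name, arguments, and simulated metric are hypothetical and do not follow the actual ModelArts code compilation specification.

```python
# Illustrative only: the entry-point shape and metric reporting here are
# hypothetical, not the actual ModelArts code compilation specification.
import random

def train_eval(hparams):
    """Run one training/evaluation with the given candidate settings.

    The upper-layer search service would call this repeatedly, each time
    passing a different architecture/hyperparameter combination, and keep
    the candidate with the best returned metric.
    """
    lr = hparams.get("learning_rate", 0.01)
    # ... build the model, train, evaluate, export ...
    # The "accuracy" is simulated here so the sketch is runnable.
    accuracy = 1.0 - abs(lr - 0.05) - random.uniform(0.0, 0.01)
    return accuracy

# The search service varies the inputs and keeps the best-scoring run.
best = max(
    ({"learning_rate": lr} for lr in (0.001, 0.01, 0.05, 0.1)),
    key=train_eval,
)
```

The key point is that training and evaluation are packaged as one callable unit with its tunable settings passed in from outside, rather than hard-coded.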
Types of Auto Search Jobs
ModelArts supports the following search modes:
- Neural network architecture search (pure NAS search): In image classification and object detection scenarios, ModelArts provides several ResNet structures. You can replace the original ResNet structures with these structures to improve accuracy or performance. Such a search costs 4 to 10 times the time consumed by a single network training. In addition, ModelArts supports arbitrary architecture search: it provides a self-developed search algorithm (MBNAS) to search for a better architecture in any search space.
- Hyperparameter search: ModelArts provides multiple classic hyperparameter search algorithms, which can be used to search for common hyperparameters in deep learning, such as learning_rate.
- Auto data augmentation policy search: ModelArts provides an optimal preset pre-processing policy and a policy decoder that can replace the data augmentation module in your code. You can also use the ModelArts decoder to search for an augmentation policy better suited to your scenario.
- Multisearch (any combination of the preceding three): NAS search, hyperparameter search, and auto data augmentation policy search can be used together to obtain better architectures, hyperparameters, and augmentation policies, at a higher cost.
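The hyperparameter search mode above can be illustrated with a plain random search over a black-box objective. This generic sketch uses no ModelArts API; the objective function, parameter names, and value ranges are all illustrative stand-ins.

```python
import random

def objective(learning_rate, batch_size):
    """Stand-in for one full train-and-evaluate run of the user's code.

    A real run would train the model with these hyperparameters and
    return the validation metric; here the score is simulated.
    """
    return 1.0 - abs(learning_rate - 0.03) - 0.001 * abs(batch_size - 64)

def random_search(trials=20, seed=0):
    """Sample random configurations and keep the best-scoring one."""
    rng = random.Random(seed)
    best_score, best_cfg = float("-inf"), None
    for _ in range(trials):
        cfg = {
            "learning_rate": 10 ** rng.uniform(-4, -1),  # log-uniform sample
            "batch_size": rng.choice([16, 32, 64, 128]),
        }
        score = objective(**cfg)
        if score > best_score:
            best_score, best_cfg = score, cfg
    return best_cfg, best_score

best_cfg, best_score = random_search()
```

Classic algorithms such as the ones ModelArts provides replace the random sampling step with smarter proposal strategies, but the overall loop of propose, evaluate, and keep the best is the same.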
Using an Auto Search Job for the First Time
If you use auto search for the first time, the following information helps you get familiar with the auto search function:
- First, refer to Types of Auto Search Jobs to determine the auto search job that your business needs.
- Then, view the following examples to learn about ModelArts auto search jobs based on the sample code and YAML configuration file. For coding requirements, refer to Code Compilation Specifications and YAML Configuration File Description.
  - Example: Replacing the Original ResNet-50 with a Better Network Architecture
  - Example: Searching for Hyperparameters Using Classic Hyperparameter Algorithms
  - Example: Searching for Network Architectures Using the MBNAS Algorithm
  - Example: Implementing Auto Data Augmentation Using a Preset Data Augmentation Policy
  - Example: Using Multisearch
- Prepare the boot script and YAML configuration file of the job. Refer to Creating an Auto Search Job to start an auto search job and view the search result.
Functions of Auto Search Jobs
| Function | Sub-Function | Description | Operation Guide |
|---|---|---|---|
| Job management | Auto search job creation | Create an auto search job. | Creating an Auto Search Job |
| Job management | Job stopping or deletion | Stop a running auto search job, or delete auto search jobs that are no longer needed. The operation procedure is similar to that of a training job. For details, see the related training job guide. | |
| Job management | Job version management | ModelArts allows you to manage job versions to effectively train your model after each tuning. ModelArts generates a version each time a training is performed, so you can compare versions to quickly see the differences. The operation procedure is similar to that of a training job. For details, see the related training job guide. | |
| Job management | Job details viewing | After a training job finishes, in addition to managing job versions, you can check whether the job meets your expectations by viewing the job details, logs, and search results. The operation procedure is similar to that of a training job. For details, see the related training job guide. | |
| Job management | Job parameter management | Store parameter settings in ModelArts during job creation so that the stored settings can be reused for follow-up jobs, making job creation more efficient. The operation procedure is similar to that of a training job. For details, see the related training job guide. | |
| Encoding rules | Code compilation specifications | Slightly modify your existing model training code based on the code compilation specifications of the auto search service so that the upper-layer search service can call your code for training and evaluation. | Code Compilation Specifications |
| Encoding rules | YAML configuration file description | To use hyperparameter search or multisearch, provide a YAML configuration file that configures the control information required by the search. | YAML Configuration File Description |
| Examples | Replacing the original ResNet-50 with a better network architecture | Uses the classification task of ResNet-50 on the MNIST dataset as an example to describe how to prepare data, download code, modify code, create a job, and view search results end to end. | Example: Replacing the Original ResNet-50 with a Better Network Architecture |
| Examples | Searching for hyperparameters using classic hyperparameter algorithms | Uses a hyperparameter algorithm to optimize a black-box function. | Example: Searching for Hyperparameters Using Classic Hyperparameter Algorithms |
| Examples | Searching for network architectures using the MBNAS algorithm | Uses the MBNAS algorithm with simulated data similar to MNIST. | Example: Searching for Network Architectures Using the MBNAS Algorithm |
| Examples | Implementing auto data augmentation using a preset data augmentation policy | Uses the classification task of ResNet-50 on the MNIST dataset as an example to describe how to use the data augmentation policy. | Example: Implementing Auto Data Augmentation Using a Preset Data Augmentation Policy |
| Examples | Multisearch | Uses the classification task of ResNet-50 on the MNIST dataset as an example to describe how to use the multisearch function. | Example: Using Multisearch |
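To give a feel for the kind of control information a search configuration file carries, the following is a generic YAML sketch. The keys and values below are purely illustrative and do not follow the actual ModelArts YAML schema; for the real format, see YAML Configuration File Description.

```yaml
# Illustrative only -- not the ModelArts YAML schema.
search:
  algorithm: bayes            # e.g. one of the classic hyperparameter algorithms
  max_trials: 20              # number of train/eval runs to launch
  metric:
    name: accuracy
    goal: maximize            # search direction for the reported metric
  parameters:
    learning_rate:
      type: float
      range: [0.0001, 0.1]
    batch_size:
      type: choice
      values: [16, 32, 64, 128]
```

Whatever the concrete schema, such a file typically names the search algorithm, bounds the search budget, declares the metric to optimize, and defines the search space for each tunable parameter.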