Example: Searching for Hyperparameters Using Classic Hyperparameter Algorithms
This section describes how to search for hyperparameters by optimizing a black-box function. In general, the hyperparameter search problem is a black-box optimization problem.
Sample Code
import time
import autosearch  # Change 1: Import the AutoSearch package.

def black_box_function(x, y):
    """Function with unknown internals we wish to maximize.

    This is just serving as an example, for all intents and
    purposes think of the internals of this function, i.e.: the process
    which generates its output values, as unknown.
    """
    return -(x ** 2) - (y - 1) ** 2 + 1

def train():
    result = black_box_function(autosearch.config["x"], autosearch.config["y"])  # Change 2: Obtain the parameters delivered by the framework.
    time.sleep(0.2)
    autosearch.reporter(result=result)  # Change 3: Send the result to the AutoSearch framework.
The search objective of the preceding sample code is to find the maximum value of black_box_function.
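The autosearch module is only available inside the ModelArts training environment, but its contract (a config dictionary of delivered parameters and a reporter callback) can be emulated locally to test the training script. The stub class below is a hypothetical stand-in written for illustration; only the names config and reporter mirror the sample above, everything else is an assumption:

```python
def black_box_function(x, y):
    """Toy objective with a known maximum of 1 at (x, y) = (0, 1)."""
    return -(x ** 2) - (y - 1) ** 2 + 1

class _AutoSearchStub:
    """Hypothetical local stand-in for the autosearch module (illustration only)."""
    def __init__(self, config):
        self.config = config      # parameters the framework would deliver
        self.results = []         # metrics the framework would collect
    def reporter(self, **metrics):
        self.results.append(metrics)

# Pretend the framework delivered one trial's hyperparameters.
autosearch = _AutoSearchStub({"x": 1.0, "y": 2.0})

def train():
    result = black_box_function(autosearch.config["x"], autosearch.config["y"])
    autosearch.reporter(result=result)

train()
print(autosearch.results)  # [{'result': -1.0}]
```

Running train() once with x=1.0, y=2.0 reports a result of -1.0; the real framework repeats this loop with different parameter values chosen by the search algorithm.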
Compiling the Configuration File
When the Bayesian optimization algorithm is used, you can configure the YAML file as follows:
general:
    gpu_per_instance: 1
    cpu_per_instance: 1
search_space:
    - type: search_space
      params:
          - type: continuous_param
            name: x
            start: 1
            stop: 4
            num: 2
          - type: continuous_param
            name: y
            start: 1
            stop: 4
            num: 2
search_algorithm:
    type: bayes_opt_search
    reward_attr: result
    max_concurrent: 2
    num_samples: 4
    mode: max
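The continuous_param entries define a continuous range [start, stop] for each hyperparameter. Assuming num controls the granularity of candidate values (evenly spaced across the range, analogous to numpy.linspace; the actual sampling behavior is framework-defined and this helper is only an illustration), the candidates can be sketched as:

```python
def continuous_param(start, stop, num):
    """Assumed behavior: num evenly spaced candidate values in [start, stop].

    This mirrors numpy.linspace; the real AutoSearch framework may sample
    the range differently (e.g. randomly for some algorithms).
    """
    if num == 1:
        return [float(start)]
    step = (stop - start) / (num - 1)
    return [start + i * step for i in range(num)]

print(continuous_param(1, 4, 2))  # [1.0, 4.0]
print(continuous_param(1, 4, 4))  # [1.0, 2.0, 3.0, 4.0]
```

Under this reading, the YAML above gives each of x and y two candidate values, 1 and 4.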
Starting a Search Job
After uploading the preceding script and YAML file to OBS, you can start the job on the page. Because no actual data is required, you can select any existing dataset or an empty OBS directory. For details about how to select other configurations, see Starting a Search Job in Example: Searching for Hyperparameters Using Classic Hyperparameter Algorithms.
Using Other Hyperparameter Algorithms
ModelArts also supports random search, grid search, and other classic hyperparameter search algorithms. To use a different algorithm, you only need to modify search_algorithm in the YAML file. For details about the algorithm parameters, see Table 4, Table 5, and Table 6 in YAML Configuration File Description.
- To use random search, configure the parameters as follows:

  search_algorithm:
      type: random_search
      repeat: 1000
      reward_attr: result

- To use grid search, configure the parameters as follows:

  search_algorithm:
      type: grid_search
      reward_attr: result

  Grid search traverses all possibilities in the search space by default, and is therefore applicable only to scenarios where the search space is not large.
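The reason grid search suits only small search spaces is that it evaluates the Cartesian product of all candidate values, so the trial count grows multiplicatively with each parameter. A minimal sketch (the candidate lists below are hypothetical values for the x and y parameters, not output of the framework):

```python
import itertools

# Hypothetical candidate values for the two hyperparameters.
x_values = [1.0, 4.0]
y_values = [1.0, 4.0]

# Grid search runs one trial per combination.
trials = list(itertools.product(x_values, y_values))
print(len(trials))  # 4
print(trials)       # [(1.0, 1.0), (1.0, 4.0), (4.0, 1.0), (4.0, 4.0)]
```

With just 10 candidates for each of 4 parameters, the same product would already require 10,000 trials, which is why random search or Bayesian optimization is preferred for larger spaces.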