Creating an AI Application
Function
Import a meta model to create an AI application. The execution code and model must be uploaded to OBS first. By default, the model generated by a training job is stored in OBS.
Constraints
The body parameter requirements for importing a model using a template differ from those for importing a model without using a template. In the following body parameters, template parameters are the parameters that can be configured when a model is imported using a template, non-template parameters are the parameters that can be configured when a model is imported without using a template, and common parameters are the parameters that apply regardless of the import mode. The two modes are contrasted in the sketches after this list.
- When a model is imported using a template (model_type is set to Template), the template field is mandatory and the source_location field does not need to be configured.
- When a model is imported without using a template (model_type is not set to Template), source_location is mandatory and template does not need to be configured.
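As a minimal, hedged sketch (all field values are placeholders in the style of Example Requests; the template object is abbreviated here and detailed in the template table below), the two request bodies differ mainly in the template and source_location fields.

Template import:

{
  "model_name" : "mnist",
  "model_version" : "1.0.0",
  "model_type" : "Template",
  "template" : { ... }
}

Import without a template:

{
  "model_name" : "mnist",
  "model_version" : "1.0.0",
  "model_type" : "TensorFlow",
  "source_location" : "https://models.obs.xxxxx.com/mnist"
}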
URI
POST /v1/{project_id}/models
Parameter | Mandatory | Type | Description
---|---|---|---
project_id | Yes | String | Project ID. For details, see Obtaining a Project ID and Name.
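For example, with a hypothetical project ID (the value below is illustrative, not a real ID), the request line would look like this:

POST https://{endpoint}/v1/0a1b2c3d4e5f60718293a4b5c6d7e8f9/models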
Request Parameters
Request header parameters:

Parameter | Mandatory | Type | Description
---|---|---|---
X-Auth-Token | Yes | String | User token. It can be obtained by calling the IAM API used to obtain a user token. The value of X-Subject-Token in the response header is the user token.
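A hedged sketch of the request line and headers: {user_token} is a placeholder for the token returned by IAM, and the Content-Type header is assumed for a JSON body (it is not listed in the table above).

POST https://{endpoint}/v1/{project_id}/models
Content-Type: application/json
X-Auth-Token: {user_token}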
Request body parameters:

Parameter | Mandatory | Type | Description
---|---|---|---
model_docs | No | Array of GuideDoc objects | List of model description documents. A maximum of three documents are supported. Common parameter. If model_type is set to Custom, this parameter is invalid.
template | No | template object | Configuration items in a template. This parameter is mandatory when model_type is set to Template. Template parameter. If model_type is set to Custom, this parameter is invalid.
model_version | Yes | String | Model version in the format Digit.Digit.Digit. Each digit is a one- or two-digit positive integer and cannot start with 0 (for example, 01.01.01 is not allowed). Common parameter.
source_job_version | No | String | Version of the source training job. If the model is generated from a training job, set this parameter for source tracing. If the model is imported from a third-party meta model, leave it blank. It is left blank by default. Non-template parameter. If model_type is set to Custom, this parameter is invalid.
source_location | Yes | String | OBS path where the model is located, or the SWR image location. If model_type is set to Custom, set source_location to the OBS path to the model.
source_copy | No | String | Whether to enable image replication. This parameter is valid only when model_type is set to Image.
initial_config | No | String | Character string converted from the model configuration file. Fields such as apis, dependencies, input_params, output_params, and health are obtained from the initial_config configuration file. Non-template parameter. If model_type is set to Custom, this parameter is invalid.
execution_code | No | String | OBS path for storing the execution code. By default, this parameter is left blank. The execution code file must be named customize_service.py, and the inference code file must be stored in the model directory. If this parameter is left blank, the system automatically identifies the inference code in the model directory. Common parameter. If model_type is set to Custom, this parameter is invalid.
source_job_id | No | String | ID of the source training job. If the model is generated from a training job, set this parameter for source tracing. If the model is imported from a third-party meta model, leave it blank. It is left blank by default. Non-template parameter. If model_type is set to Custom, this parameter is invalid.
model_type | Yes | String | Model type. The value can be TensorFlow, Image, PyTorch, Template, MindSpore, or Custom, and is read from the configuration file. Common parameter.
output_params | No | Array of CreateModelRequestInferParams objects | Collection of output parameters of a model. By default, this parameter is left blank. If the parameters are read from apis in the configuration file, provide only the initial_config field and leave this field blank. Non-template parameter. If model_type is set to Custom, this parameter is invalid.
description | No | String | Model description, consisting of 1 to 100 characters. The special characters &!'"<>= are not allowed. Common parameter.
runtime | No | String | Model runtime environment. Its possible values depend on model_type. For details about runtime options, see Managing AI Applications > Creating an AI Application > Importing a Meta Model from OBS in the ModelArts User Guide.
model_metrics | No | String | Model precision. If the value is read from the configuration file, this parameter can be left blank. Non-template parameter.
source_type | No | String | Model source type. Currently, the value can only be auto, which identifies models deployed through ExeML (the model download function is not provided). This parameter is not required for models deployed through training jobs or other methods and is left blank by default. Non-template parameter. If model_type is set to Custom, this parameter is invalid.
dependencies | No | Array of ModelDependencies objects | Packages required by the inference code and model. By default, this parameter is left blank. If the packages are read from the configuration file, this parameter can be left blank. Non-template parameter. If model_type is set to Custom, this parameter is invalid.
workspace_id | No | String | Workspace ID, which defaults to 0. Common parameter.
model_algorithm | No | String | Model algorithm. If the algorithm is read from the configuration file, this parameter can be left blank. The value can be predict_analysis, object_detection, or image_classification. Non-template parameter.
apis | No | Array of CreateModelRequestModelApis objects | All API input and output parameters of the model. If the parameters are parsed from the configuration file, this parameter can be left blank. Non-template parameter. If model_type is set to Custom in an asynchronous request, this parameter is invalid.
model_name | Yes | String | Model name, which consists of 1 to 64 characters. Common parameter.
install_type | No | Array of strings | Deployment type. Only lowercase letters are supported. The value can be real-time or batch. Default value: [real-time, batch]
input_params | No | Array of CreateModelRequestInferParams objects | Collection of input parameters of a model. By default, this parameter is left blank. If the parameters are read from apis in the configuration file, provide only the initial_config field and leave this field blank. Non-template parameter.
cmd | No | String | Image startup command.
dynamic_load_mode | No | String | Dynamic loading mode. The default value is None, indicating that dynamic loading is not used. The value Single indicates that dynamic loading is used.
deployment_constraints | No | deployment_constraints object | Model deployment constraints.
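A hedged sketch of a request body that combines the mandatory parameters with a few common optional ones from the table above (all values are placeholders in the style of Example Requests):

{
  "model_name" : "mnist",
  "model_version" : "1.0.0",
  "model_type" : "TensorFlow",
  "source_location" : "https://models.obs.xxxxx.com/mnist",
  "description" : "mnist model",
  "workspace_id" : "0",
  "install_type" : [ "real-time" ]
}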
GuideDoc object parameters:

Parameter | Mandatory | Type | Description
---|---|---|---
doc_url | Yes | String | HTTP(S) link to the document.
doc_name | Yes | String | Document name, which must start with a letter.
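A hedged sketch of a model_docs entry (the document name and URL are illustrative, not taken from this reference):

"model_docs" : [ {
  "doc_name" : "ReadMe",
  "doc_url" : "https://example.com/mnist-readme"
} ]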
template object parameters:

Parameter | Mandatory | Type | Description
---|---|---|---
infer_format | No | String | ID of the input and output mode. When this parameter is used, the input and output mode built in the template does not take effect.
template_inputs | Yes | Array of CreateModelRequestTemplateInput objects | Template input configuration, specifying the source path for configuring a model.
template_id | Yes | String | ID of the used template. The template has a built-in input and output mode.
CreateModelRequestTemplateInput object parameters:

Parameter | Mandatory | Type | Description
---|---|---|---
input | Yes | String | Template input path, which can be the path to an OBS file or directory. When you use a template with multiple input items to create a model and the target paths (input_properties) specified in the template are the same, the OBS directory or OBS file name entered here must be unique to prevent files from being overwritten.
input_id | Yes | String | Input item ID, which is obtained from the template details.
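A hedged sketch of a template-based import request body: {template_id} and {input_id} are placeholders obtained from the template details, and the OBS path is illustrative.

POST https://{endpoint}/v1/{project_id}/models

{
  "model_name" : "mnist",
  "model_version" : "1.0.0",
  "model_type" : "Template",
  "template" : {
    "template_id" : "{template_id}",
    "template_inputs" : [ {
      "input_id" : "{input_id}",
      "input" : "https://models.obs.xxxxx.com/mnist"
    } ]
  }
}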
ModelDependencies object parameters:

Parameter | Mandatory | Type | Description
---|---|---|---
installer | Yes | String | Installation mode. Only pip is supported.
packages | Yes | Array of Packages objects | Collection of dependency packages.
Packages object parameters:

Parameter | Mandatory | Type | Description
---|---|---|---
package_version | No | String | Version of a dependency package. If this parameter is left blank, the latest version is installed by default.
package_name | Yes | String | Name of a dependency package. Ensure that the package name is correct and available.
restraint | No | String | Version restriction, which can be EXACT, ATLEAST, or ATMOST. This parameter is mandatory only when package_version is set.
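A hedged sketch of a dependencies entry (package names and versions are illustrative). When package_version is omitted, restraint is not set and the latest version is installed:

"dependencies" : [ {
  "installer" : "pip",
  "packages" : [
    { "package_name" : "numpy", "package_version" : "1.17.0", "restraint" : "ATLEAST" },
    { "package_name" : "Pillow" }
  ]
} ]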
CreateModelRequestModelApis object parameters:

Parameter | Mandatory | Type | Description
---|---|---|---
protocol | No | String | Request protocol. The options are HTTP and HTTPS.
method | No | String | Request method, which can be post or get.
input_params | No | ModelInOutputParams object | API input parameters, described in JSON Schema format.
output_params | No | ModelInOutputParams object | API output parameters, described in JSON Schema format.
url | No | String | Inference request URL.
ModelInOutputParams object parameters:

Parameter | Mandatory | Type | Description
---|---|---|---
type | No | String | Type in JSON Schema, which can be object.
properties | No | Object | Properties of an object element in JSON Schema. You can configure parameters, including the parameter name and type.
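A hedged sketch of an apis entry using the JSON Schema style described above; the same structure appears in the full request under Example Requests:

"apis" : [ {
  "url" : "/v1/xxx/image",
  "protocol" : "http",
  "method" : "post",
  "input_params" : {
    "type" : "object",
    "properties" : { "image_url" : { "type" : "string" } }
  },
  "output_params" : {
    "type" : "object",
    "properties" : { "face_location" : { "type" : "box" } }
  }
} ]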
CreateModelRequestInferParams object parameters:

Parameter | Mandatory | Type | Description
---|---|---|---
protocol | Yes | String | Request protocol. The options are HTTP and HTTPS.
min | No | Number | Minimum value of the parameter. This parameter is optional when param_type is set to int or float. By default, it is left blank.
method | Yes | String | Request method, which can be post or get.
max | No | Number | Maximum value of the parameter. This parameter is optional when param_type is set to int or float. By default, it is left blank.
param_desc | No | String | Parameter description. It is recommended that the description contain a maximum of 100 characters. By default, this parameter is left blank.
param_name | Yes | String | Parameter name. It is recommended that the name contain a maximum of 64 characters.
url | Yes | String | API URL.
param_type | Yes | String | Parameter type, which can be int, string, float, timestamp, date, or file.
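A hedged sketch of an input_params entry for a numeric parameter, showing the optional min and max fields (the parameter name, range, and URL are illustrative):

"input_params" : [ {
  "url" : "/v1/xxx/image",
  "protocol" : "http",
  "method" : "post",
  "param_name" : "top_k",
  "param_type" : "int",
  "min" : 1,
  "max" : 10,
  "param_desc" : "Number of results to return"
} ]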
Health check parameters:

Parameter | Mandatory | Type | Description
---|---|---|---
check_method | Yes | String | Health check method.
protocol | No | String | Protocol of the health check API. The default value is https.
url | Yes | String | Health check URL.
period_seconds | Yes | String | Health check period.
initial_delay_seconds | No | String | Delay before the health check is initiated.
failure_threshold | Yes | String | Maximum number of health check failures.
timeout_seconds | No | String | Health check timeout.
command | No | String | Commands, which are strings separated by spaces.
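A hedged sketch of a health check configuration assembled from the fields above. The enclosing field name health is inferred from the initial_config description, and the check_method value and URL are assumptions; all values are placeholders.

"health" : {
  "check_method" : "HTTP",
  "protocol" : "https",
  "url" : "/health",
  "initial_delay_seconds" : "5",
  "period_seconds" : "10",
  "timeout_seconds" : "2",
  "failure_threshold" : "3"
}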
deployment_constraints object parameters:

Parameter | Mandatory | Type | Description
---|---|---|---
request_mode | Yes | String | Request mode. The value can be async or sync.
cpu_type | Yes | String | CPU type. The value can be aarch64 or x86_64.
accelerators | Yes | Array of accelerators objects | Accelerator card type. Example: [{"type": "GPU","name": "a800"}]
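A hedged sketch of a deployment_constraints object, reusing the accelerator example from the table above (the chosen request_mode and cpu_type values are only one of the allowed combinations):

"deployment_constraints" : {
  "request_mode" : "sync",
  "cpu_type" : "x86_64",
  "accelerators" : [ { "type" : "GPU", "name" : "a800" } ]
}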
Response Parameters
Status code: 200
Parameter | Type | Description
---|---|---
model_id | String | Model ID
Example Requests
The following example creates an AI application named mnist, with version 1.0.0, type TensorFlow, and a model file stored in an OBS bucket.
POST https://{endpoint}/v1/{project_id}/models

{
  "model_name" : "mnist",
  "model_version" : "1.0.0",
  "source_location" : "https://models.obs.xxxxx.com/mnist",
  "source_job_id" : "55",
  "source_job_version" : "V100",
  "model_type" : "TensorFlow",
  "runtime" : "python2.7",
  "description" : "mnist model",
  "execution_code" : "https://testmodel.obs.xxxxx.com/customize_service.py",
  "input_params" : [ {
    "url" : "/v1/xxx/image",
    "protocol" : "http",
    "method" : "post",
    "param_name" : "image_url",
    "param_type" : "string",
    "min" : 0,
    "max" : 9,
    "param_desc" : "http://test/test.jpeg"
  } ],
  "output_params" : [ {
    "url" : "/v1/xxx/image",
    "protocol" : "http",
    "method" : "post",
    "param_name" : "face_location",
    "param_type" : "box",
    "param_desc" : "face_location param value description"
  } ],
  "dependencies" : [ {
    "installer" : "pip",
    "packages" : [ {
      "package_name" : "numpy",
      "package_version" : "1.5.0",
      "restraint" : "ATLEAST"
    } ]
  } ],
  "model_algorithm" : "object_detection",
  "model_metrics" : "{\"f1\":0.52381,\"recall\":0.666667,\"precision\":0.466667,\"accuracy\":0.625}",
  "apis" : [ {
    "url" : "/v1/xxx/image",
    "protocol" : "http",
    "method" : "post",
    "input_params" : {
      "type" : "object",
      "properties" : {
        "image_url" : { "type" : "string" }
      }
    },
    "output_params" : {
      "type" : "object",
      "properties" : {
        "face_location" : { "type" : "box" }
      }
    }
  } ]
}
Example Responses
Status code: 200
The model is created.
{ "model_id" : "7feb7235-ed9c-48ae-9833-2876b2458445" }
Status Codes
Status Code | Description
---|---
200 | The model is created.
Error Codes
See Error Codes.