Model Package Structure
When creating an AI application, ensure that any meta model imported from OBS complies with the following specifications.
- The model package specifications apply when you import a single model. To import multiple models (for example, a package containing multiple model files), use a custom image instead.
- If you want to use an AI engine that is not supported by ModelArts, use a custom image.
- For details about how to create a custom image, see Specifications for Custom Images and Creating a Custom Image on ECS.
- For more examples of custom scripts, see Examples of Custom Scripts.
The model package must contain the model directory. The model directory stores the model file, model configuration file, and model inference code file.
- Model files: The requirements for model files vary according to the model package structure. For details, see Model Package Example.
- Model configuration file: This file is mandatory, and its name must be config.json. Only one model configuration file is allowed. For details about how to edit a model configuration file, see Specifications for Editing a Model Configuration File.
- Model inference code file: This file is mandatory, and its name must be customize_service.py. Only one model inference code file is allowed. For details about how to edit model inference code, see Specifications for Writing a Model Inference Code File.
- Any .py files that customize_service.py depends on can be stored directly in the model directory. Use relative imports to reference these custom packages.
- Other files that customize_service.py depends on can also be stored in the model directory, but they must be accessed through absolute paths. For details, see Obtaining an Absolute Path.
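The two dependency rules above can be sketched as follows. This is a minimal illustration inside a hypothetical customize_service.py, assuming a custom module my_utils.py and a data file label_map.txt stored alongside it; these file names are illustrative, not part of the specification.

```python
import os

# Hypothetical layout (illustrative names only):
#   model/
#   ├── customize_service.py   <- this file
#   ├── my_utils.py            <- custom .py dependency: use a relative import
#   └── label_map.txt          <- other dependency: access via an absolute path

# Relative import of a custom .py file in the same directory, e.g.:
# from .my_utils import preprocess

# Absolute path to a sibling data file, resolved from this file's own location:
BASE_DIR = os.path.dirname(os.path.realpath(__file__))
label_path = os.path.join(BASE_DIR, "label_map.txt")
```

Resolving the path from `__file__` rather than the working directory matters because the service process is not started from the model directory.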
ModelArts provides samples and sample code for multiple engines. You can edit your configuration files and inference code by referring to ModelArts Samples. ModelArts also provides custom script examples of common AI engines. For details, see Examples of Custom Scripts.
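As a rough illustration of what a model configuration file can look like, the sketch below shows a minimal config.json for a hypothetical TensorFlow image-classification model. The exact fields and allowed values are defined in Specifications for Editing a Model Configuration File; treat this as an assumption-laden example, not an authoritative template.

```json
{
  "model_algorithm": "image_classification",
  "model_type": "TensorFlow",
  "metrics": {
    "f1": 0.91,
    "accuracy": 0.93,
    "precision": 0.92,
    "recall": 0.90
  },
  "apis": [
    {
      "protocol": "http",
      "url": "/",
      "method": "post",
      "request": {
        "Content-type": "multipart/form-data",
        "data": {
          "type": "object",
          "properties": {
            "images": { "type": "file" }
          }
        }
      },
      "response": {
        "Content-type": "application/json",
        "data": {
          "type": "object",
          "properties": {
            "predicted_label": { "type": "string" },
            "scores": { "type": "array" }
          }
        }
      }
    }
  ]
}
```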
If you encounter any problem when importing a meta model, contact Huawei Cloud technical support.
Model Package Example
- Structure of the TensorFlow-based model package
When publishing the model, you only need to specify the ocr directory.
```
OBS bucket or directory name
|── ocr
|   ├── model                          (Mandatory) Fixed subdirectory name, used to store model-related files
|   │   ├── <<Custom Python package>>  (Optional) Your Python package, which can be referenced directly in the model inference code
|   │   ├── saved_model.pb             (Mandatory) Protocol buffer file, which contains the graph description of the model
|   │   ├── variables                  Fixed subdirectory name, which contains the weights and biases of the model. It is mandatory for the main file of a *.pb model.
|   │   │   ├── variables.index                  Mandatory
|   │   │   ├── variables.data-00000-of-00001    Mandatory
|   │   ├── config.json                (Mandatory) Model configuration file. The file name is fixed to config.json. Only one model configuration file is allowed.
|   │   ├── customize_service.py       (Mandatory) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file is allowed. The files on which customize_service.py depends can be stored directly in the model directory.
```
- Structure of the PyTorch-based model package
When publishing the model, you only need to specify the resnet directory.
```
OBS bucket or directory name
|── resnet
|   ├── model                          (Mandatory) Fixed subdirectory name, used to store model-related files
|   │   ├── <<Custom Python package>>  (Optional) Your Python package, which can be referenced directly in the model inference code
|   │   ├── resnet50.pth               (Mandatory) PyTorch model file, which contains variable and weight information and is saved as a state_dict
|   │   ├── config.json                (Mandatory) Model configuration file. The file name is fixed to config.json. Only one model configuration file is allowed.
|   │   ├── customize_service.py       (Mandatory) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file is allowed. The files on which customize_service.py depends can be stored directly in the model directory.
```
- The structure of a custom model package depends on the AI engine in your custom image. For example, if the AI engine in your custom image is TensorFlow, the model package uses the TensorFlow structure.