Engine::Init
Initializes an engine instance with the specified configuration. This API is defined in engine.h.

The Init API is optional. You can decide whether to override and implement it based on the actual service requirements.
Syntax
HIAI_StatusT Engine::Init(const AIConfig &config, const vector<AIModelDescription> &modelDesc)
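For reference, a minimal sketch of how a user-defined engine might declare this override is shown below. The class name and the ai_model_manager_ member match the implementation example later on this page; the header paths and the omitted members are assumptions about a typical project setup, not part of this API definition.
// Minimal sketch (assumed project setup): a user engine class that overrides Engine::Init.
// Header paths follow common HiAI Engine samples and may differ in your DDK version.
#include <memory>
#include <vector>
#include "hiaiengine/engine.h"            // assumed location of engine.h
#include "hiaiengine/ai_model_manager.h"  // assumed location of the model manager header

class FrameworkerEngine : public hiai::Engine {
public:
    // Override of Engine::Init; called by the framework before Process.
    HIAI_StatusT Init(const hiai::AIConfig& config,
                      const std::vector<hiai::AIModelDescription>& model_desc) override;
    // ... Process declaration and port definitions omitted ...
private:
    // Model manager used in the Init example below to load the offline model.
    std::shared_ptr<hiai::AIModelManager> ai_model_manager_{nullptr};
};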
Parameter Description
| Parameter | Description | Value Range |
| --- | --- | --- |
| config | Engine configuration items | - |
| modelDesc | Model description | - |
Return Value
The returned error codes are registered by the user in advance. For details about the registration method, see Error Code Registration.
Error Codes
The error codes of this API are registered by the user.
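As an illustration only, the user-defined error code referenced in the example below (HIAI_AI_MODEL_MANAGER_INIT_FAIL) must be registered in advance. A minimal sketch, assuming the HIAI_DEF_ERROR_CODE macro described in Error Code Registration and a user-chosen module ID, might look as follows; the header path, module ID, and severity constant are assumptions, so verify them against Error Code Registration.
// Sketch only: see Error Code Registration for the authoritative macro arguments.
// The header path and module ID below are assumptions, not values from this page.
#include "hiaiengine/status.h"

#define USER_DEFINED_MODULE_ID 0x6001  // hypothetical module ID chosen by the user

HIAI_DEF_ERROR_CODE(USER_DEFINED_MODULE_ID, HIAI_ERROR, HIAI_AI_MODEL_MANAGER_INIT_FAIL,
                    "failed to initialize the AI model manager");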
Example of Override and Implementation
The following example initializes the inference engine. During initialization, the model is loaded through the model manager API (AIModelManager), as shown in AIModelManager::Init.
HIAI_StatusT FrameworkerEngine::Init(const hiai::AIConfig& config,
                                     const std::vector<hiai::AIModelDescription>& model_desc)
{
    hiai::AIStatus ret = hiai::SUCCESS;
    // init ai_model_manager_
    if (nullptr == ai_model_manager_) {
        ai_model_manager_ = std::make_shared<hiai::AIModelManager>();
    }
    std::cout << "FrameworkerEngine Init" << std::endl;
    HIAI_ENGINE_LOG("FrameworkerEngine Init");
    for (int index = 0; index < config.items_size(); ++index) {
        const ::hiai::AIConfigItem& item = config.items(index);
        // loading model
        if (item.name() == "model_path") {
            const char* model_path = item.value().data();
            std::vector<hiai::AIModelDescription> model_desc_vec;
            hiai::AIModelDescription model_desc_;
            model_desc_.set_path(model_path);
            model_desc_vec.push_back(model_desc_);
            ret = ai_model_manager_->Init(config, model_desc_vec);
            if (hiai::SUCCESS != ret) {
                HIAI_ENGINE_LOG(this, HIAI_AI_MODEL_MANAGER_INIT_FAIL,
                                "[DEBUG] fail to init ai_model");
                return HIAI_AI_MODEL_MANAGER_INIT_FAIL;
            }
        }
    }
    HIAI_ENGINE_LOG("FrameworkerEngine Init success");
    return HIAI_OK;
}
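Note that this implementation expects a configuration item named model_path. In typical setups, this item is supplied through the engine's ai_config section of the graph configuration file, from which the framework builds the AIConfig object passed to Init; adjust the item name and model path to match your own configuration.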