Macro: HIAI_IMPL_ENGINE_PROCESS
Overload and implement this macro to define the engine's processing logic. The macro is defined in engine.h.
This macro encapsulates the following functions:
```cpp
static HIAIEngineFactory* GetInstance();
HIAI_StatusT HIAIEngineFactory::RegisterEngineCreator(const std::string& engine_name, HIAI_ENGINE_FUNCTOR_CREATOR engineCreatorFunc);
HIAI_StatusT HIAIEngineFactory::UnRegisterEngineCreator(const std::string& engine_name);
```
Related macro:
This macro is called after HIAI_DEFINE_PROCESS(inputPortNum, outputPortNum).
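For context, HIAI_DEFINE_PROCESS is typically placed in the engine class declaration to declare the Process function with the given port counts. A minimal sketch, following the common HiAI engine pattern; the class name and port-count constants are illustrative:

```cpp
#include "hiaiengine/engine.h"

// Illustrative engine class; FRAMEWORK_ENGINE_INPUT_SIZE and
// FRAMEWORK_ENGINE_OUTPUT_SIZE are assumed user-defined port counts.
class FrameworkerEngine : public hiai::Engine {
public:
    // Declares the Process function with the given input/output port counts.
    HIAI_DEFINE_PROCESS(FRAMEWORK_ENGINE_INPUT_SIZE, FRAMEWORK_ENGINE_OUTPUT_SIZE)
};
```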
Syntax
HIAI_IMPL_ENGINE_PROCESS(name, engineClass, inPortNum)
Parameter Description
Parameter | Description | Value Range
---|---|---
name | Engine name in the configuration | -
engineClass | Name of the engine implementation class | -
inPortNum | Number of input ports | -
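A minimal skeleton showing how the three parameters map onto an invocation; the engine name, class name, and port-count constant below are placeholders:

```cpp
// "MyEngine" must match the engine name in the graph configuration;
// MyEngine is the implementation class; MY_ENGINE_INPUT_SIZE is the
// number of input ports. All three names are illustrative.
HIAI_IMPL_ENGINE_PROCESS("MyEngine", MyEngine, MY_ENGINE_INPUT_SIZE)
{
    // arg0, arg1, ... carry the data received on input ports 0, 1, ...
    return HIAI_OK;
}
```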
Return Value
The returned error codes are registered by the user in advance.
Error Codes
The error codes of this API are registered by the user.
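As a hedged illustration of this registration, the sketch below assumes the HIAI_DEF_ERROR_CODE macro from the hiaiengine status mechanism; the module ID constant is a placeholder, and the code names mirror those used in the example that follows:

```cpp
// Assumption: HIAI_DEF_ERROR_CODE(moduleId, logLevel, codeName, codeDesc)
// registers a user-defined status code; USER_DEFINED_MODULE_ID is hypothetical.
HIAI_DEF_ERROR_CODE(USER_DEFINED_MODULE_ID, HIAI_ERROR, HIAI_INVALID_INPUT_MSG, "invalid input message");
HIAI_DEF_ERROR_CODE(USER_DEFINED_MODULE_ID, HIAI_ERROR, HIAI_AI_MODEL_MANAGER_PROCESS_FAIL, "fail to process ai model");
HIAI_DEF_ERROR_CODE(USER_DEFINED_MODULE_ID, HIAI_ERROR, HIAI_SEND_DATA_FAIL, "fail to send data");
```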
Example of Overload and Implementation
Define the implementation of the inference engine. During implementation, model inference is executed through the model manager API (AIModelManager), as shown in the call to AIModelManager::Process below.
HIAI_IMPL_ENGINE_PROCESS("FrameworkerEngine", FrameworkerEngine, FRAMEWORK_ENGINE_INPUT_SIZE) { hiai::AIStatus ret = hiai::SUCCESS; HIAI_StatusT hiai_ret = HIAI_OK; // receive data // arg0 indicates the input port (numbered 0) of the engine. If there are multiple input ports, you can use arg1 (numbered 1) and arg2 (numbered 2) to match the input ports. The data sent by the previous engine is obtained through the input port. std::shared_ptr<std::string> input_arg = std::static_pointer_cast<std::string>(arg0); if (nullptr == input_arg) { HIAI_ENGINE_LOG(this, HIAI_INVALID_INPUT_MSG, "[DEBUG] input arg is invalid"); return HIAI_INVALID_INPUT_MSG; } std::cout<<"FrameworkerEngine Process"<<std::endl; // prapare for calling the process of ai_model_manager_ std::vector<std::shared_ptr<hiai::IAITensor>> input_data_vec; uint32_t len = 75264; HIAI_ENGINE_LOG("FrameworkerEngine:Go to Process"); std::cout << "HIAIAippOp::Go to process" << std::endl; std::shared_ptr<hiai::AINeuralNetworkBuffer> neural_buffer = std::shared_ptr<hiai::AINeuralNetworkBuffer>(new hiai::AINeuralNetworkBuffer());//std::static_pointer_cast<hiai::AINeuralNetworkBuffer>(input_data); neural_buffer->SetBuffer((void*)(input_arg->c_str()), (uint32_t)(len)); std::shared_ptr<hiai::IAITensor> input_data = std::static_pointer_cast<hiai::IAITensor>(neural_buffer); input_data_vec.push_back(input_data); // call Process and inference hiai::AIContext ai_context; std::vector<std::shared_ptr<hiai::IAITensor>> output_data_vec; ret = ai_model_manager_->CreateOutputTensor(input_data_vec, output_data_vec); if (hiai::SUCCESS != ret) { HIAI_ENGINE_LOG(this, HIAI_AI_MODEL_MANAGER_PROCESS_FAIL, "[DEBUG] fail to process ai_model"); return HIAI_AI_MODEL_MANAGER_PROCESS_FAIL; } ret = ai_model_manager_->Process(ai_context, input_data_vec, output_data_vec, 0); if (hiai::SUCCESS != ret) { HIAI_ENGINE_LOG(this, HIAI_AI_MODEL_MANAGER_PROCESS_FAIL, "[DEBUG] fail to process ai_model"); return HIAI_AI_MODEL_MANAGER_PROCESS_FAIL; } std::cout<<"[DEBUG] output_data_vec size is "<< output_data_vec.size()<<std::endl; for (uint32_t index = 0; index < output_data_vec.size(); index++) { // send data of inference to destEngine std::shared_ptr<hiai::AINeuralNetworkBuffer> output_data = std::static_pointer_cast<hiai::AINeuralNetworkBuffer>(output_data_vec[index]); std::shared_ptr<std::string> output_string_ptr = std::shared_ptr<std::string>(new std::string((char*)output_data->GetBuffer(), output_data->GetSize())); hiai_ret = SendData(0, "string", std::static_pointer_cast<void>(output_string_ptr)); if (HIAI_OK != hiai_ret) { HIAI_ENGINE_LOG(this, HIAI_SEND_DATA_FAIL, "fail to send data"); return HIAI_SEND_DATA_FAIL; } } return HIAI_OK; }