Optimization of Data Backhaul
After inference is complete, the inference result or an inference completion signal must be sent back to the host. If the inference engine itself calls SendData to return the data, the call consumes the inference engine's processing time. It is therefore recommended that a dedicated engine (for example, DataOptEngine) be deployed on the device to return data: after inference is complete, the inference engine transparently transmits the processed data to DataOptEngine, which forwards it to an engine on the host side (for example, DstEngine).
// DataOptEngine on the device side sends data to the host side.
HIAI_IMPL_ENGINE_PROCESS("DataOptEngine", DataOptEngine, 1)
{
    HIAI_StatusT hiaiRet = HIAI_OK;
    if (arg0 == nullptr) {
        HIAI_ENGINE_LOG(HIAI_INVALID_INPUT_MSG, "get inference result timeout");
        return HIAI_INVALID_INPUT_MSG;
    }
    // Forward the inference result received from the inference engine to the host.
    hiaiRet = SendData(0, "EngineTransNewT", arg0);
    if (hiaiRet != HIAI_OK) {
        HIAI_ENGINE_LOG(hiaiRet, "DataOptEngine SendData to host failed");
        return hiaiRet;
    }
    return HIAI_OK;
}

// DstEngine on the host side receives the data forwarded by DataOptEngine.
HIAI_IMPL_ENGINE_PROCESS("DstEngine", DstEngine, 1)
{
    HIAI_StatusT ret = HIAI_OK;
    if (nullptr != arg0) {
        printf("DstEngine has received data\n");
        std::shared_ptr<std::string> result = std::static_pointer_cast<std::string>(arg0);
        ret = SendData(0, "string", result);
        if (HIAI_OK != ret) {
            HIAI_ENGINE_LOG(ret, "DstEngine SendData to recv failed");
            return ret;
        }
    } else {
        HIAI_ENGINE_LOG(HIAI_INVALID_INPUT_MSG, "DstEngine failed to receive data");
        printf("DstEngine did not receive data: arg0 is null\n");
        return HIAI_INVALID_INPUT_MSG;
    }
    return HIAI_OK;
}
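The device-side hand-off from the inference engine to DataOptEngine is not shown above. The following is a minimal sketch of that step, reusing the SendData and logging calls from the example; the engine name InferenceEngine and the comment placeholders are illustrative assumptions, not part of the original sample.

// Sketch only: hypothetical inference engine that transparently transmits its
// result to the next device-side node (DataOptEngine) instead of sending it to
// the host itself, so the backhaul does not consume inference time.
HIAI_IMPL_ENGINE_PROCESS("InferenceEngine", InferenceEngine, 1)
{
    if (arg0 == nullptr) {
        HIAI_ENGINE_LOG(HIAI_INVALID_INPUT_MSG, "InferenceEngine received no input");
        return HIAI_INVALID_INPUT_MSG;
    }
    // ... run inference on arg0 and prepare the processed data (elided) ...
    HIAI_StatusT ret = SendData(0, "EngineTransNewT", arg0);  // hand off to DataOptEngine
    if (ret != HIAI_OK) {
        HIAI_ENGINE_LOG(ret, "InferenceEngine SendData to DataOptEngine failed");
        return ret;
    }
    return HIAI_OK;
}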