AIModelManager::IsPreAllocateOutputMem
Determines whether the output memory can be pre-allocated.
Related function:
AIStatus AIModelManager::CreateOutputTensor(const std::vector<std::shared_ptr<IAITensor>> &in_data, std::vector<std::shared_ptr<IAITensor>> &out_data);
Syntax
virtual bool AIModelManager::IsPreAllocateOutputMem() override;
Parameter Description
None
Return Value
- True: Output memory can be pre-allocated. You can call CreateOutputTensor to allocate the output memory.
- False: Output memory cannot be pre-allocated. Do not call CreateOutputTensor.
True is returned only after the model manager has loaded a model.