
AIModelManager::IsPreAllocateOutputMem

Determines whether the output memory can be pre-allocated.

Related function:

AIStatus AIModelManager::CreateOutputTensor(const std::vector<std::shared_ptr<IAITensor>> &in_data, std::vector<std::shared_ptr<IAITensor>> &out_data);

Syntax

virtual bool AIModelManager::IsPreAllocateOutputMem() override;

Parameter Description

None

Return Value

  • true: You can call CreateOutputTensor to allocate the output memory in advance.
  • false: You cannot call CreateOutputTensor to allocate the output memory.

    true is returned only after the model manager has loaded a model.
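
Example

The typical calling pattern is to check IsPreAllocateOutputMem after the model is loaded and, if it returns true, let CreateOutputTensor allocate the output tensors. The following is a minimal sketch of that pattern, not official sample code: the hiai namespace, the header name, and the zero-as-success check are assumptions, while IsPreAllocateOutputMem and CreateOutputTensor are taken from this page.

#include <memory>
#include <vector>

#include "HiAiModelManagerService.h"  // assumption: DDK header providing AIModelManager/IAITensor

std::vector<std::shared_ptr<hiai::IAITensor>> PrepareOutputTensors(
    hiai::AIModelManager &modelManager,
    const std::vector<std::shared_ptr<hiai::IAITensor>> &inputTensors)
{
    std::vector<std::shared_ptr<hiai::IAITensor>> outputTensors;

    // Pre-allocation is possible only after the model manager has loaded a model.
    if (modelManager.IsPreAllocateOutputMem()) {
        hiai::AIStatus ret = modelManager.CreateOutputTensor(inputTensors, outputTensors);
        // Assumption: 0 means success; compare against the DDK's actual success code.
        if (ret != 0) {
            outputTensors.clear();  // allocation failed; the caller must allocate output memory itself
        }
    }
    // An empty vector means the caller is responsible for allocating the output memory.
    return outputTensors;
}

If IsPreAllocateOutputMem returns false, the caller prepares the output tensors itself instead of relying on CreateOutputTensor.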