What Is the Mapping Between a Training or Inference Unit and Computing Power?
Updated on 2025-11-04 GMT+08:00
When a training or deployment task is created, it consumes training or inference units. The mapping between one unit and computing power is as follows (see the example after this list):
- The computing power of one training unit is 313 TFLOPS.
- The computing power of one inference unit is 313 TFLOPS.
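For illustration, the total computing power of a task is the number of units it consumes multiplied by 313 TFLOPS. The following is a minimal sketch of that arithmetic; the unit count of 4 is a hypothetical value, not taken from this page.

```python
# Minimal sketch: convert a consumed unit count into aggregate computing power.
TFLOPS_PER_UNIT = 313  # applies to both training units and inference units

def total_tflops(units: int) -> int:
    """Return the aggregate computing power, in TFLOPS, for a given number of units."""
    return units * TFLOPS_PER_UNIT

# Hypothetical example: a task consuming 4 units corresponds to 4 x 313 = 1252 TFLOPS.
print(total_tflops(4))  # 1252
```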