
What Is the Mapping Between a Training or Inference Unit and Computing Power?

When you create a training task or a deployment task, each training or inference unit consumed maps to computing power as follows (a short calculation example follows the list):

  • The computing power of one training unit is 313 TFLOPS.
  • The computing power of one inference unit is 313 TFLOPS.
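
Because the mapping is a fixed one-to-one value, the total computing power of a task scales linearly with the number of units it consumes. The following Python sketch illustrates this arithmetic only; the constant and function names are illustrative assumptions and are not part of the service API.

```python
import math

TFLOPS_PER_UNIT = 313  # one training or inference unit provides 313 TFLOPS

def total_tflops(units: int) -> int:
    """Aggregate computing power for a given number of units."""
    return units * TFLOPS_PER_UNIT

def units_needed(target_tflops: float) -> int:
    """Minimum number of units whose combined power meets target_tflops."""
    return math.ceil(target_tflops / TFLOPS_PER_UNIT)

if __name__ == "__main__":
    print(total_tflops(8))      # 8 units  -> 2504 TFLOPS
    print(units_needed(1000))   # 1000 TFLOPS -> 4 units (4 x 313 = 1252 TFLOPS)
```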