Updated on 2025-06-16 GMT+08:00
Viewing All Metrics on AOM
The cloud service platform reports inference instance resource usage metrics to AOM (Application Operations Management) so that you can query them there.
Prerequisites
- You have a valid Huawei Cloud account.
- You have at least one workspace.
- You have at least one available inference instance.
Procedure
- Log in to the AOM console.
- Choose Metric Browsing and set Metric Sources to Prometheus_AOM_Default.
- On the All metrics tab, enter the metric name (for example, mu_usage) to query it. To run the same query programmatically, see the sketch after this procedure.
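Because the metric source is Prometheus-compatible, the same metric can typically also be queried over the standard Prometheus HTTP API. The following is a minimal sketch, not an official procedure: the endpoint URL and the X-Auth-Token header are placeholder assumptions that you would replace with the query URL of your AOM Prometheus instance and a valid IAM token.

```python
import requests

# Placeholder endpoint and credential: replace with your AOM Prometheus
# instance's query URL and a valid IAM token (both are assumptions here).
PROM_URL = "https://<aom-prometheus-endpoint>/api/v1/query"
HEADERS = {"X-Auth-Token": "<your-iam-token>"}


def query_metric(promql: str) -> list:
    """Run an instant PromQL query and return the result vector."""
    resp = requests.get(PROM_URL, params={"query": promql},
                        headers=HEADERS, timeout=10)
    resp.raise_for_status()
    payload = resp.json()
    if payload.get("status") != "success":
        raise RuntimeError(f"query failed: {payload}")
    return payload["data"]["result"]


if __name__ == "__main__":
    # mu_usage is the metric described in Table 1 below.
    for series in query_metric("mu_usage"):
        ts, value = series["value"]  # each value is [unix_timestamp, "value"]
        print(series["metric"], value)
```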
Table 1 Monitoring metrics

| Metric | Description | Unit |
| --- | --- | --- |
| mu_usage | Actual MU usage of the current inference instance. | Number |
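Since the source is Prometheus-compatible, standard PromQL functions should apply to mu_usage as well. As a hedged illustration, reusing the query_metric helper sketched above and assuming the usual avg_over_time semantics, a 5-minute average per series could be fetched like this:

```python
# Average MU usage per series over the last 5 minutes
# (assumes query_metric from the earlier sketch).
for series in query_metric("avg_over_time(mu_usage[5m])"):
    print(series["metric"], series["value"][1])
```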
Parent topic: Large Model Inference Scenarios