Updated on 2025-06-16 GMT+08:00

Viewing All Metrics on AOM

The cloud service platform reports metrics to AOM (Application Operations Management) so that you can query the resource usage of inference instances.

Prerequisites

  • You have a valid Huawei Cloud account.
  • You have at least one workspace.
  • You have at least one inference instance.

Procedure

  1. Log in to the AOM console.
  2. Choose Metric Browsing and set Metric Sources to Prometheus_AOM_Default.

  3. On the All metrics tab, enter a metric name to query it. Table 1 describes the supported monitoring metrics.

    Table 1 Monitoring metrics

    Metric      Description
    --------    -----------------------------------------------------------
    mu_usage    Actual MU usage of the current inference instance.
                Unit: count
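
For readers who query Prometheus-compatible metric sources programmatically rather than through the console, the lookup in step 3 can be sketched as below. This is a minimal illustration only: the endpoint URL is a placeholder, real requests require your project-specific AOM endpoint and authentication, and only the metric name mu_usage comes from Table 1.

```python
from urllib.parse import urlencode

# Placeholder endpoint in the standard Prometheus HTTP API shape;
# substitute your account's actual Prometheus-compatible query URL.
BASE_URL = "https://example.com/api/v1/query"

def build_metric_query(metric_name: str) -> str:
    """Build an instant-query URL for a metric such as mu_usage."""
    return f"{BASE_URL}?{urlencode({'query': metric_name})}"

# Query the MU usage metric listed in Table 1.
url = build_metric_query("mu_usage")
print(url)  # https://example.com/api/v1/query?query=mu_usage
```

Sending a GET request to such a URL (with valid credentials) would return the current value of the metric in the standard Prometheus JSON response format.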