
How Do I View the Resource Usage of DLI Spark Jobs?

Viewing the Configuration of a Spark Job

Log in to the DLI console. In the navigation pane, choose Job Management > Spark Jobs. In the job list, locate the target job and click the expand arrow next to the job ID to view the parameters of the job.

These parameters are displayed only if Advanced Settings were configured when the Spark job was created.

Figure 1 Viewing the configuration of a Spark job
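If you also want to confirm which resource settings actually take effect inside the job, rather than only on the console, you can print the standard Spark resource properties at runtime. The following is a minimal PySpark sketch; the property names are standard Spark settings, and which of them appear on the console depends on what was configured in Advanced Settings.

    # Minimal sketch: print the resource-related Spark properties that take
    # effect for the current job. The property names below are standard Spark
    # settings; values not set explicitly fall back to cluster defaults.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("check_resource_conf").getOrCreate()
    conf = spark.sparkContext.getConf()

    for key in (
        "spark.driver.memory",
        "spark.driver.cores",
        "spark.executor.memory",
        "spark.executor.cores",
        "spark.executor.instances",
    ):
        # get(key, default) returns the effective value or the given default
        print(key, "=", conf.get(key, "<not set>"))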

Viewing Real-Time Resource Usage of a Spark Job

Perform the following operations to view the number of running CUs occupied by a Spark job in real time:

  1. Log in to the DLI console. In the navigation pane, choose Job Management > Spark Jobs. In the job list, locate the target job and click SparkUI in the Operation column.
  2. On the Spark UI page, view the real-time running resources of the Spark job.
    Figure 2 SparkUI
  3. On the Spark UI page, view the original configuration of the Spark job (available only for newly created clusters).

    On the Spark UI page, click Environment to view Driver and Executor information.

    Figure 3 Driver information
    Figure 4 Executor information
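The driver and executor details shown on the Environment and Executors pages of the Spark UI are also exposed by Spark's monitoring REST API, which can be convenient for scripted checks. The sketch below assumes the Spark UI address (shown here as a placeholder) is reachable from where the script runs; on DLI the UI may be proxied, so the exact URL and any authentication depend on your environment. On DLI, one CU generally corresponds to 1 vCPU and 4 GB of memory, so the real-time CU usage roughly tracks the total cores and memory held by the driver and executors.

    # Sketch: query Spark's monitoring REST API for executor details.
    # BASE_URL is a placeholder; replace it with the actual Spark UI address.
    import requests

    BASE_URL = "http://<spark-ui-host>:4040"

    # List running applications and take the first (usually the only) one.
    apps = requests.get(f"{BASE_URL}/api/v1/applications", timeout=10).json()
    app_id = apps[0]["id"]

    # The executors endpoint also includes the driver as a pseudo-executor.
    executors = requests.get(
        f"{BASE_URL}/api/v1/applications/{app_id}/executors", timeout=10
    ).json()

    for ex in executors:
        print(ex["id"], ex["hostPort"],
              "cores:", ex["totalCores"],
              "memory:", ex["maxMemory"])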
