Updated on 2022-07-04 GMT+08:00

Spark Job Management

Spark job management provides the following functions: re-executing a job, searching for a job, terminating a job, and exporting logs.

In addition, you can click Quick Links to open the related topics in the User Guide.

Job Management Page

On the Overview page, click Spark Jobs to go to the Spark job management page. Alternatively, choose Job Management > Spark Jobs. The Spark Jobs page displays all Spark jobs. If there are a large number of jobs, they are displayed on multiple pages. DLI allows you to view jobs in all statuses.

After a job is executed successfully, its job record is retained for only 6 hours.

Table 1 Job management parameters

Job ID: ID of a submitted Spark job, generated by the system by default.

Job Name: Name of a submitted Spark job.

Executed By: Name of the user who executed the Spark job.

Job Status: Status of the job. The following values are available:
  • Starting: The job is being started.
  • Running: The job is being executed.
  • Failed: The job failed and its session has exited.
  • Finished: The job was executed successfully.
  • Restoring: The job is being restored.

Queue: Queue where the submitted Spark job runs.

Created: Time when the job was created. Jobs can be sorted in ascending or descending order of creation time.

Last Modified: Time when the job was completed.

Operation: The following actions are available:
  • Edit: Modify the current job configuration and re-execute the job.
  • SparkUI: Click this button to open the Spark job execution page.
    NOTE: The SparkUI page cannot be viewed for jobs in the Starting state.
  • Terminate Job: Cancel the current job.
  • Archive Log: Save job logs to the temporary bucket created by DLI.
  • Export Log: Export logs to a created OBS bucket.
    NOTE:
    • You must have the permission to create OBS buckets.
    • Logs cannot be exported while the job is in the Running state.
  • Commit Log: View the logs of submitted jobs.
  • Driver Log: View the logs of running jobs.
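The state-dependent restrictions noted in the table (no SparkUI while a job is Starting, no log export while it is Running) can be summarized as a simple availability check. The sketch below is purely illustrative: the function and its logic are not part of DLI; only the operation names and the two restrictions come from the table above.

```python
# Illustrative sketch of the state-dependent operation rules in the table above.
# This is NOT a DLI API; only the operation names and the two NOTE restrictions
# are taken from the documentation.

def available_operations(status: str) -> set:
    """Return the Operation-column actions usable for a job in `status`."""
    ops = {"Edit", "SparkUI", "Terminate Job", "Archive Log",
           "Export Log", "Commit Log", "Driver Log"}
    if status == "Starting":
        ops.discard("SparkUI")     # SparkUI cannot be viewed while starting
    if status == "Running":
        ops.discard("Export Log")  # logs cannot be exported while running
    return ops
```

For example, `available_operations("Running")` excludes Export Log but still includes SparkUI and Terminate Job.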

Re-executing a Job

On the Spark Jobs page, click Edit in the Operation column of the job. On the Spark job creation page that is displayed, modify parameters as required and execute the job.

Searching for a Job

On the Spark Jobs page, filter jobs by Job Status or Queue. The jobs that meet the filter criteria are displayed in the job list.
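The filter behavior can be illustrated with plain Python over a local list of job records. The record fields (`job_id`, `status`, `queue`) are assumptions made for illustration, not the DLI data model; the page itself performs this filtering for you.

```python
# Hypothetical job records; the field names are illustrative only,
# not an actual DLI data model.
jobs = [
    {"job_id": "1", "status": "Running",  "queue": "default"},
    {"job_id": "2", "status": "Finished", "queue": "default"},
    {"job_id": "3", "status": "Failed",   "queue": "queue_a"},
]

def search_jobs(records, status=None, queue=None):
    """Mimic the page's filter: keep jobs matching every condition that is set."""
    return [
        j for j in records
        if (status is None or j["status"] == status)
        and (queue is None or j["queue"] == queue)
    ]
```

For example, `search_jobs(jobs, status="Finished")` keeps only the second record, and `search_jobs(jobs, queue="default")` keeps the first two.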

Terminating a Job

On the Spark Jobs page, choose More > Terminate Job in the Operation column of the job to stop the job.

Exporting Logs

On the Spark Jobs page, choose More > Export Log in the Operation column of the corresponding job. In the dialog box that is displayed, enter the path of the created OBS bucket and click OK.
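Because the export dialog expects the path of an existing OBS bucket, it can help to sanity-check the value before submitting it. The sketch below is a minimal, assumption-laden check: the length and character constraints reflect commonly documented OBS bucket naming rules (3–63 characters; lowercase letters, digits, hyphens, and periods; alphanumeric first and last characters) and should be verified against the OBS documentation.

```python
import re

# Minimal sanity check for an export path such as "my-log-bucket/spark-logs/".
# The naming constraints below are assumptions based on commonly documented
# OBS bucket naming rules; consult the OBS documentation for the
# authoritative rules before relying on them.

def looks_like_valid_export_path(path: str) -> bool:
    """Return True if the leading bucket segment of `path` looks well-formed."""
    bucket, _, _prefix = path.partition("/")
    return (
        3 <= len(bucket) <= 63
        and re.fullmatch(r"[a-z0-9][a-z0-9.-]*[a-z0-9]", bucket) is not None
    )
```

This only validates the bucket-name syntax; whether the bucket actually exists and is writable is checked by the service when you click OK.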