Spark Job Management
Spark job management provides the following functions:

- Re-executing a job
- Searching for a job
- Terminating a job
- Exporting logs

In addition, you can click Quick Links to switch to the details in the User Guide.
Job Management Page
On the Overview page, click Spark Jobs to go to the Spark job management page. Alternatively, choose Job Management > Spark Jobs. The Spark Jobs page displays all Spark jobs. If there are many jobs, they are displayed on multiple pages. DLI allows you to view jobs in all statuses.
Note: After a job is executed successfully, its record is retained for only 6 hours.
| Parameter | Description |
|---|---|
| Job ID | ID of a submitted Spark job, generated by the system by default. |
| Job Name | Name of a submitted Spark job. |
| Executed By | Name of the user who executed the Spark job. |
| Job Status | Current status of the job. |
| Queue | Queue on which the submitted Spark job runs. |
| Created | Time when the job was created. Jobs can be sorted in ascending or descending order of creation time. |
| Last Modified | Time when the job was completed. |
| Operation | Operations that can be performed on the job, such as Edit, Terminate Job, and Export Log (described in the sections below). |
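Besides the console, Spark jobs can also be inspected programmatically. The sketch below is a minimal illustration, assuming a `GET /v2.0/{project_id}/batches` REST endpoint and a `sessions` field in the response body; verify both the host and the response schema against the DLI API reference for your region before use.

```python
# Minimal sketch: list Spark (batch) jobs via an assumed DLI REST endpoint.
# The host, path, and response fields are assumptions -- check the DLI API
# reference for your region.
from typing import List, Tuple

DLI_ENDPOINT = "https://dli.example-region.myhuaweicloud.com"  # hypothetical host


def list_jobs_url(project_id: str) -> str:
    """Build the (assumed) URL for listing Spark batch jobs."""
    return f"{DLI_ENDPOINT}/v2.0/{project_id}/batches"


def summarize_jobs(response_json: dict) -> List[Tuple[str, str]]:
    """Extract (job ID, state) pairs from an assumed list-jobs response body."""
    return [(s.get("id", ""), s.get("state", ""))
            for s in response_json.get("sessions", [])]
```

In practice you would send the request with your HTTP client of choice, passing a valid `X-Auth-Token` header, and feed the parsed JSON body to `summarize_jobs`.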
Re-executing a Job
On the Spark Jobs page, click Edit in the Operation column of the job. On the Spark job creation page that is displayed, modify parameters as required and execute the job.
Searching for a Job
On the Spark Jobs page, select a Job Status or Queue to filter the list. The system displays the jobs that meet the filter criteria.
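If you have fetched the job list programmatically, the same Job Status / Queue filtering can be done client-side. The record shape below is illustrative, not the real API schema:

```python
# Client-side filter mirroring the console's Job Status / Queue filter.
# The job-record dictionary keys ("status", "queue") are illustrative.
from typing import Iterable, List, Optional


def filter_jobs(jobs: Iterable[dict],
                status: Optional[str] = None,
                queue: Optional[str] = None) -> List[dict]:
    """Keep only jobs matching the given status and/or queue (None = no filter)."""
    result = []
    for job in jobs:
        if status is not None and job.get("status") != status:
            continue
        if queue is not None and job.get("queue") != queue:
            continue
        result.append(job)
    return result
```

For example, `filter_jobs(jobs, status="Finished", queue="default")` keeps only finished jobs on the `default` queue.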
Terminating a Job
On the Spark Jobs page, choose More > Terminate Job in the Operation column of the job to stop the job.
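Termination can likewise be scripted. This is a sketch only: the `DELETE /v2.0/{project_id}/batches/{batch_id}` path is an assumption to confirm against the DLI API reference, and the helper here merely builds the request rather than sending it.

```python
# Sketch: build the (assumed) cancel-job request for a Spark batch job.
# The endpoint path is an assumption; confirm it in the DLI API reference.
from typing import Tuple


def terminate_job_request(endpoint: str, project_id: str, batch_id: str) -> Tuple[str, str]:
    """Return the (HTTP method, URL) pair for the assumed cancel-job call."""
    return ("DELETE", f"{endpoint}/v2.0/{project_id}/batches/{batch_id}")
```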
Exporting Logs
On the Spark Jobs page, choose More > Export Log in the Operation column of the corresponding job. In the dialog box that is displayed, enter the path of the created OBS bucket and click OK.
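The dialog expects a path in an existing OBS bucket. A quick sanity check of the `obs://bucket[/prefix]` shape before submitting can catch typos; note this sketch deliberately simplifies the full OBS bucket-naming rules:

```python
# Sketch: sanity-check an OBS path such as "obs://my-bucket/spark-logs/".
# The regex simplifies the real OBS bucket-naming rules (lowercase letters,
# digits, hyphens, and periods; roughly 3-63 characters).
import re


def is_valid_obs_path(path: str) -> bool:
    """Accept obs://<bucket> optionally followed by /<object prefix>."""
    return re.fullmatch(r"obs://[a-z0-9][a-z0-9.-]{1,61}[a-z0-9](/.*)?", path) is not None
```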