Managing Spark Jobs
Viewing Basic Information
On the Overview page, click Spark Jobs to go to the Spark job management page. Alternatively, you can choose Job Management > Spark Jobs. The page displays all Spark jobs. If there are a large number of jobs, they are displayed on multiple pages. DLI allows you to view jobs in all statuses.
Parameter | Description
---|---
Job ID | ID of a submitted Spark job, which is generated by the system by default.
Name | Name of a submitted Spark job.
Queues | Queue where the submitted Spark job runs.
Username | Name of the user who executed the Spark job.
Status | Status of the job.
Created | Time when the job was created. Jobs can be displayed in ascending or descending order of the creation time.
Last Modified | Time when the job was completed.
Operation | Operations that can be performed on the job, such as Edit and More > Terminate Job.
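Besides the console, the same job information can be retrieved programmatically. The sketch below is a minimal example, assuming a batch-job listing endpoint of the form `GET /v2.0/{project_id}/batches`, an already-obtained IAM token, and response field names such as `sessions`, `id`, `name`, `queue`, and `state`; verify all of these against the DLI API reference for your region.

```python
import requests

# Assumed values: replace with your region endpoint, project ID, and IAM token.
DLI_ENDPOINT = "https://dli.example-region.myhuaweicloud.com"  # hypothetical endpoint
PROJECT_ID = "your-project-id"
TOKEN = "your-iam-token"

def list_spark_jobs():
    """Fetch the list of Spark (batch) jobs; endpoint and field names are assumptions."""
    url = f"{DLI_ENDPOINT}/v2.0/{PROJECT_ID}/batches"
    resp = requests.get(url, headers={"X-Auth-Token": TOKEN})
    resp.raise_for_status()
    return resp.json().get("sessions", [])  # response field name is an assumption

if __name__ == "__main__":
    for job in list_spark_jobs():
        # These fields roughly mirror the console columns (Job ID, Name, Queues, Status).
        print(job.get("id"), job.get("name"), job.get("queue"), job.get("state"))
```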
Re-executing a Job
On the Spark Jobs page, click Edit in the Operation column of the job. On the Spark job creation page that is displayed, modify parameters as required and execute the job.
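Outside the console, re-executing a job usually means re-submitting the batch with the same or adjusted parameters. The following is a minimal sketch, assuming a `POST /v2.0/{project_id}/batches` endpoint and an application JAR already uploaded to OBS; the endpoint, request body fields, and all names are placeholders to check against the DLI API reference.

```python
import requests

DLI_ENDPOINT = "https://dli.example-region.myhuaweicloud.com"  # hypothetical endpoint
PROJECT_ID = "your-project-id"
TOKEN = "your-iam-token"

def resubmit_spark_job(queue, main_file, main_class, args=None):
    """Re-submit a Spark batch job with modified parameters; body fields are assumptions."""
    body = {
        "queue": queue,            # queue the job runs on
        "file": main_file,         # e.g. an OBS path to the application JAR
        "className": main_class,   # main class of the Spark application (field name assumed)
        "args": args or [],        # application arguments, adjusted before re-execution
    }
    url = f"{DLI_ENDPOINT}/v2.0/{PROJECT_ID}/batches"
    resp = requests.post(url, json=body, headers={"X-Auth-Token": TOKEN})
    resp.raise_for_status()
    return resp.json()
```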
Searching for a Job
On the Spark Jobs page, select Status or Queues. The system displays the jobs that meet the filter condition in the job list.
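If you work with the job list programmatically, the same filtering can be done on the client side. A small sketch, assuming job records shaped like the response in the listing example above (the `state` and `queue` field names are assumptions):

```python
def filter_jobs(jobs, status=None, queue=None):
    """Return only the jobs matching the given status and/or queue."""
    return [
        job for job in jobs
        if (status is None or job.get("state") == status)
        and (queue is None or job.get("queue") == queue)
    ]

# Example: keep only running jobs on a hypothetical queue named "default".
# running_on_default = filter_jobs(list_spark_jobs(), status="running", queue="default")
```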
Terminating a Job
On the Spark Jobs page, choose More > Terminate Job in the Operation column of the job that you want to stop.
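Termination can also be triggered through the API. A minimal sketch, assuming a `DELETE /v2.0/{project_id}/batches/{batch_id}` endpoint; confirm the path, required headers, and response against the DLI API reference before relying on it.

```python
import requests

DLI_ENDPOINT = "https://dli.example-region.myhuaweicloud.com"  # hypothetical endpoint
PROJECT_ID = "your-project-id"
TOKEN = "your-iam-token"

def terminate_spark_job(batch_id):
    """Cancel a running Spark batch job by its ID; endpoint is an assumption."""
    url = f"{DLI_ENDPOINT}/v2.0/{PROJECT_ID}/batches/{batch_id}"
    resp = requests.delete(url, headers={"X-Auth-Token": TOKEN})
    resp.raise_for_status()
```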