
Querying Information About a Job

Function

This API is used to query information about a specified job in an MRS cluster.

URI

  • Format

    GET /v2/{project_id}/clusters/{cluster_id}/job-executions/{job_execution_id}

  • Parameter description
    Table 1 URI parameter description

    • project_id (mandatory): Project ID. For details on how to obtain the project ID, see Obtaining a Project ID.
    • cluster_id (mandatory): Cluster ID. For details on how to obtain the cluster ID, see Obtaining a Cluster ID.
    • job_execution_id (mandatory): Job ID. For details on how to obtain the job ID, see Obtaining a Job ID.
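
The following is a minimal sketch of calling this URI with Python; the endpoint host, the X-Auth-Token header, and all ID values are placeholder assumptions and must be replaced with values for your environment.

    import requests

    # Placeholder values; substitute your own endpoint, project, cluster, and job IDs.
    ENDPOINT = "https://mrs.example.com"      # assumed MRS API endpoint host
    PROJECT_ID = "your_project_id"
    CLUSTER_ID = "your_cluster_id"
    JOB_EXECUTION_ID = "your_job_execution_id"
    TOKEN = "your_iam_token"                  # assumed IAM token authentication

    url = (f"{ENDPOINT}/v2/{PROJECT_ID}/clusters/{CLUSTER_ID}"
           f"/job-executions/{JOB_EXECUTION_ID}")

    # GET request with no body; only the URI parameters above are required.
    response = requests.get(url, headers={"X-Auth-Token": TOKEN})
    response.raise_for_status()
    print(response.json()["job_detail"])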

Request

Request parameters

None.

Response

Table 2 Response parameter description

• job_detail (Object): Job details. For details about the parameter, see Table 3.

Table 3 Job parameter description

• job_id (String): Job ID.
• user (String): Name of the user who submitted the job.
• job_name (String): Job name. It contains 1 to 64 characters. Only letters, digits, hyphens (-), and underscores (_) are allowed.
• job_result (String): Final result of the job.
    • FAILED: The job failed to be executed.
    • KILLED: The job was manually terminated during execution.
    • UNDEFINED: The job is being executed.
    • SUCCEEDED: The job was executed successfully.
• job_state (String): Execution status of the job.
    • FAILED: The job failed to be executed.
    • KILLED: The job was terminated.
    • NEW: The job has been created.
    • NEW_SAVING: The job has been created and is being saved.
    • SUBMITTED: The job has been submitted.
    • ACCEPTED: The job has been accepted.
    • RUNNING: The job is running.
    • FINISHED: The job has been completed.
• job_progress (Float): Job execution progress.
• job_type (String): Job type.
    • MapReduce
    • SparkSubmit (select SparkSubmit when you call this API to query a SparkPython job)
    • HiveScript
    • HiveSql
    • DistCp (data import and export)
    • SparkScript
    • SparkSql
    • Flink
• started_time (Long): Time when the job started. Unit: ms.
• submitted_time (Long): Time when the job was submitted. Unit: ms.
• finished_time (Long): Time when the job finished. Unit: ms.
• elapsed_time (Long): Job running duration. Unit: ms.
• arguments (Array): Program running parameters. The value contains a maximum of 4,096 characters, cannot contain special characters such as ;|&>'<$, and can be left blank.
• properties (Object): Configuration parameters, which are used to configure -d parameters. The value contains a maximum of 2,048 characters, cannot contain special characters such as ><|'`&!\, and can be left blank.
• launcher_id (String): Launcher job ID.
• app_id (String): Actual job ID.
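
As an illustration of how these fields might be consumed, the following sketch polls this API until job_state reaches a terminal value and then returns job_result; the terminal-state set, the polling interval, and the token-based authentication are assumptions, not part of the API definition.

    import time
    import requests

    # job_state values that no longer change, per Table 3 (assumed terminal set).
    TERMINAL_STATES = {"FAILED", "KILLED", "FINISHED"}

    def wait_for_job(url, token, interval_s=30, timeout_s=3600):
        """Poll the job-query URI until job_state is terminal, then return job_result."""
        headers = {"X-Auth-Token": token}     # assumed IAM token authentication
        deadline = time.time() + timeout_s
        while time.time() < deadline:
            body = requests.get(url, headers=headers).json()
            if "error_code" in body:          # failed response, see the example below
                raise RuntimeError(f"{body['error_code']}: {body.get('error_msg')}")
            detail = body["job_detail"]
            if detail["job_state"] in TERMINAL_STATES:
                return detail["job_result"]   # SUCCEEDED, FAILED, or KILLED
            time.sleep(interval_s)
        raise TimeoutError("job did not reach a terminal state within the timeout")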

Example

  • Example request

    None.

  • Example response
    • Example of a successful response
      {
          "job_detail": {
              "job_id": "431b135e-c090-489f-b1db-0abe3822b855",
              "user": "xxxx",
              "job_name": "pyspark1",
              "job_result": "SUCCEEDED",
              "job_state": "FINISHED",
              "job_progress": 100,
              "job_type": "SparkSubmit",
              "started_time": 1564626578817,
              "submitted_time": 1564626561541,
              "finished_time": 1564626664930,
              "elapsed_time": 86113,
              "queue": "default",
              "arguments": "[--class, org.apache.spark.examples.SparkPi, --driver-memory, 512MB, --num-executors, 1, --executor-cores, 1, --master, yarn-cluster, obs://obs-test/jobs/spark/spark-examples_2.11-2.1.0.jar, 10000]",
              "launcher_id": "application_1564622673393_0006",
              "app_id": "application_1564622673393_0007",
              "properties": "{}"
          }
      }
    • Example of a failed response
      {
          "error_msg": "Failed to query the job.",
          "error_code": "0162"
      }

Status Code

For details about status codes, see Status Codes.