Updated on 2022-02-22 GMT+08:00

MRS Spark

Functions

The MRS Spark node is used to execute a predefined Spark job on MRS.
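
For reference, the following is a minimal sketch of the kind of predefined job that can be packaged into the JAR the node runs. It assumes a Scala application that receives its input and output paths as arguments; the actual class name, paths, and logic depend entirely on the JAR selected for the node.

// Minimal sketch of a predefined Spark job (illustrative only).
// The object name and the use of args(0)/args(1) for input and output
// paths are assumptions; the real job is whatever the selected JAR contains.
import org.apache.spark.sql.SparkSession

object SampleSparkJob {
  def main(args: Array[String]): Unit = {
    val inputPath  = args(0)   // e.g. the node's Input Data Path, passed as a JAR file parameter
    val outputPath = args(1)   // e.g. the node's Output Data Path, passed as a JAR file parameter

    val spark = SparkSession.builder()
      .appName("SampleSparkJob")
      .getOrCreate()
    import spark.implicits._

    // Simple word count as a stand-in for real business logic.
    spark.read.textFile(inputPath)
      .flatMap(_.split("\\s+"))
      .groupBy("value")
      .count()
      .write.mode("overwrite")
      .csv(outputPath)

    spark.stop()
  }
}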

Parameters

Table 1 and Table 2 describe the parameters of the MRS Spark node.

Table 1 Parameters of MRS Spark nodes

Parameter

Mandatory

Description

Node Name

Yes

Name of the node. Must consist of 1 to 128 characters and contain only letters, digits, underscores (_), hyphens (-), slashes (/), less-than signs (<), and greater-than signs (>).

MRS Cluster Name

Yes

Name of the MRS cluster.

To create an MRS cluster, use either of the following methods:
  • On the Clusters page, create an MRS cluster.
  • Go to the MRS console to create an MRS cluster.

Spark Job Name

Yes

Name of the MRS job. Must consist of 1 to 64 characters and contain only letters, digits, and underscores (_).

JAR Package

Yes

Select a JAR package. Before selecting a JAR package, you need to upload the JAR package to the OBS bucket, create a resource on the Resource Management page, and add the JAR package to the resource management list. For details, see Creating a Resource.

JAR File Parameters

No

Parameters of the JAR package.

Program Parameter

No

Used to configure optimization parameters such as threads, memory, and vCPUs for the job to optimize resource usage and improve job execution performance.

NOTE:

This parameter is mandatory if the cluster version is MRS 1.8.7, or later than MRS 2.0.1.

For details about the program parameters of MRS Spark jobs, see Running a Spark Job in the MapReduce Service User Guide. A sketch of how a job can read such resource settings is provided after this table.

Input Data Path

No

Path where the input data resides.

Output Data Path

No

Path where the output data resides.
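
Program parameters are typically resource tuning options, such as executor memory and cores. As a sketch, and assuming those options are ultimately applied as standard Spark configuration properties (for example spark.executor.memory and spark.executor.cores), a job can read the effective values as shown below; the options MRS actually accepts are listed in Running a Spark Job.

// Sketch: reading the effective resource settings inside the job.
// The configuration keys are standard Spark properties; whether the node's
// Program Parameter field sets them directly is an assumption made here.
import org.apache.spark.sql.SparkSession

object PrintResourceSettings {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("PrintResourceSettings")
      .getOrCreate()

    // The second argument is the default used when the property was not set
    // at submission time.
    val executorMemory = spark.conf.get("spark.executor.memory", "1g")
    val executorCores  = spark.conf.get("spark.executor.cores", "1")
    println(s"executor memory = $executorMemory, executor cores = $executorCores")

    spark.stop()
  }
}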

Table 2 Advanced parameters

Parameter

Mandatory

Description

Node Status Polling Interval (s)

Yes

Specifies how often the system checks whether the node task is complete. The value ranges from 1 to 60 seconds.

Max. Node Execution Duration

Yes

Execution timeout interval for the node. If retry is configured and the execution is not complete within the timeout interval, the node will not be retried and is set to the failed state.

Retry upon Failure

Yes

Indicates whether to re-execute a node task if its execution fails. Possible values:

  • Yes: The node task will be re-executed, and the following parameters must be configured:
    • Maximum Retries
    • Retry Interval (seconds)
  • No: The node task will not be re-executed. This is the default setting.
NOTE:

If Max. Node Execution Duration is configured for the node, the node will not be executed again after the execution times out. Instead, the node is set to the failed state.

Failure Policy

Yes

Operation that will be performed if the node task fails to be executed. Possible values:

  • End the current job execution plan
  • Go to the next job
  • Suspend the current job execution plan
  • Suspend execution plans of the current and subsequent nodes