DLI Spark
Function
The DLI Spark node is used to execute a predefined Spark job.
Parameters
Table 1, Table 2, and Table 3 describe the parameters of the DLI Spark node.
Table 1 Parameters of the DLI Spark node

| Parameter | Mandatory | Description |
|---|---|---|
| Node Name | Yes | Name of the node. The value must consist of 1 to 128 characters and contain only letters, digits, and the following special characters: _-/<> |
| DLI Queue | Yes | Select a queue from the drop-down list box. |
| Job Type | No | Select a custom image and the corresponding version. This parameter is available only when the DLI queue is a containerized queue. Custom images are a DLI feature: you can use the basic Spark or Flink images provided by DLI, pack the required dependencies (files, JAR packages, or software) into an image using a Dockerfile, generate a custom image, and release it to SWR. You can then select the generated image to run the job. Custom images can change the container runtime environments of Spark and Flink jobs, and you can embed private capabilities into them to enhance job functions and performance. |
| Job Name | Yes | Name of the DLI Spark job. The name must contain 1 to 64 characters, including only letters, digits, and underscores (_). The default value is the same as the node name. |
| Job Running Resources | No | Select the running resource specifications of the job. |
| Major Job Class | Yes | Name of the main class of the Spark job. When the application type is .jar, the main class name cannot be empty. |
| Spark Program Resource Package | Yes | JAR file on which the Spark job depends. You can enter the JAR package name or the corresponding OBS path in the format obs://Bucket name/Folder name/Package name. Before selecting a resource package, upload the JAR package and its dependency packages to the OBS bucket and create resources on the Manage Resource page. For details, see Creating a Resource. |
| Resource Type | Yes | Select OBS path or DLI program package. |
| Group | No | This parameter is mandatory when Resource Type is set to DLI program package. You can select Use existing, Create new, or Do not use. |
| Group Name | No | This parameter is mandatory when Resource Type is set to DLI program package. |
| Major-Class Entry Parameters | No | User-defined parameters. Press Enter to separate multiple parameters. These parameters can be replaced by global variables. For example, if you create a global variable batch_num on the Global Configuration > Global Variables page, you can use {{batch_num}} to have the variable's value substituted into the parameter after the job is submitted. |
| Spark Job Running Parameters | No | Enter a parameter in the key=value format. Press Enter to separate multiple key-value pairs. For details about the parameters, see Spark Configuration. These parameters can be replaced by global variables. For example, if you create a global variable custom_class on the Global Configuration > Global Variables page, you can use "spark.sql.catalog"={{custom_class}} to have the variable's value substituted into the parameter after the job is submitted. NOTE: The JVM garbage collection algorithm cannot be customized for Spark jobs. |
| Module Name | No | Dependency modules provided by DLI for executing datasource connection jobs. To access different services, select the corresponding modules. |
| Metadata Access | Yes | Whether to access metadata through Spark jobs. For details, see section "Using the Spark Job to Access DLI Metadata" in the Data Lake Insight Developer Guide. |
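The global-variable substitution and key=value parsing described above can be sketched in Python. This is an illustrative model only, assuming a simple {{name}} placeholder syntax; `substitute_globals` and `parse_running_params` are hypothetical helper names, not part of DLI or DataArts Studio.

```python
import re


def substitute_globals(value: str, global_vars: dict) -> str:
    # Replace each {{name}} placeholder with its global-variable value
    # (illustrative; unknown names are left untouched).
    return re.sub(
        r"\{\{(\w+)\}\}",
        lambda m: str(global_vars.get(m.group(1), m.group(0))),
        value,
    )


def parse_running_params(lines, global_vars):
    # Parse key=value pairs, one per line, applying variable substitution.
    conf = {}
    for line in lines:
        line = line.strip()
        if not line:
            continue
        key, _, val = line.partition("=")
        conf[key.strip()] = substitute_globals(val.strip(), global_vars)
    return conf


params = parse_running_params(
    ["spark.executor.memory=4g", "spark.sql.catalog={{custom_class}}"],
    {"custom_class": "org.example.MyCatalog"},
)
print(params)
```

With the hypothetical global variable custom_class set, the placeholder is resolved before the pair reaches the Spark configuration.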
Table 2 Advanced parameters

| Parameter | Mandatory | Description |
|---|---|---|
| Node Status Polling Interval (s) | Yes | How often the system checks whether the node task is complete. The value ranges from 1 to 60 seconds. |
| Max. Node Execution Duration | Yes | Maximum duration for executing the node. When Retry upon Failure is set to Yes for the node, the node can be re-executed upon an execution failure within this duration. |
| Retry upon Failure | Yes | Whether to re-execute the node after it fails to be executed. NOTE: If Max. Node Execution Duration is configured for the node, the node will not be executed again after the execution times out. Instead, the node is set to the failed state. |
| Failure Policy | Yes | Policy to be applied after the node fails to be executed. |
| Dry run | No | If you select this option, the node will not be executed, and a success message will be returned. |
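The interaction between Retry upon Failure and Max. Node Execution Duration can be sketched as a scheduler loop: retries happen only while the maximum duration has not elapsed, and a timeout marks the node failed with no further attempts. This is an illustrative Python model under assumed semantics (the real scheduler's retry count and timing are not documented here); `run_node` and its parameters are hypothetical.

```python
import time


def run_node(execute, max_duration_s, retry, max_retries=3, poll_interval_s=1.0):
    # Illustrative scheduler loop. `execute` is a caller-supplied callable
    # (a hypothetical stand-in for the node task) returning True on success.
    start = time.monotonic()
    attempts = 0
    while True:
        attempts += 1
        if execute():
            return "success"
        if not retry or attempts > max_retries:
            return "failed"
        if time.monotonic() - start >= max_duration_s:
            # Timed out: the node is set to the failed state; no more retries.
            return "failed"
        time.sleep(poll_interval_s)
```

For example, a task that succeeds on its second attempt finishes as "success" when retries are enabled and the duration budget has not been exhausted, while any failure after the budget elapses is final.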
Table 3 Lineage parameters

| Parameter | Description |
|---|---|
| Input | |
| Add | Click Add. In the Type drop-down list, select the type to be created. The value can be DWS, OBS, CSS, HIVE, DLI, or CUSTOM. |
| OK | Click OK to save the parameter settings. |
| Cancel | Click Cancel to cancel the parameter settings. |
| Modify | Click to modify the parameter settings. After the modification, save the settings. |
| Delete | Click to delete the parameter settings. |
| View Details | Click to view details about the table created based on the input lineage. |
| Output | |
| Add | Click Add. In the Type drop-down list, select the type to be created. The value can be DWS, OBS, CSS, HIVE, DLI, or CUSTOM. |
| OK | Click OK to save the parameter settings. |
| Cancel | Click Cancel to cancel the parameter settings. |
| Modify | Click to modify the parameter settings. After the modification, save the settings. |
| Delete | Click to delete the parameter settings. |
| View Details | Click to view details about the table created based on the output lineage. |
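The lineage configuration above amounts to two lists of typed entries, one for inputs and one for outputs, where each entry's type must be one of the supported values. A minimal sketch of that data model, assuming the type set listed above; `LineageEntry` and `NodeLineage` are hypothetical names, not the actual DLI or DataArts Studio API.

```python
from dataclasses import dataclass, field

# Supported lineage types, per the table above.
VALID_TYPES = {"DWS", "OBS", "CSS", "HIVE", "DLI", "CUSTOM"}


@dataclass
class LineageEntry:
    # One input or output lineage record: a type plus an identifier.
    type: str
    name: str

    def __post_init__(self):
        if self.type not in VALID_TYPES:
            raise ValueError(f"unsupported lineage type: {self.type}")


@dataclass
class NodeLineage:
    # A node's lineage: separate input and output entry lists.
    inputs: list = field(default_factory=list)
    outputs: list = field(default_factory=list)


lineage = NodeLineage()
lineage.inputs.append(LineageEntry("OBS", "obs://my-bucket/input/"))
lineage.outputs.append(LineageEntry("DLI", "db.result_table"))
```

Rejecting unknown types at construction time mirrors the fixed Type drop-down in the UI.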