DLI Spark
Functions
The DLI Spark node is used to execute a predefined Spark job.
Parameters
Table 1 and Table 2 describe the parameters of the DLI Spark node.
Table 1

| Parameter | Mandatory | Description |
|---|---|---|
| Node Name | Yes | Name of the node. Must consist of 1 to 128 characters and contain only letters, digits, underscores (_), hyphens (-), slashes (/), less-than signs (<), and greater-than signs (>). |
| DLI Queue | Yes | Select a queue from the drop-down list box. |
| Job Name | Yes | Name of the DLI Spark job. It must consist of 1 to 64 characters and contain only letters, digits, and underscores (_). The default value is the same as the node name. |
| Job Running Resources | No | Select the running resource specifications of the job. |
| Major Job Class | Yes | Main class of the DLI Spark job, that is, the main class of the JAR package. |
| Spark program resource package | Yes | JAR package of the user-defined Spark application. Before selecting a resource package, you need to upload the JAR package and its dependency packages to the OBS bucket and create resources on the resource management page. For details, see Creating a Resource. |
| Major-Class Entry Parameters | No | Enter the entry parameters of the program. Press Enter to separate the parameters. |
| Spark Job Running Parameters | No | Enter parameters in key/value format. Press Enter to separate multiple key-value pairs. For details about the parameters, see Spark Configuration. |
| Module Name | No | Dependency modules provided by DLI for executing datasource connection jobs. To access different services, you need to select different modules. |
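For orientation, the sketch below shows what the Major Job Class and Spark program resource package in Table 1 refer to: a user-defined Spark application whose compiled JAR is uploaded to OBS and registered as a resource, and whose main class is entered as the Major Job Class. The package name, object name, and paths are hypothetical placeholders, not values prescribed by DLI.

```scala
// Hypothetical Spark application packaged into a JAR and registered as a
// DLI resource. "Major Job Class" would then be com.example.dli.WordCount,
// and the input/output paths would be supplied as Major-Class Entry
// Parameters (one per line).
package com.example.dli

import org.apache.spark.sql.SparkSession

object WordCount {
  def main(args: Array[String]): Unit = {
    // args holds the Major-Class Entry Parameters in the order entered.
    val Array(inputPath, outputPath) = args.take(2)

    val spark = SparkSession.builder()
      .appName("dli-spark-wordcount")
      .getOrCreate()

    import spark.implicits._

    // Count word occurrences in the input text and write the result out.
    spark.read.textFile(inputPath)
      .flatMap(_.split("\\s+"))
      .groupBy("value")
      .count()
      .write
      .mode("overwrite")
      .csv(outputPath)

    spark.stop()
  }
}
```

With such a job, Spark Job Running Parameters take standard Spark configuration keys as key/value pairs, for example spark.executor.memory=4G or spark.executor.cores=2; the keys supported by a DLI queue are listed in the Spark Configuration reference mentioned in Table 1.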
Table 2

| Parameter | Mandatory | Description |
|---|---|---|
| Node Status Polling Interval (s) | Yes | How often the system checks whether the node task is complete. The value ranges from 1 to 60 seconds. |
| Max. Node Execution Duration | Yes | Execution timeout interval for the node. If retry is configured and the execution is not complete within the timeout interval, the node will not be retried and is set to the failed state. |
| Retry upon Failure | Yes | Whether to re-execute the node task if its execution fails. If Timeout Interval is configured for the node, the node will not be executed again after the execution times out; instead, it is set to the failed state. |
| Failure Policy | Yes | Operation that will be performed if the node task fails to be executed. |