MRS Flink Job
Function
The MRS Flink node is used to execute predefined Flink jobs in MRS.
Parameters
Configure the parameters of the MRS Flink node by referring to Table 1 and Table 2.
Table 1 Parameters of MRS Flink nodes

| Parameter | Mandatory | Description |
| --- | --- | --- |
| Node Name | Yes | Name of the node. Must consist of 1 to 128 characters and contain only letters, digits, underscores (_), hyphens (-), slashes (/), less-than signs (<), and greater-than signs (>). |
| MRS Cluster Name | Yes | Select the MRS cluster that will run the Flink job. If no MRS cluster is available, create one first. |
| Flink Job Name | Yes | Name of the MRS Flink job. Must consist of 1 to 64 characters and contain only letters, digits, and underscores (_). |
| Flink Job Resource Package | Yes | Select a JAR package. Before selecting a JAR package, upload it to an OBS bucket, create a resource on the Resource Management page, and add the JAR package to the resource management list. For details, see Creating a Resource. |
| Flink Job Parameter | No | Key parameter of the program that executes the Flink job. This parameter is specified by a function in the user program. Separate multiple parameters with spaces. See the example after this table. |
| Program Parameter | No | Optimization parameters such as threads, memory, and vCPUs for the job, used to optimize resource usage and improve job execution performance. This parameter is mandatory if the cluster version is MRS 1.8.7 or later than MRS 2.0.1. For details about the program parameters of MRS Flink jobs, see Running a Flink Job in the MapReduce Service User Guide. |
| Input Data Path | No | Path where the input data resides. |
| Output Data Path | No | Path where the output data resides. |
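The value entered in Flink Job Parameter typically reaches the user program as the command-line arguments of the JAR package's main class; this mapping is assumed here rather than stated above. The following is a minimal sketch of such a program, using a hypothetical WordCountJob class and hypothetical --input/--output argument names that are not defined by DLF:

```java
// Hypothetical Flink user program; illustrates how space-separated values
// from "Flink Job Parameter" can be read inside the job. Class name and
// argument names are examples only.
import org.apache.flink.api.java.utils.ParameterTool;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class WordCountJob {
    public static void main(String[] args) throws Exception {
        // e.g. Flink Job Parameter: --input obs://my-bucket/in --output obs://my-bucket/out
        ParameterTool params = ParameterTool.fromArgs(args);
        String input = params.get("input");    // assumed argument name
        String output = params.get("output");  // assumed argument name

        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        env.readTextFile(input)
           .filter(line -> !line.isEmpty())    // sample transformation
           .writeAsText(output);

        env.execute("WordCountJob");
    }
}
```

With this program, setting Flink Job Parameter to `--input obs://my-bucket/in --output obs://my-bucket/out` would make both paths available through ParameterTool when the node runs the job.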
Table 2 Advanced parameters

| Parameter | Mandatory | Description |
| --- | --- | --- |
| Max. Node Execution Duration | Yes | Maximum duration for executing the node. If Retry upon Failure is set to Yes for the node, the node can be re-executed multiple times upon an execution failure within this maximum duration. See the sketch after this table. |
| Retry upon Failure | Yes | Specifies whether to re-execute the node if it fails to be executed. If Timeout Interval is configured for the node, the node will not be executed again after the execution times out; instead, the node is set to the failure state. |
| Failure Policy | Yes | Policy to be applied after the node fails to be executed. |
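The following is a rough sketch, not DLF's actual scheduler code, of how the advanced parameters interact: failed attempts are retried only while the maximum execution duration has not elapsed, and the failure policy takes effect once retries are exhausted. All class, method, and variable names below are hypothetical.

```java
// Illustrative sketch of the retry semantics described in Table 2.
public class NodeRetrySketch {

    private static int attempts = 0;

    // Stands in for one execution attempt of the node; succeeds on the third try here.
    static boolean runNodeOnce() {
        attempts++;
        return attempts >= 3;
    }

    // Stands in for the configured "Failure Policy" (for example, ending or suspending the job).
    static void applyFailurePolicy() {
        System.out.println("Node failed; applying the configured failure policy");
    }

    public static void main(String[] args) {
        long maxDurationMs = 30 * 60 * 1000L; // "Max. Node Execution Duration"
        boolean retryUponFailure = true;      // "Retry upon Failure" = Yes
        long deadline = System.currentTimeMillis() + maxDurationMs;

        boolean succeeded = runNodeOnce();
        // Failed attempts are re-executed only while the maximum duration has not elapsed.
        while (!succeeded && retryUponFailure && System.currentTimeMillis() < deadline) {
            succeeded = runNodeOnce();
        }

        if (!succeeded) {
            applyFailurePolicy();
        }
    }
}
```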