MRS Hive SQL
Functions
The MRS Hive SQL node is used to execute a predefined Hive SQL script on DLF.
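For orientation, the script referenced by such a node is an ordinary Hive SQL script. The following minimal sketch is only illustrative: the database, tables, and the `${dt}` script parameter are hypothetical, and `${dt}` shows how a script parameter is referenced so that its value can be supplied on the node (see Script Parameter in Table 1).

```sql
-- Hypothetical Hive SQL script that an MRS Hive SQL node could execute.
-- ${dt} is a script parameter; its value is set on the node
-- (Script Parameter in Table 1), for example via an EL expression.
USE demo_db;

INSERT OVERWRITE TABLE ads_daily_order_stats PARTITION (dt = '${dt}')
SELECT
  region,
  COUNT(order_id)   AS order_cnt,
  SUM(order_amount) AS total_amount
FROM dwd_order_detail
WHERE dt = '${dt}'
GROUP BY region;
```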
Parameters
Table 1 and Table 2 describe the parameters of the MRS Hive SQL node.
Table 1 Parameters of MRS Hive SQL nodes

| Parameter | Mandatory | Description |
|---|---|---|
| SQL Script | Yes | Path to the script to be executed. If the script has not been created, create and develop it by referring to Creating a Script and Developing an SQL Script. |
| Data Connection | Yes | Data connection configured in the SQL script. The value can be changed. |
| Database | Yes | Database configured in the SQL script. The value can be changed. |
| Script Parameter | No | If the associated SQL script uses parameters, the parameter names are displayed here. Set each parameter value in the text box next to its name. The value can be a built-in function or an EL expression (see Expression Overview and the example after this table). If the parameters of the associated SQL script change, refresh the parameter list to obtain the latest values. |
| Program Parameter | No | Optimization parameters such as threads, memory, and vCPUs used by the job to optimize resource usage and improve job execution performance. This parameter is mandatory if the cluster version is MRS 1.8.7 or later than MRS 2.0.1. For details about the program parameters of MRS Hive SQL jobs, see Running a HiveSql Job > Table 2 Program Parameter parameters in the MapReduce User Guide. |
| Node Name | Yes | Name of the node. The default value is the name of the SQL script and can be changed. The name must consist of 1 to 128 characters and contain only letters, digits, underscores (_), hyphens (-), slashes (/), less-than signs (<), and greater-than signs (>). |
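As an example for the Script Parameter row above, the value of the hypothetical `${dt}` parameter from the earlier script sketch could be an EL expression. The sketch below assumes the Job and DateUtil built-in objects described in Expression Overview and only illustrates how a "previous day" partition date might be computed:

```
dt = #{DateUtil.format(DateUtil.addDays(Job.planTime, -1), "yyyyMMdd")}
```

At run time, such an expression would resolve to the day before the job's planned execution time, formatted as yyyyMMdd.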
Table 2 Advanced parameters

| Parameter | Mandatory | Description |
|---|---|---|
| Node Status Polling Interval (s) | Yes | How often the system checks whether the node task is complete. The value ranges from 1 to 60 seconds. |
| Max. Node Execution Duration | Yes | Execution timeout interval for the node. If retry is configured and the execution is not complete within the timeout interval, the node will not be retried and is set to the failed state. |
| Retry upon Failure | Yes | Whether to re-execute the node task if its execution fails. The options are Yes and No. Note that if Timeout Interval is configured for the node, the node will not be executed again after the execution times out; instead, it is set to the failed state. |
| Failure Policy | Yes | Operation that will be performed if the node task fails to be executed. |