Functions
Data Management
- Manages multiple data warehouse services, such as DWS, MRS Hive, and DLI.
- Manages data tables through a visual interface or with data definition language (DDL) statements.
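For illustration, the following is a minimal sketch of the kind of DDL statement such a table definition corresponds to. The database, table, and column names are hypothetical, and the exact dialect depends on the target warehouse; Spark-style SQL, as used by DLI, is assumed here.

```sql
-- Hypothetical example: defining a partitioned table with DDL.
-- All identifiers are illustrative and not part of the DLF product.
CREATE TABLE IF NOT EXISTS demo_db.sales_orders (
  order_id    STRING,
  customer_id STRING,
  amount      DOUBLE
)
PARTITIONED BY (dt STRING);
```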
Data Integration
Works with Cloud Data Migration (CDM) to enable reliable and efficient data transmission among more than 20 disparate data sources and to integrate those sources into data warehouses.
Script Development
- Provides an online script editor that allows multiple developers to collaboratively develop and debug SQL and Shell scripts.
- Supports variables and functions in scripts, as sketched below.
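As a sketch of how script variables can be used, the SQL script below reads a date parameter so that a single script can serve every daily run. The ${dt} variable name and the table it queries are hypothetical.

```sql
-- Hypothetical example: a SQL script parameterized by a date variable.
-- ${dt} is supplied at run time; identifiers are illustrative.
SELECT customer_id,
       SUM(amount) AS total_amount
FROM demo_db.sales_orders
WHERE dt = '${dt}'
GROUP BY customer_id;
```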
Job Development
- Provides a graphical designer that allows you to quickly build a data processing workflow using drag-and-drop.
- Provides preset task types, including data integration, MR, Spark, machine learning, SQL, and Shell, and completes data analysis and processing based on the dependencies between tasks.
- Supports job import and export.
Resource Management
Supports unified management of the file, JAR, and archive resources used during script and job development.
Job Scheduling
Supports three scheduling types: Run once, Run periodically, and Event-driven. If Run periodically is selected, a job can recur by Minute, Hour, Day, Week, or Month; for example, a job scheduled by Hour runs once every hour.
Monitoring
- Supports basic job management operations, including running, pausing, resuming, and terminating jobs.
- Allows you to view the run details of each job and of every node within a job.
- Supports multiple alert methods so that related personnel can be notified immediately when a job or task fails.