Creating a Job
A job consists of one or more nodes that are executed collaboratively to complete data operations. Before you can develop a job, you must create one.
(Optional) Creating a Directory
If a directory exists, you do not need to create one.
- Log in to the DLF console.
- In the navigation tree of the Data Development console, choose Data Development > Develop Job.
- In the directory list, right-click a directory and choose Create Directory from the shortcut menu.
- In the displayed dialog box, configure directory parameters. Table 1 describes the directory parameters.
Table 1 Job directory parameters
- Directory Name: Name of the job directory. Must consist of 1 to 32 characters and contain only letters, digits, underscores (_), and hyphens (-).
- Select Directory: Parent directory of the job directory. The parent directory is the root directory by default.
- Click OK.
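
The Directory Name rule in Table 1 can also be checked programmatically before you open the console dialog. The following is a minimal sketch; the `is_valid_directory_name` helper is hypothetical (not part of any DLF SDK) and simply mirrors the documented constraint of 1 to 32 characters drawn from letters, digits, underscores, and hyphens, with "letters" assumed to mean ASCII letters:

```python
import re

# Documented constraint from Table 1: 1-32 characters; letters, digits,
# underscores (_), and hyphens (-). ASCII letters are assumed here.
_DIRECTORY_NAME_PATTERN = re.compile(r"^[A-Za-z0-9_-]{1,32}$")

def is_valid_directory_name(name: str) -> bool:
    """Return True if the name satisfies the Table 1 naming rule (hypothetical helper)."""
    return bool(_DIRECTORY_NAME_PATTERN.match(name))

if __name__ == "__main__":
    print(is_valid_directory_name("etl_jobs-2024"))  # True
    print(is_valid_directory_name(""))               # False: empty name
    print(is_valid_directory_name("a" * 33))         # False: longer than 32 characters
```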
Creating a Job
Before creating a job, ensure that the number of existing jobs is less than the maximum quota (10,000).
- In the navigation tree of the Data Development console, choose Data Development > Develop Job.
- Create a job using either of the following methods:
Method 1: In the area on the right, click Create Job.
Method 2: In the directory list, right-click a directory and choose Create Job from the shortcut menu.
- In the displayed dialog box, configure job parameters. Table 2 describes the job parameters.
Table 2 Job parameters
- Job Name: Name of the job. Must consist of 1 to 128 characters and contain only letters, digits, hyphens (-), underscores (_), and periods (.).
- Processing Mode: Type of the job.
  - Batch: Data is processed periodically in batches based on the scheduling plan. Used in scenarios with low real-time requirements.
  - Real-Time: Data is processed in real time. Used in scenarios with high real-time requirements.
- Creation Method: Method for creating the job.
  - Create Empty Job: Creates an empty job.
  - Create Based on Template: Creates a job from a template.
- Select Directory: Directory to which the job belongs. The root directory is selected by default.
- Job Owner: Owner of the job.
- Job Priority: Priority of the job. The value can be High, Medium, or Low.
- Log Path: OBS path for storing job logs. By default, logs are stored in a bucket named dlf-log-{Projectid}.
  NOTE: To customize a storage path, select a bucket that you have created on OBS by referring to Configuring a Log Storage Path.
- Click OK.
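
Jobs can also be created programmatically (see the API Reference). The sketch below is illustrative only: the endpoint, field names, and `build_create_job_payload`/`create_job` helpers are assumptions, not the authoritative DLF API contract. It merely maps the Table 2 parameters (job name rule, processing mode, priority, directory, log path) onto a request body; verify the real paths and schema against the API Reference before use.

```python
import re
import requests  # third-party: pip install requests

# Documented rule from Table 2: 1-128 characters; letters, digits, hyphens (-),
# underscores (_), and periods (.). ASCII letters are assumed here.
_JOB_NAME_PATTERN = re.compile(r"^[A-Za-z0-9._-]{1,128}$")

def build_create_job_payload(name: str, processing_mode: str = "Batch",
                             directory: str = "/", priority: str = "Medium",
                             log_path: str | None = None) -> dict:
    """Assemble an illustrative create-job request body from the Table 2 parameters.

    The field names below are assumptions, not the official DLF schema.
    """
    if not _JOB_NAME_PATTERN.match(name):
        raise ValueError("Job name must be 1-128 characters: letters, digits, '-', '_', '.'")
    if processing_mode not in ("Batch", "Real-Time"):
        raise ValueError("Processing mode must be 'Batch' or 'Real-Time'")
    if priority not in ("High", "Medium", "Low"):
        raise ValueError("Priority must be 'High', 'Medium', or 'Low'")
    return {
        "name": name,
        "processingMode": processing_mode,  # assumed field name
        "directory": directory,             # assumed field name
        "priority": priority,               # assumed field name
        "logPath": log_path,                # assumed field name; None -> service default bucket
    }

def create_job(endpoint: str, token: str, payload: dict) -> requests.Response:
    """POST the payload to an assumed create-job endpoint; adapt to the real API Reference."""
    headers = {"X-Auth-Token": token, "Content-Type": "application/json"}
    return requests.post(endpoint, json=payload, headers=headers)

# Example usage (hypothetical endpoint and token):
# payload = build_create_job_payload("job_dws_sql-daily", processing_mode="Batch", priority="High")
# resp = create_job("https://dlf.example.com/v1/{project_id}/jobs", "<token>", payload)
```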