Developing a Job
DLF allows you to develop existing jobs.
Prerequisites
You have created a job. For details about how to create a job, see Creating a Job.
Compiling Job Nodes
- Log in to the DLF console.
- In the navigation tree of the Data Development console, choose .
- In the job directory, double-click a job that you want to develop. The job development page is displayed.
- Drag the desired node to the canvas, hover over the node, and then drag the connection icon to another node to connect the two.
Each job can contain a maximum of 200 nodes.
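The canvas described above is a directed graph of nodes and connections with a hard cap of 200 nodes per job. A minimal sketch of that structure, assuming hypothetical names (`JobCanvas`, `add_node`, `connect` are illustrative, not a DLF API):

```python
class JobCanvas:
    """Hypothetical model of a DLF job canvas: nodes joined by directed connections."""
    MAX_NODES = 200  # each job can contain a maximum of 200 nodes

    def __init__(self):
        self.nodes = set()
        self.edges = set()  # (source, target) pairs

    def add_node(self, name):
        if len(self.nodes) >= self.MAX_NODES:
            raise ValueError("a job can contain a maximum of 200 nodes")
        self.nodes.add(name)

    def connect(self, source, target):
        # both endpoints must already be on the canvas
        if source not in self.nodes or target not in self.nodes:
            raise KeyError("both nodes must be on the canvas before connecting")
        self.edges.add((source, target))

canvas = JobCanvas()
canvas.add_node("ReadData")
canvas.add_node("TransformData")
canvas.connect("ReadData", "TransformData")
```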
Configuring Basic Job Information
After you configure the owner and priority for a job, you can search for the job by the owner and priority. The procedure is as follows:
Select a job. On the job development page, click the Basic Job Information tab. On the displayed page, configure parameters. Table 1 describes the parameters.
| Parameter | Description |
|---|---|
| Owner | The owner configured during job creation is automatically matched. This value can be modified. |
| Executor | User that executes the job. If an executor is specified, the job is executed by that user. If left unspecified, the job is executed by the user who submits it for startup. |
| Priority | The priority configured during job creation is automatically matched. This value can be modified. |
| Execution Timeout | Timeout interval of the job instance. If this parameter is set to 0 or left unset, it does not take effect. If notification is enabled for the job and the execution time of a job instance exceeds the preset value, the system sends the specified notification. |
| Custom Parameter | Name and value of a user-defined parameter. |
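The Execution Timeout rule above combines two conditions: the timeout only takes effect when nonzero, and a notification is only sent when notification is enabled. A minimal sketch of that logic (function name and minute units are illustrative assumptions, not a DLF API):

```python
def should_notify(elapsed_minutes, timeout_minutes, notify_enabled):
    """Hypothetical check mirroring the Execution Timeout rule:
    a timeout of 0 or None does not take effect, and a notification is
    sent only when notification is enabled and the job instance has run
    past the preset value."""
    if not timeout_minutes:  # 0 or unset: the timeout does not take effect
        return False
    return notify_enabled and elapsed_minutes > timeout_minutes

should_notify(90, 60, True)   # instance overran an enabled 60-minute timeout
should_notify(90, 0, True)    # timeout of 0: the setting is ignored
```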
Configuring Job Parameters
Job parameters can be used globally in any node of a job. The procedure is as follows:
Select a job. On the job development page, click the Job Parameter Setup tab. On the displayed page, configure parameters. Table 2 describes the parameters.
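Because a job parameter is visible to every node in the job, each node can reference it by name and have the value substituted at run time. A sketch of that substitution, assuming a `${name}` placeholder syntax (the syntax and the `resolve_params` helper are illustrative assumptions, not the documented DLF expression language):

```python
import re

def resolve_params(text, job_params):
    """Replace ${name} placeholders in a node's text (e.g. a SQL statement)
    with job-parameter values. The ${...} syntax is an assumption for
    illustration only."""
    def repl(match):
        name = match.group(1)
        if name not in job_params:
            raise KeyError(f"undefined job parameter: {name}")
        return str(job_params[name])
    return re.sub(r"\$\{(\w+)\}", repl, text)

resolve_params("INSERT INTO ${table} SELECT * FROM staging",
               {"table": "sales_daily"})
```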
Configuring Job Scheduling Tasks
You can configure job scheduling tasks for batch jobs. There are three scheduling types available: Run once, Run periodically, and Event-driven. The procedure is as follows:
Select a job. On the job development page, click the Scheduling Parameter Setup tab. On the displayed page, configure parameters. Table 3 describes the parameters.
| Parameter | Description |
|---|---|
| Schedule Type | Job schedule type. Possible values: Run once, Run periodically, and Event-driven. |
| **Parameters for Run periodically** | |
| Effective Time | Period during which the job runs. |
| Schedule Cycle | Frequency at which the job runs. The job can be run once every: |
| Dependency Job | Job that the current job depends on. The constraints are as follows: |
| Action After Dependency Job Failure | Action to perform on the current job if the dependency job fails. Possible values: |
| Run upon completion of the dependency job's last schedule cycle | Specifies whether the dependency job's last schedule cycle must finish before the current job is executed. |
| Cross-Cycle Dependency | Dependency between instances of the job. Possible values: |
| **Parameters for Event-driven** | |
| Triggering Event Type | Type of the event that triggers job running. |
| DIS Stream Name | Name of the DIS stream. When a new message is sent to the specified DIS stream, Data Development passes the message to the job to trigger a run. |
| Concurrent Events | Number of jobs that can be processed concurrently. The maximum number of concurrent events is 128. |
| Event Detection Interval | Interval at which the system checks the DIS stream or OBS path for new messages. The unit of the interval can be set to second or minute. |
| Access Policy | Location from which data is to be read: |
| Failure Policy | Policy to apply after scheduling fails. |
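For a Run periodically job, the Effective Time and Schedule Cycle together determine the set of run instances: one run per cycle, starting at the beginning of the effective period and stopping at its end. A sketch of those semantics (a simplified illustration, not the DLF scheduler itself):

```python
from datetime import datetime, timedelta

def periodic_runs(effective_start, effective_end, cycle):
    """Enumerate run times for a 'Run periodically' job: one run per
    schedule cycle, only within the effective period."""
    runs = []
    t = effective_start
    while t <= effective_end:
        runs.append(t)
        t += cycle
    return runs

# A job effective from 00:00 to 06:00 with a 2-hour cycle
# runs at 00:00, 02:00, 04:00, and 06:00.
runs = periodic_runs(datetime(2024, 1, 1, 0),
                     datetime(2024, 1, 1, 6),
                     timedelta(hours=2))
```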
Configuring Node Properties
Click a node in the canvas. On the displayed Node Properties page, configure node properties. For details, see Node Overview.
Configuring Node Scheduling Tasks
You can configure node scheduling tasks for real-time jobs. There are three scheduling types available: Run once, Run periodically, and Event-driven. The procedure is as follows:
Select a node. On the node development page, click the Scheduling Parameter Setup tab. On the displayed page, configure parameters. Table 4 describes the parameters.
| Parameter | Description |
|---|---|
| Schedule Type | Job schedule type. Possible values: Run once, Run periodically, and Event-driven. |
| **Parameters for Run periodically** | |
| Effective Time | Period during which the job runs. |
| Schedule Cycle | Frequency at which the job runs. The job can be run once every: |
| Cross-Cycle Dependency | Dependency between instances of the job. Possible values: |
| **Parameters for Event-driven** | |
| Triggering Event Type | Type of the event that triggers job running. |
| DIS Stream Name | Name of the DIS stream. When a new message is sent to the specified DIS stream, Data Development passes the message to the job to trigger a run. |
| Concurrent Events | Number of jobs that can be processed concurrently. The maximum number of concurrent events is 10. |
| Event Detection Interval | Interval at which the system checks the DIS stream or OBS path for new messages. The unit of the interval can be set to second or minute. |
| Failure Policy | Policy to apply after scheduling fails. |
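In the event-driven mode above, each new stream message triggers one run, and the Concurrent Events setting caps how many runs are in flight at once (10 for node scheduling). A sketch of that dispatch pattern using a bounded thread pool (`run_event_driven` and `handle` are hypothetical names, not DLF APIs):

```python
from concurrent.futures import ThreadPoolExecutor

def run_event_driven(messages, handle, max_concurrent=10):
    """Process each new stream message as a job run, with at most
    max_concurrent runs in flight at any time. 'handle' is a
    hypothetical per-message job runner supplied by the caller."""
    with ThreadPoolExecutor(max_workers=max_concurrent) as pool:
        # pool.map preserves input order in its results
        return list(pool.map(handle, messages))

results = run_event_driven(["m1", "m2", "m3"], lambda m: "ran:" + m)
```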
More Node Functions
To access more node functions, right-click a node icon on the canvas and choose a function as required. Table 5 describes the node functions.
| Function | Description |
|---|---|
| Configure | Goes to the Node Properties page of the node. |
| Delete | Deletes one or more nodes at a time. |
| Copy | Copies one or more nodes to any job. |
| Test Run | Runs the node as a test. |
| Upload Simulation Data | Uploads simulation data to the DIS Stream node. The following types of simulation data are provided: |
| Add Note | Adds a note to the node. Each node can have multiple notes. |
| Edit Script | Goes to the script editing page to edit the associated script. This option is available only for nodes associated with a script. |
Saving and Starting Jobs
After a job is configured, complete the following operations:
Batch processing jobs:

- Click the test button to test the job.
- After the test is complete, click the save button to save the job configuration.
- Click the start button to start the job.

Real-time processing jobs:

- Click the save button to save the job configuration.
- Click the submit button to submit and start the job.

DLF allows you to modify a running real-time processing job. After modifying the job, click Save. On the displayed page, click OK to save the job.