Using DLF to Re-run Jobs

RES provides an API that re-runs a job with its original configuration, refreshing the task results that were generated offline. Call this API at a fixed interval to keep recommendation results up to date and improve recommendation quality.
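As a minimal sketch of calling such an API at a fixed interval, the snippet below builds and fires a re-run request in a loop. The endpoint path, job ID, and token header here are hypothetical placeholders; consult the RES API reference for the actual request format.

```python
# Sketch: trigger a job re-run at a fixed interval.
# Endpoint path, job ID, and auth header are ASSUMED placeholders,
# not the documented RES API schema.
import time
import urllib.request


def build_rerun_request(endpoint: str, job_id: str, token: str) -> urllib.request.Request:
    """Builds a POST request asking the service to re-run the given job."""
    url = f"{endpoint}/v1/jobs/{job_id}/rerun"  # hypothetical path
    return urllib.request.Request(
        url,
        data=b"{}",
        headers={"X-Auth-Token": token, "Content-Type": "application/json"},
        method="POST",
    )


def poll_rerun(endpoint: str, job_id: str, token: str, interval_s: int = 3600) -> None:
    """Re-runs the job every `interval_s` seconds (hourly by default)."""
    while True:
        req = build_rerun_request(endpoint, job_id, token)
        with urllib.request.urlopen(req) as resp:  # triggers the re-run
            print("re-run triggered, status:", resp.status)
        time.sleep(interval_s)
```

In practice the same periodic trigger is usually handled by a scheduler rather than a long-running loop, which is exactly what the DLF procedure below sets up.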

You can also use DLF to schedule these API calls. The procedure is as follows:

  1. Log in to the DGC management console. On the displayed page, click Data Development to enter the DLF page. Then click Develop Job.
  2. On the Workspace page, click Create Job.
    1. Set the job name to one you can easily identify. The name can contain 1 to 128 characters, including only letters, digits, Chinese characters, hyphens (-), commas (,), and periods (.).
    2. Retain the default job type, creation mode, selected directory, job owner, job priority, and log path.
  3. Click OK. A message is displayed indicating that the job has been created successfully.
  4. In the Node Library, select RestClient under Data Integration and drag two nodes to the blank area on the right, as shown in Figure 1. For details about how to set the RestClient node parameters, see Data Lake Factory > User Guide > References > Nodes > Rest Client.
    Figure 1 Node configurations for re-running jobs

    When DLF is used, authentication is not required.

  5. Choose Save > Test and verify that the job runs as expected.
  6. On the right of the page, click Scheduling Setup and set the scheduling period and other parameters as required.
  7. On the job page, choose Monitoring > Monitor Job and click Submit. After the scheduling is complete, click on the left of the job name to view the scheduling details.

    For the pricing details of DLF, see the DLF Purchase Guide.
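As a rough illustration of step 4, each RestClient node issues an HTTP call along these lines. The URL path and body fields below are hypothetical placeholders, not the actual RES API schema; take the real parameter names from the Rest Client node reference cited above.

```json
{
  "url": "https://res.example.com/v1/jobs/{job_id}/rerun",
  "method": "POST",
  "headers": { "Content-Type": "application/json" },
  "body": {}
}
```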