Managing Flink Jobs
After creating a job, you can manage it by performing various operations such as editing its basic information, starting or stopping it, and importing or exporting it.
Editing a Job
You can edit a created job, for example, by modifying the SQL statement, job name, job description, or job configurations.
- In the left navigation pane of the DLI management console, choose Job Management > Flink Jobs. The Flink Jobs page is displayed.
- Locate the row containing the job you want to edit and click Edit in the Operation column to switch to the editing page.
- Edit the job as required.
For details, see Creating a Flink OpenSource SQL Job and Creating a Flink Jar Job.
Starting a Job
You can start a saved or stopped job.
- In the left navigation pane of the DLI management console, choose Job Management > Flink Jobs. The Flink Jobs page is displayed.
- Use either of the following methods to start jobs:
- Starting a single job
Select a job and click Start in the Operation column.
Alternatively, you can select the row containing the job you want to start and click Start in the upper left of the job list.
- Batch starting jobs
Select the rows containing the jobs you want to start and click Start in the upper left of the job list.
After you click Start, the Start Flink Jobs page is displayed.
- On the Start Flink Jobs page, confirm the job information and price. If they are correct, click Start Now.
After a job is started, you can view the job execution result in the Status column.
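Starting jobs can also be scripted instead of clicked through. The sketch below only builds the URL and JSON body for a batch-start call; the host (`dli.example.com`), endpoint path, and field names (`job_ids`, `resume_savepoint`) are assumptions that mirror the console's single/batch start and savepoint-restore workflow, not the documented DLI API schema.

```python
# Sketch only: host, path, and field names are assumptions mirroring the
# console workflow above, not a verified DLI REST API.

def build_start_request(project_id, job_ids, resume_savepoint=False):
    """Build the URL and JSON body for starting one or more Flink jobs."""
    url = f"https://dli.example.com/v1.0/{project_id}/streaming/jobs/run"
    body = {
        "job_ids": list(job_ids),             # one ID for a single start, several for batch
        "resume_savepoint": resume_savepoint, # restore from a savepoint if one exists
    }
    return url, body

# Batch-start two jobs, restoring each from its latest savepoint.
url, body = build_start_request("my-project-id", [1201, 1202], resume_savepoint=True)
```

The same builder covers both the single-job and batch cases, matching the two console paths above.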
Stopping a Job
You can stop a job in the Running or Submitting state.
- In the left navigation pane of the DLI management console, choose Job Management > Flink Jobs. The Flink Jobs page is displayed.
- Stop a job using either of the following methods:
- Stopping a job
Locate the row that contains the job to be stopped, click More in the Operation column, and select Stop.
Alternatively, you can select the row containing the job you want to stop and click Stop in the upper left of the job list.
- Batch stopping jobs
Locate the rows containing the jobs you want to stop and click Stop in the upper left of the job list.
- In the displayed Stop Job dialog box, click OK to stop the job.
Figure 1 Stopping a job
- Before stopping a job, you can trigger a savepoint to save the job status information. When you start the job again, you can choose whether to restore the job from the savepoint.
- If you select Trigger savepoint, a savepoint is created. If Trigger savepoint is not selected, no savepoint is created. By default, the savepoint function is disabled.
- The lifecycle of a savepoint starts when the savepoint is triggered and stops the job, and ends when the job is restarted. The savepoint is automatically deleted after the job is restarted.
When a job is being stopped, its state is displayed in the Status column of the job list. The details are as follows:
- Stopping: indicates that the job is being stopped.
- Stopped: indicates that the job is stopped successfully.
- Stop failed: indicates that the job failed to be stopped.
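Because Stopping is a transient state that ends in either Stopped or Stop failed, a script that stops jobs usually polls the Status column until a terminal state is reached. In the sketch below, `get_status` is a placeholder for whatever call returns a job's Status column value; it is not a real DLI client method.

```python
import time

# The two terminal states listed above; "Stopping" is transient.
TERMINAL_STOP_STATES = {"Stopped", "Stop failed"}

def wait_until_stopped(get_status, job_id, poll_seconds=5, timeout=300):
    """Poll a job until it leaves the transient Stopping state.

    get_status: placeholder callable returning the job's Status column value.
    Returns "Stopped" or "Stop failed"; raises TimeoutError otherwise.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_status(job_id)
        if status in TERMINAL_STOP_STATES:
            return status
        time.sleep(poll_seconds)
    raise TimeoutError(f"job {job_id} was still stopping after {timeout}s")
```

A caller would check the returned value and retry the stop, or investigate, when it is "Stop failed".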
Deleting a Job
If you do not need to use a job, perform the following operations to delete it. A deleted job cannot be restored. Therefore, exercise caution when deleting a job.
- In the left navigation pane of the DLI management console, choose Job Management > Flink Jobs. The Flink Jobs page is displayed.
- Use either of the following methods to delete jobs:
- Deleting a single job
Locate the row containing the job you want to delete and click More > Delete in the Operation column.
Alternatively, you can select the row containing the job you want to delete and click Delete in the upper left of the job list.
- Deleting jobs in batches
Select the rows containing the jobs you want to delete and click Delete in the upper left of the job list.
- Click Yes.
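Since deletion is irreversible, a scripted version of this step benefits from an explicit confirmation guard, mirroring the Yes dialog above. As before, the host, path, and field names in this sketch are assumptions modeled on the console workflow, not the documented DLI API.

```python
# Sketch only: endpoint and field names are assumptions mirroring the
# console's delete workflow; deletion cannot be undone, hence the guard.

def build_delete_request(project_id, job_ids, confirmed=False):
    """Build a batch-delete request; require explicit confirmation first."""
    if not confirmed:
        raise ValueError("deleted jobs cannot be restored; pass confirmed=True")
    url = f"https://dli.example.com/v1.0/{project_id}/streaming/jobs/delete"
    return url, {"job_ids": list(job_ids)}
```

The `confirmed` flag plays the role of the confirmation dialog: nothing is built, let alone sent, until the caller acknowledges the deletion is permanent.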
Exporting a Job
You can export the created Flink jobs to an OBS bucket.
This mode applies when you need to re-create a large number of jobs after switching to another region, project, or user. Instead of creating each job again, you can export the original jobs, log in to the new region or project (or as the new user), and import them.
When switching to another project or user, you need to grant permissions to the new project or user. For details, see Managing Flink Job Permissions.
- In the left navigation pane of the DLI management console, choose Job Management > Flink Jobs. The Flink Jobs page is displayed.
- Click Export Job in the upper right corner. The Export Job dialog box is displayed.
Figure 2 Exporting a job
- Select the OBS bucket where the job is stored. Click Next.
- Select job information you want to export.
By default, configurations of all jobs are exported. You can enable the Custom Export function to export configurations of the desired jobs.
- Click Confirm to export the job.
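The export dialog above boils down to two choices: the target OBS bucket and whether to export everything or only selected jobs (Custom Export). The sketch below captures those choices as a request body; the host, path, and field names (`obs_dir`, `is_selected`, `job_selected`) are assumptions mirroring the dialog, not the documented DLI API schema.

```python
# Sketch only: field names mirror the console's export dialog and are
# assumptions, not the verified DLI export API.

def build_export_request(project_id, obs_dir, custom_export=False, job_ids=None):
    """Export all job configurations, or only selected ones with Custom Export."""
    url = f"https://dli.example.com/v1.0/{project_id}/streaming/jobs/export"
    body = {
        "obs_dir": obs_dir,           # OBS bucket/path where the export is stored
        "is_selected": custom_export, # False = export all jobs (the default)
    }
    if custom_export:
        body["job_selected"] = list(job_ids or [])  # export only these jobs
    return url, body
```

With `custom_export=False` the job list is omitted entirely, matching the console default of exporting all job configurations.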
Importing a Job
You can import the Flink job configuration file stored in the OBS bucket to the Flink Jobs page of DLI.
This mode applies when you need to re-create a large number of jobs after switching to another region, project, or user. Instead of creating each job again, you can export the original jobs, log in to the new region or project (or as the new user), and import them.
To import a self-created job, use the job creation function.
For details, see Creating a Flink OpenSource SQL Job and Creating a Flink Jar Job.
- When switching to another project or user, you need to grant permissions to the new project or user. For details, see Managing Flink Job Permissions.
- Only job files in the same format as Flink jobs exported from DLI can be imported.
- In the left navigation pane of the DLI management console, choose Job Management > Flink Jobs. The Flink Jobs page is displayed.
- Click Import Job in the upper right corner. The Import Job dialog box is displayed.
- Select the complete OBS path of the job configuration file to be imported. Click Next.
- Configure the policy for handling jobs with the same name and click Next.
- Select Overwrite job of the same name. If the name of the job to be imported already exists, the existing job configuration will be overwritten and the job status switches to Draft.
- If Overwrite job of the same name is not selected and the name of the job to be imported already exists, the job will not be imported.
- Ensure that Config File and Overwrite Same-Name Job are correctly configured. Click Confirm to import the job.
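The import steps above reduce to two inputs: the OBS path of the exported configuration file and the Overwrite Same-Name Job choice. The sketch below packages them as a request body; as with the other sketches, the host, path, and field names are assumptions mirroring the dialog, not the documented DLI API.

```python
# Sketch only: field names mirror the console's import dialog and are
# assumptions, not the verified DLI import API.

def build_import_request(project_id, config_file_path, overwrite_same_name=False):
    """config_file_path: full OBS path of a configuration file exported from DLI."""
    url = f"https://dli.example.com/v1.0/{project_id}/streaming/jobs/import"
    body = {
        "config_file": config_file_path,  # must be a file exported from DLI
        "overwrite": overwrite_same_name, # overwritten same-name jobs revert to Draft
    }
    return url, body
```

When `overwrite` is False, a job whose name already exists is simply skipped, matching the console behavior described above.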
Modifying Name and Description
You can change the job name and description as required.
- In the left navigation pane of the DLI management console, choose Job Management > Flink Jobs. The Flink Jobs page is displayed.
- In the Operation column of the job whose name and description need to be modified, choose More > Modify Name and Description. In the displayed Modify Name and Description dialog box, change the name or description of the job.
- Click OK.
Triggering a Savepoint
When you need to stop a job, you can create a savepoint to save the job status information. In this case, when you restart the job, you can choose to restore the job from the latest savepoint.
- You can click Trigger Savepoint for a job in the Running state to save its status.
- The lifecycle of a savepoint starts when the savepoint is triggered and stops the job, and ends when the job is restarted. The savepoint is automatically deleted after the job is restarted.
Importing to a Savepoint
You can import a savepoint to restore the job status. For details about the savepoint, see Checkpointing at the official website of Flink.
You need to select the OBS path of the savepoint.
Runtime Configuration
You can select Runtime Configuration to configure job exception alarms and restart options.
Flink SQL, Flink Jar, and Flink OpenSource SQL jobs are supported.
- In the Operation column of the Flink job, choose More > Runtime Configuration.
- In the Runtime Configuration dialog box, set the following parameters:
Figure 3 Runtime configuration
Table 1 Running parameters

- Name: Job name.
- Alarm Generation upon Job Exception: Whether to report job exceptions, for example, abnormal job running or exceptions due to an insufficient balance, to users via SMS or email.
  If this option is selected, set the following parameter:
  - SMN Topic: Select a custom SMN topic. For how to create a custom SMN topic, see Creating a Topic.
- Auto Restart upon Exception: Whether to enable automatic restart. If this function is enabled, any job that has become abnormal will be automatically restarted.
  If this option is selected, set the following parameters:
  - Max. Retry Attempts: maximum number of retries upon an exception. The unit is times/hour.
    - Unlimited: The number of retries is unlimited.
    - Limited: The number of retries is user-defined.
  - Restore Job from Checkpoint: Restore the job from the saved checkpoint.
    NOTE: This parameter cannot be configured for Flink SQL jobs or Flink OpenSource SQL jobs. If this parameter is selected, you need to set Checkpoint Path for Flink Jar jobs.
    - Checkpoint Path: Select the checkpoint saving path. It must be the same as the checkpoint path set in the application package. The checkpoint path for each job must be unique; otherwise, the checkpoint cannot be obtained.
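The parameters in Table 1 can be assembled into a single settings object before being applied. The sketch below does exactly that; the dictionary key names are illustrative stand-ins for the table's parameters, not the wire format DLI actually uses.

```python
# Sketch only: key names are illustrative mappings of Table 1's parameters,
# not the actual DLI runtime-configuration schema.

def build_runtime_config(name, smn_topic=None, auto_restart=False,
                         max_retry_attempts=None, checkpoint_path=None):
    """Assemble runtime settings: exception alarms and restart behavior."""
    cfg = {"name": name}
    if smn_topic is not None:
        # Alarm Generation upon Job Exception requires an SMN topic.
        cfg["alarm_on_exception"] = {"smn_topic": smn_topic}
    if auto_restart:
        cfg["auto_restart"] = {
            # None means Unlimited; an integer means Limited (times/hour).
            "max_retry_attempts": max_retry_attempts,
        }
        if checkpoint_path is not None:
            # Restore Job from Checkpoint: Flink Jar jobs need a unique path
            # matching the one set in the application package.
            cfg["auto_restart"]["restore_from_checkpoint"] = True
            cfg["auto_restart"]["checkpoint_path"] = checkpoint_path
    return cfg
```

Note how the structure enforces the table's dependencies: the SMN topic exists only when alarms are enabled, and checkpoint restoration only when auto restart is enabled.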