Orchestrating Pipeline Jobs
A job is the smallest manageable execution unit in a pipeline. Within a stage, jobs can be orchestrated in serial or parallel mode.
Orchestrating Pipeline Jobs
- Access the CodeArts Pipeline homepage.
- On the pipeline list page, search for the target pipeline and choose Edit in the Operation column.
- On the Task Orchestration page, click Job under a stage.
- Click the add icon under a job to add a serial job. For example, a build job and a deployment job must be executed sequentially.
- Click Parallel Job to add a parallel job. For example, a code check job and a build job can be executed at the same time.
- Configure extensions for the job by referring to the following table.
Table 1 Job configuration
Adding an extension
There are five types of extensions: build, code check, deployment, test, and normal extensions. You can filter or search for extensions by type. For more information, see Managing Pipeline Extensions.
Move the cursor to an extension card and click Add. Configure the following information:
- Enter an extension name.
- Select a job to be called. If no proper job is available, create a job as prompted.
- If the called job has parameters, the parameters will be displayed. Configure parameters as needed.
- Only one extension with the Job flag can be added to a single job. Extensions with the Draft flag are draft extensions.
- The extension for suspending a pipeline can only be added to stages that do not contain parallel jobs.
Deleting an extension
Move the cursor to an extension card, open the options menu, and select Delete to delete the extension.
Replacing an extension
Move the cursor to an extension card, open the options menu, and select Replace to replace the extension. Alternatively, click Replace Extension above the extension name to choose another extension.
Sorting extensions
Click and hold an extension card, then drag it to adjust the extension sequence.
Configuring jobs
Set the job ID, executor, and execution condition.
- Job ID: The job ID must be unique. It can contain only letters, digits, hyphens (-), and underscores (_), with a maximum of 128 characters.
- You can use the built-in executor or customize one.
- Built-in executor: provided by CodeArts Pipeline with out-of-the-box availability.
- Custom executor: allows you to configure tools and running environments as needed. Before using a custom executor, add an agent pool. For details, see Agent Pools.
NOTE: You only need to configure executors for non-job-level extensions.
- Select Job: determines whether the job is selected for execution when the pipeline runs.
- Always: Job will always be selected for execution and cannot be canceled.
- Disabled: Job cannot be selected for execution.
- Selected by default: Job is selected for execution by default.
- Not selected by default: Job is not selected for execution by default.
- Execution conditions determine when a job in a pipeline is executed.
- Even when previous job is not selected: The current job is executed if the previous job is completed or not selected.
- When previous job succeeds: The current job is executed only when the previous job is successfully executed.
- If previous job fails: The current job is executed only when the previous job fails.
- Always: The current job is always executed regardless of the previous job's final state (failed, completed, canceled, or ignored).
- With expression: The current job is executed when the previous job reaches a final state (COMPLETED, FAILED, CANCELED, or IGNORED) and the expression result is true. The expression is in the format ${{value}} and can be any combination of contexts, operators, functions, and literals. For details about expressions, see Expressions.
Example:
To execute the current job regardless of whether the previous job (ID: job_1) succeeded or failed, the expression can be as follows:
${{ jobs.job_1.status == 'COMPLETED' || jobs.job_1.status == 'FAILED' }}
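Similarly, to execute the current job only when the previous job (ID: job_1, from the example above) was canceled or ignored, an expression along the following lines could be used:
${{ jobs.job_1.status == 'CANCELED' || jobs.job_1.status == 'IGNORED' }}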
- After configuring the job, click OK. After the job is added, you can edit, clone, delete, or move the job.
Table 2 Job management
Editing a job
Click a job card to edit the job.
Cloning a job
Click the clone icon on the job card to clone a serial job.
Deleting a job
Click the delete icon on the job card and confirm the deletion as prompted.
Sorting jobs
Click and hold a job card, then drag it to adjust the job sequence.
NOTE: The job sequence cannot be adjusted when jobs are executed in parallel.
- After the configuration, save the pipeline.
Expressions
An expression can be any combination of contexts, operators, functions, or literals. You can use an expression as the execution condition to control job execution. Contexts can be accessed programmatically with expressions, so information such as pipeline runs, sources, variables, and jobs can be transferred within a pipeline.
- Pipeline contexts
Contexts are a way to access information about pipeline runs, sources, variables, and jobs. Each context is an object that contains various attributes. The following table lists pipeline contexts.
Table 3 Pipeline contexts
- pipeline (object): Information about the pipeline run.
- sources (object): Information about the pipeline sources in each pipeline run.
- env (object): Information about the custom parameters in each pipeline run.
- jobs (object): Information about jobs that have reached their final states in each pipeline run.
- Context reference format
${{ <context>.<attribute_name> }}
where context indicates the pipeline context and attribute_name indicates the name of the attribute to access.
- Context scenarios
Most contexts can be used in any job and step of a pipeline.
- You can use contexts to specify the execution condition of a job.
- You can use contexts when configuring parameters to query information.
Figure 1 Referencing pipeline contexts
The following expression obtains all pipeline run information:
${{ pipeline }}
The following expression obtains the triggering mode of a pipeline:
${{ pipeline.trigger_type }}
- Context attributes
Table 4 Context attributes
pipeline context
- pipeline (object): Information about the pipeline run. This object contains the following attributes: project_id, pipeline_id, run_number, timestamp, trigger_type, and run_id.
- Content example
The following example shows the pipeline context information contained in a manually executed pipeline.
{ "project_id": "6428c2e2b4b64affa14ec80896695c49", "pipeline_id": "f9981060660249a3856f46c2c402f244", "run_number": "168", "timestamp": "20231016000004", "trigger_type": "Manual", "run_id": "c2f507f93510459190b543e47f6c9bec" }
- Usage example
To obtain the triggering mode of the current pipeline, you can use the following syntax:
${{ pipeline.trigger_type }}
- pipeline.project_id (string): ID of the project to which the current pipeline belongs. Same as the predefined parameter PROJECT_ID.
- pipeline.pipeline_id (string): Current pipeline ID. Same as the predefined parameter PIPELINE_ID.
- pipeline.run_number (string): Pipeline execution number. Same as the predefined parameter PIPELINE_NUMBER.
- pipeline.timestamp (string): Pipeline execution timestamp, in the format yyyyMMddHHmmss, for example, 20211222124301. Same as the predefined parameter TIMESTAMP.
- pipeline.trigger_type (string): Pipeline triggering type. Same as the predefined parameter PIPELINE_TRIGGER_TYPE.
- pipeline.run_id (string): Pipeline execution ID. Same as the predefined parameter PIPELINE_RUN_ID.
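Contexts can also be referenced when configuring job parameters (see Context scenarios above). For example, assuming a job accepts a string parameter used for traceability (a hypothetical parameter, not defined in this document), its value could be set to the current run ID:
${{ pipeline.run_id }}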
sources context
- sources (object): Information about the pipeline sources in each pipeline run. This object contains one entry for each pipeline source, keyed by the source alias (or by the repository name if no alias is set). Each entry contains the following attributes: commit_id, commit_id_short, commit_message, repo_url, repo_type, repo_name, ssh_repo_url, tag, merge_id, source_branch, and target_branch.
- Content example
The following example shows the sources context information contained in a manually executed pipeline with a single code source. The alias of the pipeline source is my_repo.
{ "my_repo": { "commit_id": "dedb73bb9abfdaab7d810f2616bae9d2b6632ecc", "commit_id_short": "dedb73bb", "commit_message": "maven0529 update pipeline0615.yml", "repo_url": "https://example.com/clsyz00001/maven0529.git", "repo_type": "codehub", "repo_name": "maven0529", "ssh_repo_url": "git@example.com:clsyz00001/maven0529.git", "target_branch": "master" } }
- Usage example
To obtain the running branch of the pipeline, you can use the following syntax:
${{ sources.my_repo.target_branch }}
- sources.<alias> (object): Information about the pipeline source with the specified alias.
- sources.<repo_name> (object): Information about a pipeline source that has no alias, keyed by the repository name. It contains the same attributes as sources.<alias>.
- sources.<alias>.commit_id (string): The last commit ID before execution. Same as the predefined parameter COMMIT_ID.
- sources.<alias>.commit_id_short (string): The first 8 characters of the last commit ID before execution. Same as the predefined parameter COMMIT_ID_SHORT.
- sources.<alias>.commit_message (string): The commit message of the last code commit before the pipeline execution.
- sources.<alias>.repo_url (string): Code repository address (HTTPS). Same as the predefined parameter REPO_URL.
- sources.<alias>.repo_type (string): Type of the code repository, for example, codehub, gitlab, github, gitee, or general_git.
- sources.<alias>.repo_name (string): Name of the code repository.
- sources.<alias>.ssh_repo_url (string): Code repository address (SSH).
- sources.<alias>.tag (string): Tag name when the pipeline is triggered by a tag.
- sources.<alias>.merge_id (string): Merge request ID when the pipeline is triggered by a merge request.
- sources.<alias>.source_branch (string): Source branch name when the pipeline is triggered by a merge request.
- sources.<alias>.target_branch (string): Target branch name when the pipeline is triggered by a merge request; otherwise, the name of the running branch.
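For example, reusing the my_repo alias from the content example above, an execution condition that runs a job only when the running branch is master could be written as:
${{ sources.my_repo.target_branch == 'master' }}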
env context
- name (string): Name of a custom parameter.
- value (string): Value of a custom parameter.
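For example, assuming a custom parameter named my_param is defined for the pipeline (a hypothetical parameter name), its value could be referenced as:
${{ env.my_param }}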
jobs context
- jobs (object): Information about jobs in a pipeline. This object contains the following attributes: job_id, status, outputs, output_name, metrics, and metric_name.
- Content example
The following example shows the jobs context information in a run with two successfully executed jobs: the check_job job outputs two metrics, and the demo_job job outputs two general outputs.
{ "check_job": { "status": "COMPLETED", "metrics": { "critical": "0", "major": "0" } }, "demo_job": { "status": "COMPLETED", "outputs": { "output1": "val1", "output2": "val2" } } }
- Usage example
To obtain the value of output1 of demo_job, you can use the following syntax:
${{ jobs.demo_job.outputs.output1 }}
- jobs.<job_id> (object): Information about the job with the specified ID.
- jobs.<job_id>.status (string): Job execution result. The value can be INIT, QUEUED, RUNNING, CANCELED, COMPLETED, FAILED, PAUSED, IGNORED, SUSPEND, or UNSELECTED.
- jobs.<job_id>.outputs (object): Outputs of the job, as key-value pairs.
- jobs.<job_id>.outputs.<output_name> (string): Value of the output with the specified name.
- jobs.<job_id>.metrics (object): Running metrics of the job, for example, the number of code check issues or the test pass rate.
- jobs.<job_id>.metrics.<metric_name> (string): Value of the metric with the specified name.
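For example, combining the status and metrics attributes with the content example above, an execution condition that runs a job only when check_job completed with zero critical issues could be written as:
${{ jobs.check_job.status == 'COMPLETED' && jobs.check_job.metrics.critical == '0' }}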
- Operators
The following table lists the operators that can be used in expressions.
Table 5 Expression operators
- .: Attribute reference. For example, the ${{ pipeline.trigger_type }} expression can be used to obtain the trigger type.
- !: Not (negation). For example, the ${{ !startsWith(sources.my_repo.target_branch, 'release') }} expression can be used to check whether the branch of the pipeline's code source does not start with "release".
- ==: Equal. For example, the ${{ pipeline.trigger_type == 'Manual' }} expression can be used to check whether a pipeline is triggered manually.
- !=: Not equal. For example, the ${{ pipeline.trigger_type != 'Manual' }} expression can be used to check whether a pipeline is not triggered manually.
- &&: And. For example, the ${{ pipeline.trigger_type == 'Manual' && sources.my_repo.target_branch == 'master' }} expression can be used to check whether a pipeline is triggered manually and the branch of the pipeline code source is master.
- ||: Or. For example, the ${{ pipeline.trigger_type == 'Manual' || sources.my_repo.target_branch == 'master' }} expression can be used to check whether a pipeline is triggered manually or the branch of the pipeline code source is master.
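Operators can also be combined with functions in one expression. For example, the following condition (reusing the my_repo alias from the examples above) is true only for manually triggered runs whose code source branch does not start with "release":
${{ pipeline.trigger_type == 'Manual' && !startsWith(sources.my_repo.target_branch, 'release') }}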
- Functions
The following table lists the functions that can be used in expressions.
Table 6 Expression functions
- contains: Checks whether a string or collection contains a given value. For example, contains(jobs.*.status, 'FAILED') returns true if the status of any job is FAILED.
- startsWith: Checks whether a string starts with a given prefix. For example, startsWith(sources.my_repo.target_branch, 'release') returns true if the branch name starts with "release".
- endsWith: Checks whether a string ends with a given suffix.
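For example, endsWith could be used to check whether the running branch name ends with a given suffix, such as the hypothetical "-dev":
${{ endsWith(sources.my_repo.target_branch, '-dev') }}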
Object filter
You can use the * syntax to apply a filter and select matching items in a collection.
The following is the jobs context of a pipeline run.
{ "check_job": { "status": "COMPLETED", "metrics": { "critical": "0", "major": "0" } }, "demo_job": { "status": "FAILED" } }
- jobs.*.status indicates the status of all jobs. Therefore, ['COMPLETED', 'FAILED'] is returned.
- Filters can be used together with the contains function. For example, contains(jobs.*.status, 'FAILED') will return true because jobs.*.status contains FAILED.
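A filter can also be used in a job execution condition. For example, the following expression is true only when no job in the run has failed, which could gate a hypothetical final notification job:
${{ !contains(jobs.*.status, 'FAILED') }}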