Importing a Job
Function
This API is used to import one or more job files from OBS to DLF.
Before calling this API, store the job files to be imported in an OBS bucket.
URI
- URI format

  POST /v1/{project_id}/jobs/import

- Parameter description

  Table 1 URI parameter

  | Parameter | Mandatory | Type | Description |
  |---|---|---|---|
  | project_id | Yes | String | Project ID. For details about how to obtain a project ID, see Project ID and Account ID. |
Request

| Parameter | Mandatory | Type | Description |
|---|---|---|---|
| path | Yes | String | If OBS is deployed, this is the OBS path where the job definition file is stored, for example, obs://myBucket/jobs.zip. For the format of the job definition file, see the response message of the exported job. |
| params | No | Map<String,String> | Public job parameters. |
| sameNamePolicy | No | String | Policy for handling duplicate job names. The options are SKIP (skip the job with the same name) and OVERWRITE (overwrite the job with the same name). Default value: SKIP |
| jobsParam | No | List<JobParam> | Job parameters. For details, see Table 3. |
| executeUser | No | String | User that executes the job. |
Response

| Parameter | Mandatory | Type | Description |
|---|---|---|---|
| taskId | Yes | String | Task ID, which is used when calling the API for querying system tasks to obtain the import status. |
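The returned taskId only confirms that the import task has been accepted; the actual result has to be polled through the API for querying system tasks. The sketch below illustrates one way to do that in Python. The endpoint host, the X-Auth-Token header, the task path /v1/{project_id}/system-tasks/{task_id}, and the status field and values are assumptions for illustration; verify them against the system task API reference.

```python
import time
import requests

# Assumed values; replace with your own endpoint, project ID, and IAM token.
ENDPOINT = "https://dlf.example.com"            # hypothetical DLF endpoint
PROJECT_ID = "b384b9e9ab9b4ee8994c8633aabc9505"
TOKEN = "<IAM token>"


def wait_for_import(task_id: str, interval: int = 5, timeout: int = 300) -> dict:
    """Poll the (assumed) system task API until the import task finishes."""
    url = f"{ENDPOINT}/v1/{PROJECT_ID}/system-tasks/{task_id}"  # assumed path
    headers = {"X-Auth-Token": TOKEN}
    deadline = time.time() + timeout
    while time.time() < deadline:
        resp = requests.get(url, headers=headers)
        resp.raise_for_status()
        task = resp.json()
        # The "status" field and its values are assumptions; check the system task API doc.
        if task.get("status") in ("SUCCESS", "FAIL"):
            return task
        time.sleep(interval)
    raise TimeoutError(f"Import task {task_id} did not finish within {timeout} s")
```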
Example
Import jobs from OBS to DLF. If there are jobs and scripts with the same name, overwrite them.
- Request
  POST /v1/b384b9e9ab9b4ee8994c8633aabc9505/jobs/import
  {
      "path": "obs://aaaaa/job_batch.zip",
      "sameNamePolicy": "OVERWRITE",
      "jobsParam": [
          {
              "name": "job_batch",
              "params": {
                  "streamName": "dis-AHTr"
              }
          }
      ]
  }
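For reference, the same request can be issued programmatically. The following is a minimal sketch using Python's requests library; the endpoint host and the X-Auth-Token authentication are assumptions and must be adapted to your environment.

```python
import requests

# Assumed values; replace with your own endpoint, project ID, and IAM token.
ENDPOINT = "https://dlf.example.com"            # hypothetical DLF endpoint
PROJECT_ID = "b384b9e9ab9b4ee8994c8633aabc9505"
TOKEN = "<IAM token>"

# Request body from the example above: overwrite jobs with the same name
# and override the streamName parameter of the job_batch job.
body = {
    "path": "obs://aaaaa/job_batch.zip",
    "sameNamePolicy": "OVERWRITE",
    "jobsParam": [
        {"name": "job_batch", "params": {"streamName": "dis-AHTr"}}
    ],
}

resp = requests.post(
    f"{ENDPOINT}/v1/{PROJECT_ID}/jobs/import",
    json=body,
    headers={"X-Auth-Token": TOKEN, "Content-Type": "application/json"},
)
resp.raise_for_status()
print("Import task ID:", resp.json()["taskId"])
```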