Creating a Data Import or Export Task
Function
This API is used to create a data import or export task.
Constraints
This API is supported only by SFS Turbo 1,000 MB/s/TiB, 500 MB/s/TiB, 250 MB/s/TiB, 125 MB/s/TiB, 40 MB/s/TiB, and 20 MB/s/TiB file systems.
URI
POST /v1/{project_id}/sfs-turbo/{share_id}/hpc-cache/task
Path parameters:

| Parameter | Mandatory | Type | Description |
|---|---|---|---|
| project_id | Yes | String | The project ID. |
| share_id | Yes | String | The file system ID. |
Request Parameters
Request header parameters:

| Parameter | Mandatory | Type | Description |
|---|---|---|---|
| X-Auth-Token | Yes | String | The account token. |
| Content-Type | Yes | String | The MIME type. |
Request body parameters:

| Parameter | Mandatory | Type | Description |
|---|---|---|---|
| type | Yes | String | The task type. Enumeration values: import, import_metadata, preload, and export. import (additional metadata import): imports the object metadata (name, size, and last modification time) together with the additional metadata (uid, gid, and mode) previously exported from SFS Turbo. import_metadata (quick import): imports only the object metadata (name, size, and last modification time); after the import, SFS Turbo generates the additional metadata by default. preload (data preload): imports both the metadata and the data; only the object metadata is imported, and additional metadata such as uid, gid, and mode is not. export: exports to the OBS bucket the files created in the interworking directory as well as the data previously imported from OBS and then modified in SFS Turbo. An illustrative export request is sketched after this table. |
| src_target | Yes | String | The interworking directory name. |
| src_prefix | No | String | The source path prefix of the import or export task. Do not include the OBS bucket name for an import task or the interworking directory name for an export task. For a data preload task, the prefix must be either a directory path ending with a slash (/) or the path of a specific object. If this parameter is not specified, an import task imports all objects in the OBS bucket, and an export task exports all files in the interworking directory to the bucket. |
| dest_target | Yes | String | Set it to the same value as src_target. |
| dest_prefix | No | String | Set it to the same value as src_prefix. |
| attributes | No | ObsTargetAttributes object | The attributes of the storage backend, described in the ObsTargetAttributes table below. This parameter is not supported for file systems that were created on or before June 30, 2024 and have not been upgraded. Submit a service ticket if you need it. |
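The request examples later in this document cover only import_metadata and preload tasks. The following is an illustrative sketch of an export task request body, reusing the sfs-link-directory interworking directory name from those examples; with src_prefix and dest_prefix omitted, all files in the interworking directory are exported to the bucket.

POST HTTPS://{endpoint}/v1/{project_id}/sfs-turbo/{share_id}/hpc-cache/task

{
  "type" : "export",
  "src_target" : "sfs-link-directory",
  "dest_target" : "sfs-link-directory"
}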
ObsTargetAttributes:

| Parameter | Mandatory | Type | Description |
|---|---|---|---|
| file_mode | No | Integer | The permissions on the imported files. Value range: 0 to 777. The three digits specify, in order, the permissions of the file owner (identified by UID), the user group to which the file belongs (identified by GID), and other users (users who are neither the owner nor members of that group). Each digit ranges from 0 to 7 and is the sum of the granted permissions, where 4, 2, and 1 represent read, write, and execute. For example, in 750 the first digit 7 gives the file owner read, write, and execute permissions, the second digit 5 gives the owning group read and execute permissions, and the third digit 0 gives other users no permissions. |
| dir_mode | No | Integer | The permissions on the imported directories. Value range: 0 to 777. The three digits specify, in order, the permissions of the directory owner (identified by UID), the user group to which the directory belongs (identified by GID), and other users (users who are neither the owner nor members of that group). Each digit ranges from 0 to 7 and is the sum of the granted permissions, where 4, 2, and 1 represent read, write, and execute. For example, in 750 the first digit 7 gives the directory owner read, write, and execute permissions, the second digit 5 gives the owning group read and execute permissions, and the third digit 0 gives other users no permissions. A short sketch after this table shows how these digits decode. |
| uid | No | Integer | The ID of the user who owns the imported object. Default value: 0. Value range: 0 to 4294967294 (2^32-2). |
| gid | No | Integer | The ID of the user group to which the imported object belongs. Default value: 0. Value range: 0 to 4294967294 (2^32-2). |
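The file_mode and dir_mode digits follow the familiar Unix rwx convention. As a quick illustration (not part of the API), this minimal Python sketch expands a mode value such as 750 or 640 into the equivalent owner/group/other permission strings:

```python
def describe_mode(mode: int) -> str:
    """Expand a three-digit mode value (for example 750) into rwx strings."""
    digits = f"{mode:03d}"  # 750 -> "7", "5", "0"

    def rwx(digit: str) -> str:
        # 4 = read, 2 = write, 1 = execute; a digit is the sum of granted bits.
        return "".join(flag if int(digit) & bit else "-"
                       for flag, bit in zip("rwx", (4, 2, 1)))

    owner, group, other = (rwx(d) for d in digits)
    return f"owner={owner} group={group} other={other}"

print(describe_mode(750))  # owner=rwx group=r-x other=---
print(describe_mode(640))  # owner=rw- group=r-- other=---
```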
Response Parameters
Status code: 202
Response header parameters:

| Parameter | Type | Description |
|---|---|---|
| X-request-id | String | The request ID. |

Response body parameters:

| Parameter | Type | Description |
|---|---|---|
| task_id | String | The task ID. |
Status code: 400
Response header parameters:

| Parameter | Type | Description |
|---|---|---|
| X-request-id | String | The request ID. |

Response body parameters:

| Parameter | Type | Description |
|---|---|---|
| errCode | String | The error code. |
| errMsg | String | The error message. |
Status code: 500
Response header parameters:

| Parameter | Type | Description |
|---|---|---|
| X-request-id | String | The request ID. |

Response body parameters:

| Parameter | Type | Description |
|---|---|---|
| errCode | String | The error code. |
| errMsg | String | The error message. |
Example Requests
- Creating a data import task (with the task type set to import_metadata, the interworking directory name set to sfs-link-directory, and the prefix of the source path in the OBS bucket set to input/datasets/). A Python sketch that submits this request follows the examples.

POST HTTPS://{endpoint}/v1/{project_id}/sfs-turbo/{share_id}/hpc-cache/task

{
  "type" : "import_metadata",
  "src_target" : "sfs-link-directory",
  "src_prefix" : "input/datasets/",
  "dest_target" : "sfs-link-directory",
  "dest_prefix" : "input/datasets/"
}

- Creating a data import task (with the task type set to import_metadata, the interworking directory name set to sfs-link-directory, the prefix of the source path in the OBS bucket set to input/datasets/, the permissions of the imported files set to 640, and the permissions of the imported directories set to 750)

POST HTTPS://{endpoint}/v1/{project_id}/sfs-turbo/{share_id}/hpc-cache/task

{
  "type" : "import_metadata",
  "src_target" : "sfs-link-directory",
  "src_prefix" : "input/datasets/",
  "dest_target" : "sfs-link-directory",
  "dest_prefix" : "input/datasets/",
  "attributes" : {
    "file_mode" : 640,
    "dir_mode" : 750
  }
}

- Creating a data preload task (with the task type set to preload, the interworking directory name set to sfs-link-directory, the prefix of the source path in the OBS bucket set to input/datasets/, the permissions of the imported files set to 640, the permissions of the imported directories set to 750, and the UID and GID both set to 0)

POST HTTPS://{endpoint}/v1/{project_id}/sfs-turbo/{share_id}/hpc-cache/task

{
  "type" : "preload",
  "src_target" : "sfs-link-directory",
  "src_prefix" : "input/datasets/",
  "dest_target" : "sfs-link-directory",
  "dest_prefix" : "input/datasets/",
  "attributes" : {
    "file_mode" : 640,
    "dir_mode" : 750,
    "uid" : 0,
    "gid" : 0
  }
}
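For reference, the following minimal Python sketch (assuming the third-party requests library) submits the first example request and reads task_id from the 202 response; the endpoint, project ID, file system ID, and token values are placeholders that you must replace with your own.

```python
import requests

# Placeholders (assumptions): replace with your region endpoint, project ID,
# SFS Turbo file system ID, and IAM token.
endpoint = "sfs-turbo.example-region.example.com"
project_id = "your-project-id"
share_id = "your-share-id"
token = "your-iam-token"

url = f"https://{endpoint}/v1/{project_id}/sfs-turbo/{share_id}/hpc-cache/task"
body = {
    "type": "import_metadata",
    "src_target": "sfs-link-directory",
    "src_prefix": "input/datasets/",
    "dest_target": "sfs-link-directory",
    "dest_prefix": "input/datasets/",
}

resp = requests.post(
    url,
    json=body,
    headers={"X-Auth-Token": token, "Content-Type": "application/json"},
    timeout=30,
)

if resp.status_code == 202:
    # A 202 response returns the ID of the asynchronous task.
    print("task_id:", resp.json()["task_id"])
else:
    # 400 and 500 responses carry errCode and errMsg.
    print(resp.status_code, resp.json())
```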
Example Responses
Status code: 202
Accepted
{
"task_id" : "7bd2a9b6-xxxx-4605-xxxx-512d636001b0"
}
Status code: 400
Client error
{
"errCode" : "SFS.TURBO.0001",
"errMsg" : "parameter error"
}
Status code: 500
Internal error
{
"errCode" : "SFS.TURBO.0005",
"errMsg" : "Internal server error"
}
Status Codes
| Status Code | Description |
|---|---|
| 202 | Accepted |
| 400 | Client error |
| 500 | Internal error |
Error Codes
See Error Codes.