Updated on 2024-11-13 GMT+08:00

Creating an Import or Export Task

Function

This API is used to create an import or export task.

Constraints

This API is only supported for SFS Turbo 1,000 MB/s/TiB, 500 MB/s/TiB, 250 MB/s/TiB, 125 MB/s/TiB, 40 MB/s/TiB, and 20 MB/s/TiB file systems.

URI

POST /v1/{project_id}/sfs-turbo/{share_id}/hpc-cache/task

Table 1 Path Parameters

| Parameter  | Mandatory | Type   | Description    |
|------------|-----------|--------|----------------|
| project_id | Yes       | String | Project ID     |
| share_id   | Yes       | String | File system ID |

Request Parameters

Table 2 Request header parameters

| Parameter    | Mandatory | Type   | Description   |
|--------------|-----------|--------|---------------|
| X-Auth-Token | Yes       | String | Account token |
| Content-Type | Yes       | String | MIME type     |

Table 3 Request body parameters

| Parameter   | Mandatory | Type   | Description |
|-------------|-----------|--------|-------------|
| type        | Yes       | String | Task type, which can be import (additional metadata import), import_metadata (quick import), preload (data preload), or export (export). See the value descriptions below this table. |
| src_target  | Yes       | String | Name of the interworking directory |
| src_prefix  | No        | String | Prefix of the source path of an import or export task. Do not include the OBS bucket name during import or the interworking directory name during export. For preload tasks, the prefix must be a directory, or an object whose name ends with a slash (/). If this field is not specified, all objects in the bound OBS bucket are imported, or all files in the interworking directory are exported. |
| dest_target | Yes       | String | Must be the same as src_target. |
| dest_prefix | No        | String | Must be the same as src_prefix. |
| attributes  | No        | ObsTargetAttributes object | Properties of the storage backend. See Table 4. |

Values of type:

  • import: Imports the object metadata (name, size, and last modified time) as well as the additional metadata (such as uid, gid, and mode) previously exported from SFS Turbo.

  • import_metadata: Imports only the object metadata (name, size, and last modified time). After the import, SFS Turbo generates the additional metadata by default.

  • preload: Imports both metadata and data. The metadata includes only the object metadata; additional metadata such as uid, gid, and mode is not imported.

  • export: Exports to the OBS bucket the files created in the interworking directory as well as the data previously imported from OBS and then modified in SFS Turbo.
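The request body can be assembled programmatically. A minimal sketch in Python (the helper name and its defaults are illustrative, not part of the API; it only mirrors the constraints stated for this request body, such as dest_* mirroring src_*):

```python
import json

# Valid task types, per the type parameter description.
VALID_TYPES = {"import", "import_metadata", "preload", "export"}

def build_task_body(task_type, src_target, src_prefix=None, attributes=None):
    """Build the JSON body for POST .../hpc-cache/task.

    dest_target and dest_prefix must be the same as src_target and
    src_prefix, so they are filled in automatically here.
    """
    if task_type not in VALID_TYPES:
        raise ValueError(f"unknown task type: {task_type}")
    # For preload tasks, the source prefix must be a directory
    # (or an object whose name ends with a slash).
    if task_type == "preload" and src_prefix and not src_prefix.endswith("/"):
        raise ValueError("preload src_prefix must end with '/'")
    body = {"type": task_type, "src_target": src_target,
            "dest_target": src_target}
    if src_prefix is not None:
        body["src_prefix"] = src_prefix
        body["dest_prefix"] = src_prefix
    if attributes is not None:
        body["attributes"] = attributes
    return body

body = build_task_body("import_metadata", "sfs-link-directory", "input/datasets/")
print(json.dumps(body))
```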

Table 4 ObsTargetAttributes

| Parameter | Mandatory | Type    | Description |
|-----------|-----------|---------|-------------|
| file_mode | No        | Integer | Permissions on the imported files, expressed as three digits (for example, 755). Value range: 0 to 777. See the notes below this table. |
| dir_mode  | No        | Integer | Permissions on the imported directories, expressed as three digits. Value range: 0 to 777. See the notes below this table. |
| uid       | No        | Integer | User ID (uid) assigned to the imported objects. The default value is 0. The value ranges from 0 to 4,294,967,294 (2^32 - 2). |
| gid       | No        | Integer | Group ID (gid) assigned to the imported objects. The default value is 0. The value ranges from 0 to 4,294,967,294 (2^32 - 2). |

Notes on file_mode and dir_mode:

  • The three digits indicate, from left to right, the permissions of the owner (specified by uid), the owning group (specified by gid), and other users. Each digit ranges from 0 to 7. Users who are neither the owner nor in the owning group are other users.

  • Each digit is the sum of the values 4 (read), 2 (write), and 1 (execute). For example, in 750, the first digit 7 means the owner has read, write, and execute permissions; the second digit 5 means the owning group has read and execute permissions; and the third digit 0 means other users have no permissions.
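The digit arithmetic behind file_mode and dir_mode can be checked with a few lines of Python (the helper is illustrative, not part of the API):

```python
def describe_mode(mode):
    """Expand a three-digit mode such as 750 into rwx strings.

    Each digit is the sum of 4 (read), 2 (write), and 1 (execute),
    applied to the owner, owning group, and other users in turn.
    """
    digits = [int(d) for d in f"{mode:03d}"]
    if any(d > 7 for d in digits):
        raise ValueError("each digit must be between 0 and 7")
    def rwx(d):
        return ("r" if d & 4 else "-") + ("w" if d & 2 else "-") + ("x" if d & 1 else "-")
    return [rwx(d) for d in digits]  # [owner, group, others]

print(describe_mode(750))  # ['rwx', 'r-x', '---']
```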

Response Parameters

Status code: 202

Table 5 Response header parameters

| Parameter    | Type   | Description |
|--------------|--------|-------------|
| X-request-id | String | Request ID  |

Table 6 Response body parameters

| Parameter | Type   | Description |
|-----------|--------|-------------|
| task_id   | String | Task ID     |

Status code: 400

Table 7 Response header parameters

| Parameter    | Type   | Description |
|--------------|--------|-------------|
| X-request-id | String | Request ID  |

Table 8 Response body parameters

| Parameter | Type   | Description       |
|-----------|--------|-------------------|
| errCode   | String | Error code        |
| errMsg    | String | Error description |

Status code: 500

Table 9 Response header parameters

| Parameter    | Type   | Description |
|--------------|--------|-------------|
| X-request-id | String | Request ID  |

Table 10 Response body parameters

| Parameter | Type   | Description       |
|-----------|--------|-------------------|
| errCode   | String | Error code        |
| errMsg    | String | Error description |

Example Requests

  • Creating an import task and choosing to import the metadata (with the interworking directory name set to sfs-link-directory and the prefix of the source path in the OBS bucket set to input/datasets/)

    POST https://{endpoint}/v1/{project_id}/sfs-turbo/{share_id}/hpc-cache/task
    
    {
      "type" : "import_metadata",
      "src_target" : "sfs-link-directory",
      "src_prefix" : "input/datasets/",
      "dest_target" : "sfs-link-directory",
      "dest_prefix" : "input/datasets/"
    }
  • Creating an import task and choosing to import the metadata (with the interworking directory name set to sfs-link-directory, the prefix of the source path in the OBS bucket set to input/datasets/, the permissions of the imported files set to 755, and the permissions of the imported directories set to 755)

    POST https://{endpoint}/v1/{project_id}/sfs-turbo/{share_id}/hpc-cache/task
    
    {
      "type" : "import_metadata",
      "src_target" : "sfs-link-directory",
      "src_prefix" : "input/datasets/",
      "dest_target" : "sfs-link-directory",
      "dest_prefix" : "input/datasets/",
      "attributes" : {
        "file_mode" : 755,
        "dir_mode" : 755
      }
    }
  • Creating an import task and choosing to preload data (with the interworking directory name set to sfs-link-directory, the prefix of the source path in the OBS bucket set to input/datasets/, the permissions of the imported files set to 755, the permissions of the imported directories set to 755, and the UID and GID both set to 0)

    POST https://{endpoint}/v1/{project_id}/sfs-turbo/{share_id}/hpc-cache/task
    
    {
      "type" : "preload",
      "src_target" : "sfs-link-directory",
      "src_prefix" : "input/datasets/",
      "dest_target" : "sfs-link-directory",
      "dest_prefix" : "input/datasets/",
      "attributes" : {
        "file_mode" : 755,
        "dir_mode" : 755,
        "uid" : 0,
        "gid" : 0
      }
    }
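Putting the pieces together, the call itself is an authenticated POST. A minimal sketch of composing it in Python (the endpoint, token, IDs, and the application/json content type are placeholder assumptions; nothing is sent over the network here):

```python
import json

def compose_request(endpoint, project_id, share_id, token, body):
    """Return the (method, url, headers, payload) tuple for the task API."""
    url = f"https://{endpoint}/v1/{project_id}/sfs-turbo/{share_id}/hpc-cache/task"
    headers = {
        "X-Auth-Token": token,               # account token (Table 2)
        "Content-Type": "application/json",  # assumed MIME type
    }
    return "POST", url, headers, json.dumps(body)

method, url, headers, payload = compose_request(
    "sfs-turbo.example.com", "my-project", "my-share", "my-token",
    {"type": "import_metadata", "src_target": "sfs-link-directory",
     "dest_target": "sfs-link-directory"})
```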

Example Responses

Status code: 202

Accepted

{
  "task_id" : "7bd2a9b6-xxxx-4605-xxxx-512d636001b0"
}

Status code: 400

Client error

{
  "errCode" : "SFS.TURBO.0001",
  "errMsg" : "parameter error"
}

Status code: 500

Internal error

{
  "errCode" : "SFS.TURBO.0005",
  "errMsg" : "Internal server error"
}
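On the client side, a 202 response carries task_id while 400 and 500 share the errCode/errMsg shape, so a small dispatcher covers all three. A sketch (the helper is hypothetical, not part of any SDK):

```python
import json

def handle_response(status, body_text):
    """Return the task ID on 202; raise with errCode/errMsg otherwise."""
    body = json.loads(body_text)
    if status == 202:
        return body["task_id"]
    # 400 (client error) and 500 (internal error) share the same body shape.
    raise RuntimeError(f"{body.get('errCode')}: {body.get('errMsg')}")

print(handle_response(202, '{"task_id": "7bd2a9b6-xxxx-4605-xxxx-512d636001b0"}'))
```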

Status Codes

| Status Code | Description    |
|-------------|----------------|
| 202         | Accepted       |
| 400         | Client error   |
| 500         | Internal error |

Error Codes

See Error Codes.