
Creating an Import or Export Task

Updated on 2024-11-13 GMT+08:00

Function

This API is used to create an import or export task.

Constraints

This API is supported only for SFS Turbo file systems of the following types: 1,000 MB/s/TiB, 500 MB/s/TiB, 250 MB/s/TiB, 125 MB/s/TiB, 40 MB/s/TiB, and 20 MB/s/TiB.

URI

POST /v1/{project_id}/sfs-turbo/{share_id}/hpc-cache/task

Table 1 Path Parameters

| Parameter | Mandatory | Type | Description |
| --- | --- | --- | --- |
| project_id | Yes | String | Project ID |
| share_id | Yes | String | File system ID |

Request Parameters

Table 2 Request header parameters

| Parameter | Mandatory | Type | Description |
| --- | --- | --- | --- |
| X-Auth-Token | Yes | String | Account token |
| Content-Type | Yes | String | MIME type |

Table 3 Request body parameters

| Parameter | Mandatory | Type | Description |
| --- | --- | --- | --- |
| type | Yes | String | Task type, which can be import (additional metadata import), import_metadata (quick import), preload (data preload), or export. See the details below. |
| src_target | Yes | String | Name of the interworking directory |
| src_prefix | No | String | Prefix of the source path of an import or export task. Do not include the OBS bucket name during import or the interworking directory name during export. For a preload task, the prefix must be a directory, or an object name that ends with a slash (/). If this field is not specified, all objects in the bound OBS bucket are imported, or all files in the interworking directory are exported. |
| dest_target | Yes | String | Must be the same as src_target. |
| dest_prefix | No | String | Must be the same as src_prefix. |
| attributes | No | ObsTargetAttributes object | Properties of the storage backend. See Table 4. |

Task type details:

- import: Imports the object metadata (name, size, and last modified time) as well as the additional metadata (such as uid, gid, and mode) previously exported from SFS Turbo.
- import_metadata: Imports only the object metadata (name, size, and last modified time). After the import, SFS Turbo generates the additional metadata with default values.
- preload: Imports both the metadata and the data. The metadata includes only the object metadata; additional metadata such as uid, gid, and mode is not imported.
- export: Exports to the OBS bucket the files created in the interworking directory, as well as the data previously imported from OBS and then modified in SFS Turbo.

Table 4 ObsTargetAttributes

| Parameter | Mandatory | Type | Description |
| --- | --- | --- | --- |
| file_mode | No | Integer | Permissions on the imported files. Value range: 0 to 777. See the note below. |
| dir_mode | No | Integer | Permissions on the imported directories. Value range: 0 to 777. See the note below. |
| uid | No | Integer | ID of the user who will own the imported objects. Default value: 0. Value range: 0 to 4,294,967,294 (2^32 - 2). |
| gid | No | Integer | ID of the user group to which the imported objects will belong. Default value: 0. Value range: 0 to 4,294,967,294 (2^32 - 2). |

Note on file_mode and dir_mode: the three digits indicate, from left to right, the permissions of the owner (the user specified by uid), of the owning group (specified by gid), and of other users, where other users are those who are neither the owner nor members of the owning group. Each digit ranges from 0 to 7 and is the sum of 4 (read), 2 (write), and 1 (execute). For example, 750 grants the owner read, write, and execute permissions (7 = 4 + 2 + 1), grants the owning group read and execute permissions (5 = 4 + 1), and grants other users no permissions (0).
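The digit arithmetic described above can be sanity-checked with a short snippet (plain Python, for illustration only, not part of the API) that decodes a three-digit mode value such as 750 into rwx strings:

```python
def describe_mode(mode: int) -> list[str]:
    """Decode a three-digit mode value (e.g. 750) into rwx strings.

    Each decimal digit is a permission triad: 4 = read, 2 = write,
    1 = execute, applied to owner, group, and others in that order.
    """
    names = ("owner", "group", "others")
    digits = f"{mode:03d}"  # e.g. 750 -> "7", "5", "0"
    result = []
    for who, digit in zip(names, digits):
        bits = int(digit)
        perms = ("r" if bits & 4 else "-") \
              + ("w" if bits & 2 else "-") \
              + ("x" if bits & 1 else "-")
        result.append(f"{who}: {perms}")
    return result

print(describe_mode(750))  # ['owner: rwx', 'group: r-x', 'others: ---']
```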

Response Parameters

Status code: 202

Table 5 Response header parameters

| Parameter | Type | Description |
| --- | --- | --- |
| X-request-id | String | Request ID |

Table 6 Response body parameters

| Parameter | Type | Description |
| --- | --- | --- |
| task_id | String | Task ID |

Status code: 400

Table 7 Response header parameters

| Parameter | Type | Description |
| --- | --- | --- |
| X-request-id | String | Request ID |

Table 8 Response body parameters

| Parameter | Type | Description |
| --- | --- | --- |
| errCode | String | Error code |
| errMsg | String | Error description |

Status code: 500

Table 9 Response header parameters

| Parameter | Type | Description |
| --- | --- | --- |
| X-request-id | String | Request ID |

Table 10 Response body parameters

| Parameter | Type | Description |
| --- | --- | --- |
| errCode | String | Error code |
| errMsg | String | Error description |

Example Requests

  • Creating an import task and choosing to quickly import only the metadata (with the interworking directory name set to sfs-link-directory and the prefix of the source path in the OBS bucket set to input/datasets/)

    POST https://{endpoint}/v1/{project_id}/sfs-turbo/{share_id}/hpc-cache/task
    
    {
      "type" : "import_metadata",
      "src_target" : "sfs-link-directory",
      "src_prefix" : "input/datasets/",
      "dest_target" : "sfs-link-directory",
      "dest_prefix" : "input/datasets/"
    }
  • Creating an import task and choosing to import the metadata (with the interworking directory name set to sfs-link-directory, the prefix of the source path in the OBS bucket set to input/datasets/, the permissions of the imported files set to 755, and the permissions of the imported directories set to 755)

    POST https://{endpoint}/v1/{project_id}/sfs-turbo/{share_id}/hpc-cache/task
    
    {
      "type" : "import_metadata",
      "src_target" : "sfs-link-directory",
      "src_prefix" : "input/datasets/",
      "dest_target" : "sfs-link-directory",
      "dest_prefix" : "input/datasets/",
      "attributes" : {
        "file_mode" : 755,
        "dir_mode" : 755
      }
    }
  • Creating an import task and choosing to preload data (with the interworking directory name set to sfs-link-directory, the prefix of the source path in the OBS bucket set to input/datasets/, the permissions of the imported files set to 755, the permissions of the imported directories set to 755, and the UID and GID both set to 0)

    POST https://{endpoint}/v1/{project_id}/sfs-turbo/{share_id}/hpc-cache/task
    
    {
      "type" : "preload",
      "src_target" : "sfs-link-directory",
      "src_prefix" : "input/datasets/",
      "dest_target" : "sfs-link-directory",
      "dest_prefix" : "input/datasets/",
      "attributes" : {
        "file_mode" : 755,
        "dir_mode" : 755,
        "uid" : 0,
        "gid" : 0
      }
    }
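For reference, the requests above can be issued with a short Python sketch using only the standard library. The endpoint, project ID, file system ID, and token values below are placeholders, not real identifiers; substitute values from your own environment:

```python
import json
import urllib.request

ENDPOINT = "https://sfs-turbo.example-region.myhuaweicloud.com"  # placeholder
PROJECT_ID = "your-project-id"    # placeholder
SHARE_ID = "your-file-system-id"  # placeholder
TOKEN = "your-iam-token"          # placeholder

def build_task_body(task_type, src_target, src_prefix=None, attributes=None):
    """Build the request body; dest_* fields must mirror the src_* fields."""
    body = {
        "type": task_type,
        "src_target": src_target,
        "dest_target": src_target,
    }
    if src_prefix is not None:
        body["src_prefix"] = src_prefix
        body["dest_prefix"] = src_prefix
    if attributes is not None:
        body["attributes"] = attributes
    return body

def create_task(task_type, src_target, src_prefix=None, attributes=None):
    """POST the task and return the task ID from the 202 response body."""
    url = f"{ENDPOINT}/v1/{PROJECT_ID}/sfs-turbo/{SHARE_ID}/hpc-cache/task"
    req = urllib.request.Request(
        url,
        data=json.dumps(
            build_task_body(task_type, src_target, src_prefix, attributes)
        ).encode(),
        headers={"X-Auth-Token": TOKEN, "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:  # expect HTTP 202 Accepted
        return json.load(resp)["task_id"]
```

For example, the third request above corresponds to `create_task("preload", "sfs-link-directory", "input/datasets/", {"file_mode": 755, "dir_mode": 755, "uid": 0, "gid": 0})`.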

Example Responses

Status code: 202

Accepted

{
  "task_id" : "7bd2a9b6-xxxx-4605-xxxx-512d636001b0"
}

Status code: 400

Client error

{
  "errCode" : "SFS.TURBO.0001",
  "errMsg" : "parameter error"
}

Status code: 500

Internal error

{
  "errCode" : "SFS.TURBO.0005",
  "errMsg" : "Internal server error"
}

Status Codes

| Status Code | Description |
| --- | --- |
| 202 | Accepted |
| 400 | Client error |
| 500 | Internal error |

Error Codes

See Error Codes.
