Creating a Data Import or Export Task

Updated on 2025-04-28 GMT+08:00

Function

This API is used to create a data import or export task.

Constraints

This API is supported only for SFS Turbo 1,000 MB/s/TiB, 500 MB/s/TiB, 250 MB/s/TiB, 125 MB/s/TiB, 40 MB/s/TiB, and 20 MB/s/TiB file systems.

URI

POST /v1/{project_id}/sfs-turbo/{share_id}/hpc-cache/task

Table 1 Path Parameters

| Parameter  | Mandatory | Type   | Description         |
|------------|-----------|--------|---------------------|
| project_id | Yes       | String | The project ID.     |
| share_id   | Yes       | String | The file system ID. |

Request Parameters

Table 2 Request header parameters

| Parameter    | Mandatory | Type   | Description        |
|--------------|-----------|--------|--------------------|
| X-Auth-Token | Yes       | String | The account token. |
| Content-Type | Yes       | String | The MIME type.     |

Table 3 Request body parameters

| Parameter   | Mandatory | Type   | Description |
|-------------|-----------|--------|-------------|
| type        | Yes       | String | The task type. Enumeration values: import, export, import_metadata, preload. See the descriptions below. |
| src_target  | Yes       | String | The interworking directory name. |
| src_prefix  | No        | String | The source path prefix for the task. Do not include the OBS bucket name for an import task or the interworking directory name for an export task. For a data preload task, this must be a directory path ending with a slash (/) or the path of a specific object. If this parameter is not specified, an import task imports all objects from the OBS bucket, and an export task exports all files in the interworking directory to the bucket. |
| dest_target | Yes       | String | Keep it the same as src_target. |
| dest_prefix | No        | String | Keep it the same as src_prefix. |
| attributes  | No        | ObsTargetAttributes object | The attributes of the storage backend. This parameter is not supported for file systems that were created on or before June 30, 2024 and have not been upgraded. Submit a service ticket if you need it. |

The task types are as follows:

  • import (additional metadata import): Imports both the object metadata (name, size, and last modification time) and the additional metadata (such as uid, gid, and mode) previously exported from SFS Turbo.

  • import_metadata (quick import): Imports only the object metadata (name, size, and last modification time). After the import, SFS Turbo generates the additional metadata by default.

  • preload (data preload): Imports both the metadata and the data. Only the object metadata is imported; additional metadata such as uid, gid, and mode is not.

  • export: Exports to the OBS bucket the files created in the interworking directory, as well as the data previously imported from OBS and then modified in SFS Turbo.

Table 4 ObsTargetAttributes

| Parameter | Mandatory | Type    | Description |
|-----------|-----------|---------|-------------|
| file_mode | No        | Integer | The permissions on the imported files. Value range: 0 to 777. See the permission notation below. |
| dir_mode  | No        | Integer | The permissions on the imported directories. Value range: 0 to 777. See the permission notation below. |
| uid       | No        | Integer | The ID of the user who owns the imported object. Value range: 0 to 4294967294 (2^32-2). Default value: 0. |
| gid       | No        | Integer | The ID of the user group to which the imported object belongs. Value range: 0 to 4294967294 (2^32-2). Default value: 0. |

In a three-digit mode such as 750, the first digit sets the permissions of the owner (the user specified by uid), the second digit sets those of the owning group (specified by gid), and the third digit sets those of all other users. Each digit ranges from 0 to 7 and is the sum of 4 (read), 2 (write), and 1 (execute). For example, 750 grants the owner read, write, and execute permissions (7), the group read and execute permissions (5), and other users no permissions (0).
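As a quick check of the digit notation, the following sketch (the helper name is ours, not part of the API) expands a mode value such as 750 into the familiar rwx string:

```python
def mode_to_rwx(mode):
    """Expand a three-digit mode such as 750 into an rwx permission string.

    The API takes the mode as a plain integer whose decimal digits are
    read as the owner, group, and other permission digits (see the
    request examples, e.g. "file_mode": 640).
    """
    digits = (mode // 100, (mode // 10) % 10, mode % 10)
    out = []
    for d in digits:                       # owner, group, other
        out.append("r" if d & 4 else "-")  # 4 = read
        out.append("w" if d & 2 else "-")  # 2 = write
        out.append("x" if d & 1 else "-")  # 1 = execute
    return "".join(out)

print(mode_to_rwx(750))  # rwxr-x---
print(mode_to_rwx(640))  # rw-r-----
```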

Response Parameters

Status code: 202

Table 5 Response header parameters

| Parameter    | Type   | Description     |
|--------------|--------|-----------------|
| X-request-id | String | The request ID. |

Table 6 Response body parameters

| Parameter | Type   | Description  |
|-----------|--------|--------------|
| task_id   | String | The task ID. |

Status code: 400

Table 7 Response header parameters

| Parameter    | Type   | Description     |
|--------------|--------|-----------------|
| X-request-id | String | The request ID. |

Table 8 Response body parameters

| Parameter | Type   | Description        |
|-----------|--------|--------------------|
| errCode   | String | The error code.    |
| errMsg    | String | The error message. |

Status code: 500

Table 9 Response header parameters

| Parameter    | Type   | Description     |
|--------------|--------|-----------------|
| X-request-id | String | The request ID. |

Table 10 Response body parameters

| Parameter | Type   | Description        |
|-----------|--------|--------------------|
| errCode   | String | The error code.    |
| errMsg    | String | The error message. |

Example Requests

  • Creating a data import task (with the task type set to import_metadata, the interworking directory name set to sfs-link-directory, and the prefix of the source path in the OBS bucket set to input/datasets/)

    POST https://{endpoint}/v1/{project_id}/sfs-turbo/{share_id}/hpc-cache/task
    
    {
      "type" : "import_metadata",
      "src_target" : "sfs-link-directory",
      "src_prefix" : "input/datasets/",
      "dest_target" : "sfs-link-directory",
      "dest_prefix" : "input/datasets/"
    }
  • Creating a data import task (with the task type set to import_metadata, the interworking directory name set to sfs-link-directory, the prefix of the source path in the OBS bucket set to input/datasets/, the permissions of the imported files set to 640, and the permissions of the imported directories set to 750)

    POST https://{endpoint}/v1/{project_id}/sfs-turbo/{share_id}/hpc-cache/task
    
    {
      "type" : "import_metadata",
      "src_target" : "sfs-link-directory",
      "src_prefix" : "input/datasets/",
      "dest_target" : "sfs-link-directory",
      "dest_prefix" : "input/datasets/",
      "attributes" : {
        "file_mode" : 640,
        "dir_mode" : 750
      }
    }
  • Creating a data import task (with the task type set to preload, the interworking directory name set to sfs-link-directory, the prefix of the source path in the OBS bucket set to input/datasets/, the permissions of the imported files set to 640, the permissions of the imported directories set to 750, and the UID and GID both set to 0)

    POST https://{endpoint}/v1/{project_id}/sfs-turbo/{share_id}/hpc-cache/task
    
    {
      "type" : "preload",
      "src_target" : "sfs-link-directory",
      "src_prefix" : "input/datasets/",
      "dest_target" : "sfs-link-directory",
      "dest_prefix" : "input/datasets/",
      "attributes" : {
        "file_mode" : 640,
        "dir_mode" : 750,
        "uid" : 0,
        "gid" : 0
      }
    }
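The requests above can also be assembled programmatically. The sketch below builds the URL and JSON body for a create-task call; the endpoint, project ID, and file system ID shown are placeholders, and actually sending the request still requires a valid X-Auth-Token header:

```python
import json


def build_task_request(endpoint, project_id, share_id, task):
    """Build the URL and JSON body for the create-task API.

    The caller must still send the result as a POST with the
    X-Auth-Token and Content-Type: application/json headers.
    """
    url = f"https://{endpoint}/v1/{project_id}/sfs-turbo/{share_id}/hpc-cache/task"
    return url, json.dumps(task)


# Hypothetical endpoint and IDs -- substitute your own values.
url, body = build_task_request(
    "sfs-turbo.example-region.example.com",
    "my-project-id",
    "my-share-id",
    {
        "type": "import_metadata",
        "src_target": "sfs-link-directory",
        "src_prefix": "input/datasets/",
        "dest_target": "sfs-link-directory",
        "dest_prefix": "input/datasets/",
    },
)
```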

Example Responses

Status code: 202

Accepted

{
  "task_id" : "7bd2a9b6-xxxx-4605-xxxx-512d636001b0"
}

Status code: 400

Client error

{
  "errCode" : "SFS.TURBO.0001",
  "errMsg" : "parameter error"
}

Status code: 500

Internal error

{
  "errCode" : "SFS.TURBO.0005",
  "errMsg" : "Internal server error"
}
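A caller can branch on the status code to extract either the task ID or the error details from the bodies shown above. A minimal sketch (the function name is ours):

```python
import json


def parse_task_response(status, body):
    """Return the task ID for a 202 response; raise on 400/500 error bodies."""
    payload = json.loads(body)
    if status == 202:
        return payload["task_id"]
    # 400 and 500 responses carry errCode and errMsg.
    raise RuntimeError(f"{payload['errCode']}: {payload['errMsg']}")


task_id = parse_task_response(202, '{"task_id": "7bd2a9b6-xxxx-4605-xxxx-512d636001b0"}')
print(task_id)  # 7bd2a9b6-xxxx-4605-xxxx-512d636001b0
```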

Status Codes

| Status Code | Description    |
|-------------|----------------|
| 202         | Accepted       |
| 400         | Client error   |
| 500         | Internal error |

Error Codes

See Error Codes.
