Querying the Dataset Import Task List

Updated on 2024-05-30 GMT+08:00

Function

This API is used to query the dataset import task list by page.

Debugging

You can debug this API through automatic authentication in API Explorer or use the SDK sample code generated by API Explorer.

URI

GET /v2/{project_id}/datasets/{dataset_id}/import-tasks

Table 1 Path Parameters

| Parameter | Mandatory | Type | Description |
|---|---|---|---|
| dataset_id | Yes | String | Dataset ID. |
| project_id | Yes | String | Project ID. For details about how to obtain a project ID, see Obtaining a Project ID and Name. |

Table 2 Query Parameters

| Parameter | Mandatory | Type | Description |
|---|---|---|---|
| limit | No | Integer | Maximum number of records returned on each page. The value ranges from 1 to 100. Default: 10. |
| offset | No | Integer | Start page of the paging list. Default: 0. |
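As a sketch, the paginated request URL can be assembled from the path and query parameters above. The endpoint, project ID, and dataset ID below are placeholders, and a real call must also carry authentication (typically an X-Auth-Token header), which is omitted here:

```python
from urllib.parse import urlencode

def build_import_tasks_url(endpoint, project_id, dataset_id, limit=10, offset=0):
    """Build the paginated URL for listing dataset import tasks.

    limit: maximum records per page (1-100, default 10).
    offset: start page of the paging list (default 0).
    """
    base = f"https://{endpoint}/v2/{project_id}/datasets/{dataset_id}/import-tasks"
    query = urlencode({"limit": limit, "offset": offset})
    return f"{base}?{query}"

url = build_import_tasks_url("modelarts.example.com", "my-project",
                             "my-dataset", limit=20, offset=1)
print(url)
# https://modelarts.example.com/v2/my-project/datasets/my-dataset/import-tasks?limit=20&offset=1
```

Increasing `offset` page by page (while `limit * offset` is less than the returned `total_count`) walks the full task list.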

Request Parameters

None

Response Parameters

Status code: 200

Table 3 Response body parameters

| Parameter | Type | Description |
|---|---|---|
| import_tasks | Array of ImportTaskStatusResp objects | List of import tasks. |
| total_count | Integer | Number of import tasks. |

Table 4 ImportTaskStatusResp

| Parameter | Type | Description |
|---|---|---|
| annotated_sample_count | Long | Number of labeled samples. |
| create_time | Long | Time when the task was created. |
| data_source | DataSource object | Data source. |
| dataset_id | String | Dataset ID. |
| elapsed_time | Long | Task running time, in seconds. |
| error_code | String | Error code. |
| error_msg | String | Error message. |
| file_statistics | FileCopyProgress object | File copy progress. |
| finished_file_count | Long | Number of files that have been transferred. |
| finished_file_size | Long | Size of the files that have been transferred, in bytes. |
| import_path | String | OBS path or manifest path to be imported. When importing a manifest file, the path must point to the manifest file itself. When importing a directory, the dataset type can only be image classification, object detection, text classification, or sound classification. |
| import_type | Integer | Import mode. Options: 0 (import by directory); 1 (import by manifest file). |
| imported_sample_count | Long | Number of imported samples. |
| imported_sub_sample_count | Long | Number of imported subsamples. |
| processor_task_id | String | ID of the preprocessing task. |
| processor_task_status | Integer | Status of the preprocessing task. |
| status | String | Status of the import task. Options: QUEUING (queuing), STARTING (execution started), RUNNING (running), COMPLETED (completed), FAILED (failed), NOT_EXIST (not found). |
| task_id | String | Task ID. |
| total_file_count | Long | Total number of files. |
| total_file_size | Long | Total file size, in bytes. |
| total_sample_count | Long | Total number of samples. |
| total_sub_sample_count | Long | Total number of subsamples generated from the parent samples. |
| unconfirmed_sample_count | Long | Number of samples to be confirmed. |
| update_ms | Long | Time when the task was last updated. |
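The status values above suggest a simple polling pattern: keep fetching the task until it reaches a terminal state. A minimal sketch, in which the HTTP call is abstracted behind a caller-supplied function and `wait_for_import` / `TERMINAL_STATUSES` are illustrative names, not part of this API:

```python
import time

# Terminal states from the status field above; an import task stops
# changing once it reaches one of these.
TERMINAL_STATUSES = {"COMPLETED", "FAILED", "NOT_EXIST"}

def wait_for_import(fetch_status, interval=0.0, max_polls=100):
    """Poll a status-returning callable until the task reaches a terminal state.

    fetch_status: zero-argument callable returning one of the documented
                  statuses (QUEUING, STARTING, RUNNING, COMPLETED, FAILED,
                  NOT_EXIST), e.g. by calling this API and reading `status`.
    Returns the final status, or raises TimeoutError after max_polls attempts.
    """
    for _ in range(max_polls):
        status = fetch_status()
        if status in TERMINAL_STATUSES:
            return status
        time.sleep(interval)
    raise TimeoutError("import task did not finish within max_polls attempts")

# Simulated task that runs through the documented lifecycle:
states = iter(["QUEUING", "STARTING", "RUNNING", "RUNNING", "COMPLETED"])
print(wait_for_import(lambda: next(states)))  # COMPLETED
```

In practice `interval` would be a few seconds to avoid hammering the endpoint.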

Table 5 DataSource

| Parameter | Type | Description |
|---|---|---|
| data_path | String | Data source path. |
| data_type | Integer | Data type. Options: 0 (OBS bucket, default); 1 (GaussDB(DWS)); 2 (DLI); 3 (RDS); 4 (MRS); 5 (AI Gallery); 6 (inference service). |
| schema_maps | Array of SchemaMap objects | Schema mapping information corresponding to the table data. |
| source_info | SourceInfo object | Information required for importing a table data source. |
| with_column_header | Boolean | Whether the first row in the file contains column names. This field is valid only for table datasets. Options: true (the first row contains column names); false (it does not). |

Table 6 SchemaMap

| Parameter | Type | Description |
|---|---|---|
| dest_name | String | Name of the destination column. |
| src_name | String | Name of the source column. |

Table 7 SourceInfo

| Parameter | Type | Description |
|---|---|---|
| cluster_id | String | MRS cluster ID. You can log in to the MRS console to view the information. |
| cluster_mode | String | Running mode of an MRS cluster. Options: 0 (normal cluster); 1 (security cluster). |
| cluster_name | String | MRS cluster name. You can log in to the MRS console to view the information. |
| database_name | String | Name of the database to which the table dataset is imported. |
| input | String | HDFS path of the table dataset, for example, /datasets/demo. |
| ip | String | IP address of your GaussDB(DWS) cluster. |
| port | String | Port number of your GaussDB(DWS) cluster. |
| queue_name | String | DLI queue name of a table dataset. |
| subnet_id | String | Subnet ID of an MRS cluster. |
| table_name | String | Name of the table to which a table dataset is imported. |
| user_name | String | Username, which is mandatory for GaussDB(DWS) data. |
| user_password | String | User password, which is mandatory for GaussDB(DWS) data. |
| vpc_id | String | ID of the VPC where an MRS cluster resides. |

Table 8 FileCopyProgress

| Parameter | Type | Description |
|---|---|---|
| file_num_finished | Long | Number of files that have been transferred. |
| file_num_total | Long | Total number of files. |
| file_size_finished | Long | Size of the files that have been transferred, in bytes. |
| file_size_total | Long | Total file size, in bytes. |

Example Requests

Obtaining the Dataset Import Task List

GET https://{endpoint}/v2/{project_id}/datasets/{dataset_id}/import-tasks

Example Responses

Status code: 200

OK

{
  "total_count" : 1,
  "import_tasks" : [ {
    "status" : "COMPLETED",
    "task_id" : "gfghHSokody6AJigS5A_RHJ1zOkIoI3Nzwxj8nh",
    "dataset_id" : "gfghHSokody6AJigS5A",
    "import_path" : "obs://test-obs/daoLu_images/animals/",
    "import_type" : 0,
    "total_sample_count" : 20,
    "imported_sample_count" : 20,
    "annotated_sample_count" : 20,
    "total_sub_sample_count" : 0,
    "imported_sub_sample_count" : 0,
    "total_file_size" : 0,
    "finished_file_count" : 0,
    "finished_file_size" : 0,
    "total_file_count" : 0,
    "create_time" : 1606114833874,
    "elapsed_time" : 2
  } ]
}
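The response body can be consumed like any JSON payload. A sketch that parses a trimmed version of the example above (only a few of the documented fields are kept for brevity):

```python
import json

# Trimmed copy of the example response body shown above.
response_body = """
{
  "total_count": 1,
  "import_tasks": [ {
    "status": "COMPLETED",
    "task_id": "gfghHSokody6AJigS5A_RHJ1zOkIoI3Nzwxj8nh",
    "imported_sample_count": 20,
    "total_sample_count": 20
  } ]
}
"""

data = json.loads(response_body)
for task in data["import_tasks"]:
    # Compare imported vs. total samples to flag a fully imported task.
    done = task["imported_sample_count"] == task["total_sample_count"]
    print(f"{task['task_id']}: {task['status']} (all samples imported: {done})")
```

This prints one line per task with its ID, status, and whether every sample was imported.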

Status Codes

| Status Code | Description |
|---|---|
| 200 | OK |
| 401 | Unauthorized |
| 403 | Forbidden |
| 404 | Not Found |

Error Codes

See Error Codes.
