Updated on 2024-10-21 GMT+08:00

Querying Dump Task Details

Function

This API is used to query details of a dump task.

Calling Method

For details, see Calling APIs.

URI

GET /v2/{project_id}/streams/{stream_name}/transfer-tasks/{task_name}

Table 1 Path Parameters

Parameter

Mandatory

Type

Description

project_id

Yes

String

Project ID

stream_name

Yes

String

Name of the stream

task_name

Yes

String

Name of the dump task to be queried

Request Parameters

Table 2 Request header parameters

Parameter

Mandatory

Type

Description

X-Auth-Token

Yes

String

User token.

It can be obtained by calling the IAM API used to obtain a user token. The value of X-Subject-Token in the response header is the user token.
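As a sketch of how the URI and this header fit together, the following builds the request URL and headers with Python's standard library; the endpoint and project ID are placeholders, and sending the request is left to your HTTP client of choice.

```python
from urllib.parse import quote

def transfer_task_url(endpoint, project_id, stream_name, task_name):
    # GET /v2/{project_id}/streams/{stream_name}/transfer-tasks/{task_name}
    return (f"https://{endpoint}/v2/{project_id}/streams/"
            f"{quote(stream_name, safe='')}/transfer-tasks/{quote(task_name, safe='')}")

# Placeholder endpoint and project ID; substitute real values before calling.
url = transfer_task_url("dis.example.com", "0123456789abcdef", "newstream", "newtask")
headers = {"X-Auth-Token": "<value of X-Subject-Token from the IAM response>"}
```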

Response Parameters

Status code: 200

Table 3 Response body parameters

Parameter

Type

Description

stream_name

String

Name of the stream to which the dump task belongs

task_name

String

Name of the dump task

state

String

Dump task status.

  • ERROR: An error occurs.

  • STARTING: The task is starting.

  • PAUSED: The task has been stopped.

  • RUNNING: The task is running.

  • DELETE: The task has been deleted.

  • ABNORMAL: The task is abnormal.

Enumeration values:

  • ERROR

  • STARTING

  • PAUSED

  • RUNNING

  • DELETE

  • ABNORMAL

destination_type

String

Type of the dump task.

  • OBS: Data is dumped to OBS.

  • MRS: Data is dumped to MRS.

  • DLI: Data is dumped to DLI.

  • CLOUDTABLE: Data is dumped to CloudTable.

  • DWS: Data is dumped to DWS.

Enumeration values:

  • OBS

  • MRS

  • DLI

  • CLOUDTABLE

  • DWS

create_time

Long

Time when the dump task is created

last_transfer_timestamp

Long

Latest dump time of the dump task

partitions

Array of PartitionResult objects

List of partition dump details

obs_destination_description

OBSDestinationDescriptorRequest object

Parameter list of OBS to which data in the DIS stream will be dumped

dws_destination_descripton

DWSDestinationDescriptorRequest object

Parameter list of the DWS to which data in the DIS stream will be dumped

mrs_destination_description

MRSDestinationDescriptorRequest object

Parameter list of the MRS to which data in the DIS stream will be dumped

dli_destination_description

DliDestinationDescriptorRequest object

Parameter list of the DLI to which data in the DIS stream will be dumped

cloudtable_destination_descripton

CloudtableDestinationDescriptorRequest object

Parameter list of the CloudTable to which data in the DIS stream will be dumped

Table 4 PartitionResult

Parameter

Type

Description

status

String

Current status of the partition

  • CREATING

  • ACTIVE

  • DELETED

  • EXPIRED

Enumeration values:

  • CREATING

  • ACTIVE

  • DELETED

  • EXPIRED

partition_id

String

Unique identifier of the partition

hash_range

String

Possible value range of the hash key used by the partition

sequence_number_range

String

Sequence number range of the partition

parent_partitions

String

Parent partition

Table 5 OBSDestinationDescriptorRequest

Parameter

Type

Description

task_name

String

Name of the dump task. The task name consists of letters, digits, hyphens (-), and underscores (_). It contains 1 to 64 characters.

agency_name

String

Name of the agency created on IAM. DIS uses an agency to access your specified resources. Agency parameter settings:

  • Agency Type: Cloud service

  • Cloud Service: DIS

  • Validity Period: Unlimited

  • Policy: Tenant Administrator on the OBS project in the Global service region

If agencies are available, you can use an IAM API to obtain them. This parameter cannot be left unspecified, and its value cannot exceed 64 characters. If there are dump tasks on the console, the system displays a message indicating that an agency will be automatically created; the automatically created agency is named dis_admin_agency.

Maximum: 64

deliver_time_interval

Integer

User-defined interval at which data is imported from the current DIS stream into OBS. If no data is pushed to the DIS stream during the current interval, no dump file package will be generated.

Unit: second

Minimum: 30

Maximum: 900

Default: 300

consumer_strategy

String

Offset.

  • LATEST: maximum offset, indicating that the latest data will be extracted.

  • TRIM_HORIZON: minimum offset, indicating that the earliest data will be extracted.

Default: LATEST

Enumeration values:

  • LATEST

  • TRIM_HORIZON

file_prefix

String

Directory to store files that will be dumped to OBS. Different directory levels are separated by slashes (/) and cannot start with slashes.

The value can contain a maximum of 50 characters, including letters, digits, underscores (_), and slashes (/).

This parameter is left blank by default.

Maximum: 50

partition_format

String

Directory structure of the object file written into OBS, based on the time at which the dump task was created. Format: yyyy/MM/dd/HH/mm.

  • N/A: If this parameter is left blank, the time directory format will not be used.

  • yyyy: year

  • yyyy/MM: year and month

  • yyyy/MM/dd: year, month, and day

  • yyyy/MM/dd/HH: year, month, day, and hour

  • yyyy/MM/dd/HH/mm: year, month, day, hour, and minute

For example, if the dump task was created at 14:49 on November 10, 2017, the directory structure is 2017/11/10/14/49.

Default value: empty

Note: After the data is dumped successfully, the storage directory structure is obs_bucket_path/file_prefix/partition_format.

Enumeration values:

  • yyyy

  • yyyy/MM

  • yyyy/MM/dd

  • yyyy/MM/dd/HH

  • yyyy/MM/dd/HH/mm

obs_bucket_path

String

Name of the OBS bucket used to store the stream data

destination_file_type

String

Dump file format.

  • text: This is the default value.

  • parquet

  • carbon

Note:

The parquet or carbon format can be selected only when Source Data Type is set to JSON and Dump Destination is set to OBS.

Default: text

Enumeration values:

  • text

  • parquet

  • carbon

processing_schema

ProcessingSchema object

Dump time directory generated based on the timestamp of the source data and the configured partition_format. Directory structure of the object file written into OBS. The directory structure is in the format of yyyy/MM/dd/HH/mm.

record_delimiter

String

Delimiter for the dump file, which is used to separate the user data that is written into the dump file.

Options:

  • Comma (,): default value

  • Semicolon (;)

  • Vertical bar (|)

  • Newline (\n)

Default: \n
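To illustrate how obs_bucket_path, file_prefix, and partition_format combine into the final dump directory, here is a minimal sketch; the mapping from the Java-style date pattern to strftime codes is an assumption for illustration, not the service's implementation.

```python
from datetime import datetime

def dump_directory(obs_bucket_path, file_prefix, partition_format, created_at):
    # partition_format like "yyyy/MM/dd/HH/mm" roughly maps to strftime "%Y/%m/%d/%H/%M"
    mapping = {"yyyy": "%Y", "MM": "%m", "dd": "%d", "HH": "%H", "mm": "%M"}
    fmt = "/".join(mapping[p] for p in partition_format.split("/")) if partition_format else ""
    parts = [obs_bucket_path, file_prefix, created_at.strftime(fmt) if fmt else ""]
    return "/".join(p for p in parts if p)  # skip empty segments

# Task created at 14:49 on November 10, 2017 (the example from the table above):
d = dump_directory("obsbucket", "", "yyyy/MM/dd/HH/mm", datetime(2017, 11, 10, 14, 49))
```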

Table 6 ProcessingSchema

Parameter

Type

Description

timestamp_name

String

Attribute name of the source data timestamp

timestamp_type

String

Type of the source data timestamp.

  • String

  • Timestamp: 13-digit (millisecond) timestamp of the long type

timestamp_format

String

OBS directory generated based on the timestamp format. This parameter is mandatory when the timestamp type of the source data is String.

Enumeration values:

  • yyyy/MM/dd HH:mm:ss

  • MM/dd/yyyy HH:mm:ss

  • dd/MM/yyyy HH:mm:ss

  • yyyy-MM-dd HH:mm:ss

  • MM-dd-yyyy HH:mm:ss

  • dd-MM-yyyy HH:mm:ss
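A sketch of how a ProcessingSchema might be applied: the Java-style patterns above are translated to strptime codes (an illustrative mapping, not the service's code) to parse a String-typed source timestamp.

```python
from datetime import datetime

# Rough strptime equivalents of the timestamp_format enumeration values above.
FORMATS = {
    "yyyy/MM/dd HH:mm:ss": "%Y/%m/%d %H:%M:%S",
    "MM/dd/yyyy HH:mm:ss": "%m/%d/%Y %H:%M:%S",
    "dd/MM/yyyy HH:mm:ss": "%d/%m/%Y %H:%M:%S",
    "yyyy-MM-dd HH:mm:ss": "%Y-%m-%d %H:%M:%S",
    "MM-dd-yyyy HH:mm:ss": "%m-%d-%Y %H:%M:%S",
    "dd-MM-yyyy HH:mm:ss": "%d-%m-%Y %H:%M:%S",
}

def parse_source_timestamp(value, timestamp_type, timestamp_format=None):
    if timestamp_type == "Timestamp":  # 13-digit millisecond timestamp
        return datetime.fromtimestamp(int(value) / 1000.0)
    return datetime.strptime(value, FORMATS[timestamp_format])

ts = parse_source_timestamp("2017/11/10 14:49:00", "String", "yyyy/MM/dd HH:mm:ss")
```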

Table 7 DWSDestinationDescriptorRequest

Parameter

Type

Description

task_name

String

Name of the dump task. The task name consists of letters, digits, hyphens (-), and underscores (_). It contains 1 to 64 characters.

agency_name

String

Name of the agency created on IAM. DIS uses an agency to access your specified resources. Agency parameter settings:

  • Agency Type: Cloud service

  • Cloud Service: DIS

  • Validity Period: Unlimited

  • Policy: Tenant Administrator on the OBS project in the Global service region

If agencies are available, you can use an IAM API to obtain them. This parameter cannot be left unspecified, and its value cannot exceed 64 characters. If there are dump tasks on the console, the system displays a message indicating that an agency will be automatically created; the automatically created agency is named dis_admin_agency.

Maximum: 64

deliver_time_interval

Integer

User-defined interval at which data is imported from the current DIS stream into OBS. If no data is pushed to the DIS stream during the current interval, no dump file package will be generated.

Unit: second

Minimum: 30

Maximum: 900

Default: 300

consumer_strategy

String

Offset.

  • LATEST: maximum offset, indicating that the latest data will be extracted.

  • TRIM_HORIZON: minimum offset, indicating that the earliest data will be extracted.

Default: LATEST

Enumeration values:

  • LATEST

  • TRIM_HORIZON

dws_cluster_name

String

Name of the DWS cluster that stores the data in the stream

dws_cluster_id

String

ID of the DWS cluster to which data will be dumped

dws_database_name

String

Name of the DWS database that stores the data in the stream

dws_schema

String

Schema of the DWS database to which data will be dumped

dws_table_name

String

Name of the DWS table that stores the data in the stream

dws_delimiter

String

Delimiter used to separate the columns in the DWS tables into which user data is inserted.

The delimiter can be a comma (,), semicolon (;), or vertical bar (|).

user_name

String

Username of the DWS database to which data will be dumped

user_password

String

Password of the DWS database to which data will be dumped

kms_user_key_name

String

Name of the key created in KMS and used to encrypt the password of the DWS database

kms_user_key_id

String

ID of the key created in KMS and used to encrypt the password of the DWS database

obs_bucket_path

String

Name of the OBS bucket used to temporarily store data in the DIS stream

file_prefix

String

Self-defined directory created in the OBS bucket and used to temporarily store data in the DIS stream. Directory levels are separated by slashes (/) and cannot start with slashes.

The value can contain a maximum of 50 characters, including letters, digits, underscores (_), and slashes (/).

This parameter is left blank by default.

retry_duration

String

Duration for which DIS keeps retrying after a dump to DWS fails. If the duration expires and the dump still fails, the data is backed up to the OBS bucket name/file_prefix/dws_error directory.

Value range: 0–7200

Unit: second

Default value: 1800

dws_table_columns

String

Column to be dumped to the DWS table. If the value is null or empty, all columns are dumped by default. For example, c1,c2 indicates that columns c1 and c2 in the schema are dumped to DWS.

This parameter is left blank by default.

options

Options object

DWS fault tolerance option (used to specify various parameters of foreign table data).

Table 8 Options

Parameter

Type

Description

fill_missing_fields

String

Whether to set the missing field to null, or to record an error in the error table, when the last field in a row of the data source file is missing during database import

Options:

  • true/on

  • false/off

Default value: false or off

Enumeration values:

  • true/on

  • false/off

ignore_extra_data

String

Whether to ignore the extra columns when the number of fields in the data source file is greater than the number of columns defined in the external table. This parameter is used only during data import.

Options:

  • true/on

  • false/off

Default value: false or off

Enumeration values:

  • true/on

  • false/off

compatible_illegal_chars

String

Whether to tolerate invalid characters during data import: either convert invalid characters based on the conversion rule and import them to the database, or report an error and stop the import.

Options:

  • true/on

  • false/off

Default value: false or off

Enumeration values:

  • true/on

  • false/off

reject_limit

String

Maximum number of data format errors allowed during the data import. If the number of data format errors does not reach the maximum, the data import is successful.

Options:

  • Integer value

  • unlimited

Default value: 0, indicating that an error message is returned as soon as a data format error occurs.

error_table_name

String

Name of the error table that records data format errors. After the parallel import is complete, you can query the error information table to obtain the detailed error information.
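As an illustration of Table 8, a hypothetical options fragment of a DWS dump request body might look like the following; the error table name is invented for this example.

```python
# Hypothetical DWS fault-tolerance options; keys and values follow Table 8.
dws_fault_tolerance_options = {
    "fill_missing_fields": "true",        # set missing trailing fields to null instead of erroring
    "ignore_extra_data": "true",          # ignore extra source columns on import
    "compatible_illegal_chars": "false",  # report an error on invalid characters
    "reject_limit": "unlimited",          # or an integer string; default "0"
    "error_table_name": "dis_dump_errors",  # illustrative name, not a service default
}
```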

Table 9 MRSDestinationDescriptorRequest

Parameter

Type

Description

task_name

String

Name of the dump task. The task name consists of letters, digits, hyphens (-), and underscores (_). It contains 1 to 64 characters.

agency_name

String

Name of the agency created on IAM. DIS uses an agency to access your specified resources. Agency parameter settings:

  • Agency Type: Cloud service

  • Cloud Service: DIS

  • Validity Period: Unlimited

  • Policy: Tenant Administrator on the OBS project in the Global service region

If agencies are available, you can use an IAM API to obtain them. This parameter cannot be left unspecified, and its value cannot exceed 64 characters. If there are dump tasks on the console, the system displays a message indicating that an agency will be automatically created; the automatically created agency is named dis_admin_agency.

Maximum: 64

deliver_time_interval

Integer

User-defined interval at which data is imported from the current DIS stream into OBS. If no data is pushed to the DIS stream during the current interval, no dump file package will be generated.

Unit: second

Minimum: 30

Maximum: 900

Default: 300

consumer_strategy

String

Offset.

  • LATEST: maximum offset, indicating that the latest data will be extracted.

  • TRIM_HORIZON: minimum offset, indicating that the earliest data will be extracted.

Default: LATEST

Enumeration values:

  • LATEST

  • TRIM_HORIZON

mrs_cluster_name

String

Name of the MRS cluster to which data in the DIS stream will be dumped.

Note:

Only MRS clusters with non-Kerberos authentication are supported.

mrs_cluster_id

String

ID of the MRS cluster to which data in the DIS stream will be dumped

mrs_hdfs_path

String

Hadoop Distributed File System (HDFS) path of the MRS cluster to which data in the DIS stream will be dumped

file_prefix

String

Self-defined directory created in the OBS bucket and used to temporarily store data in the DIS stream. Directory levels are separated by slashes (/) and cannot start with slashes.

The value can contain a maximum of 50 characters, including letters, digits, underscores (_), and slashes (/).

This parameter is left blank by default.

hdfs_prefix_folder

String

Directory to store files that will be dumped to the chosen MRS cluster. Different directory levels are separated by slashes (/).

The directory name contains 0 to 50 characters.

This parameter is left blank by default.

obs_bucket_path

String

Name of the OBS bucket used to temporarily store data in the DIS stream

retry_duration

String

Time duration for DIS to retry if data fails to be dumped. If the retry time exceeds the value of this parameter, the data that fails to be dumped is backed up to the OBS bucket/file_prefix/mrs_error directory.

Value range: 0–7200

Unit: second

Default value: 1800

If the value is set to 0, no retry is allowed.

Table 10 DliDestinationDescriptorRequest

Parameter

Type

Description

task_name

String

Name of the dump task. The task name consists of letters, digits, hyphens (-), and underscores (_). It contains 1 to 64 characters.

agency_name

String

Name of the agency created on IAM. DIS uses an agency to access your specified resources. Agency parameter settings:

  • Agency Type: Cloud service

  • Cloud Service: DIS

  • Validity Period: Unlimited

  • Policy: Tenant Administrator on the OBS project in the Global service region

If agencies are available, you can use an IAM API to obtain them. This parameter cannot be left unspecified, and its value cannot exceed 64 characters. If there are dump tasks on the console, the system displays a message indicating that an agency will be automatically created; the automatically created agency is named dis_admin_agency.

Maximum: 64

deliver_time_interval

Integer

User-defined interval at which data is imported from the current DIS stream into OBS. If no data is pushed to the DIS stream during the current interval, no dump file package will be generated.

Unit: second

Minimum: 30

Maximum: 900

Default: 300

consumer_strategy

String

Offset.

  • LATEST: maximum offset, indicating that the latest data will be extracted.

  • TRIM_HORIZON: minimum offset, indicating that the earliest data will be extracted.

Default: LATEST

Enumeration values:

  • LATEST

  • TRIM_HORIZON

dli_database_name

String

Name of the DLI database to which data in the DIS stream will be dumped

dli_table_name

String

Name of the DLI table to which data in the DIS stream will be dumped.

Note:

Only tables whose data location is DLI are supported, and you must have the permission to insert data into the tables.

obs_bucket_path

String

Name of the OBS bucket used to temporarily store data in the DIS stream

file_prefix

String

Self-defined directory created in the OBS bucket and used to temporarily store data in the DIS stream. Directory levels are separated by slashes (/) and cannot start with slashes.

The value can contain a maximum of 50 characters, including letters, digits, underscores (_), and slashes (/).

This parameter is left blank by default.

retry_duration

String

Time duration for DIS to retry if data fails to be dumped to DLI. If the retry time exceeds the value of this parameter, the data that fails to be dumped is backed up to the OBS bucket/file_prefix/dli_error directory.

Value range: 0–7200

Unit: second

Default value: 1800

If the value is set to 0, no retry is allowed.

Table 11 CloudtableDestinationDescriptorRequest

Parameter

Type

Description

task_name

String

Name of the dump task. The task name consists of letters, digits, hyphens (-), and underscores (_). It contains 1 to 64 characters.

agency_name

String

Name of the agency created on IAM. DIS uses an agency to access your specified resources. Agency parameter settings:

  • Agency Type: Cloud service

  • Cloud Service: DIS

  • Validity Period: Unlimited

  • Policy: Tenant Administrator on the OBS project in the Global service region

If agencies are available, you can use an IAM API to obtain them. This parameter cannot be left unspecified, and its value cannot exceed 64 characters. If there are dump tasks on the console, the system displays a message indicating that an agency will be automatically created; the automatically created agency is named dis_admin_agency.

Maximum: 64

deliver_time_interval

Integer

User-defined interval at which data is imported from the current DIS stream into OBS. If no data is pushed to the DIS stream during the current interval, no dump file package will be generated.

Unit: second

Minimum: 30

Maximum: 900

Default: 300

consumer_strategy

String

Offset.

  • LATEST: maximum offset, indicating that the latest data will be extracted.

  • TRIM_HORIZON: minimum offset, indicating that the earliest data will be extracted.

Default: LATEST

Enumeration values:

  • LATEST

  • TRIM_HORIZON

cloudtable_cluster_name

String

Name of the CloudTable cluster to which data will be dumped.

If you choose to dump data to OpenTSDB, OpenTSDB must be enabled for the cluster.

cloudtable_cluster_id

String

ID of the CloudTable cluster to which data will be dumped.

If you choose to dump data to OpenTSDB, OpenTSDB must be enabled for the cluster.

cloudtable_table_name

String

HBase table name of the CloudTable cluster to which data will be dumped. The parameter is mandatory when data is dumped to the CloudTable HBase.

cloudtable_schema

CloudtableSchema object

Schema configuration of the CloudTable HBase data. You can set either this parameter or opentsdb_schema, but this parameter is mandatory when data will be dumped to HBase. After this parameter is set, the JSON data in the stream can be converted to another format and then be imported to the CloudTable HBase.

opentsdb_schema

Array of OpenTSDBSchema objects

Schema configuration of the CloudTable OpenTSDB data. You can set either this parameter or cloudtable_schema, but this parameter is mandatory when data will be dumped to OpenTSDB. After this parameter is set, the JSON data in the stream can be converted to another format and then be imported to the CloudTable OpenTSDB.

cloudtable_row_key_delimiter

String

Delimiter used to separate the user data that generates HBase row keys.

Value range: comma (,), period (.), vertical bar (|), semicolon (;), hyphen (-), underscore (_), and tilde (~)

Default value: period (.)

obs_backup_bucket_path

String

Name of the OBS bucket used to back up data that failed to be dumped to CloudTable

backup_file_prefix

String

Self-defined directory created in the OBS bucket and used to back up data that failed to be dumped to CloudTable. Directory levels are separated by slashes (/) and cannot start with slashes.

Value range: a string of letters, digits, and underscores (_)

Maximum length: 50 characters

This parameter is left blank by default.

retry_duration

String

Time duration for DIS to retry if data fails to be dumped to CloudTable. If the duration is exceeded but the dump still fails, the data is backed up to OBS bucket name/backup_file_prefix/cloudtable_error or OBS bucket name/backup_file_prefix/opentsdb_error.

Value range: 0–7200

Unit: second

Default value: 1800

Table 12 CloudtableSchema

Parameter

Type

Description

row_key

Array of RowKey objects

HBase rowkey schema used by the CloudTable cluster to convert JSON data into HBase rowkeys.

Value range: 1–64

columns

Array of Column objects

HBase column schema used by the CloudTable cluster to convert JSON data into HBase columns.

Value range: 1–4096

Table 13 RowKey

Parameter

Type

Description

value

String

JSON attribute name, which is used to generate HBase rowkeys for JSON data in the DIS stream.

type

String

JSON attribute type of JSON data in the DIS stream

Enumeration values:

  • Bigint

  • Double

  • Boolean

  • Timestamp

  • String

  • Decimal

Table 14 Column

Parameter

Type

Description

column_family_name

String

Name of the HBase column family to which data will be dumped

column_name

String

Name of the HBase column to which data will be dumped.

Value range: 1–32. The value can contain only letters, digits, and underscores (_).

value

String

JSON attribute name, which is used to generate HBase column values for JSON data in the DIS stream.

type

String

JSON attribute type of JSON data in the DIS stream

Enumeration values:

  • Bigint

  • Double

  • Boolean

  • Timestamp

  • String

  • Decimal
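To show how Tables 12 to 14 fit together, the sketch below joins the configured row-key attributes of a JSON record with cloudtable_row_key_delimiter. The attribute names (device_id, event_time) are hypothetical, and this is an illustration of how the schema fields combine, not the service's exact conversion logic.

```python
def hbase_row_key(record, row_key_schema, delimiter="."):
    # Joins the configured JSON attribute values in schema order (entries follow Table 13).
    return delimiter.join(str(record[entry["value"]]) for entry in row_key_schema)

# Hypothetical rowkey schema with two attributes:
schema = [{"value": "device_id", "type": "String"},
          {"value": "event_time", "type": "Timestamp"}]
key = hbase_row_key({"device_id": "d001", "event_time": 1606554932552}, schema)
```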

Table 15 OpenTSDBSchema

Parameter

Type

Description

metric

Array of OpenTSDBMetric objects

Schema configuration of the OpenTSDB data metric in the CloudTable cluster. After this parameter is set, the JSON data in the stream can be converted to the metric of the OpenTSDB data.

timestamp

OpenTSDBTimestamp object

Schema configuration of the OpenTSDB data timestamp in the CloudTable cluster. After this parameter is set, the JSON data in the stream can be converted to the timestamp of the OpenTSDB data.

value

OpenTSDBValue object

Schema configuration of the OpenTSDB data value in the CloudTable cluster. After this parameter is set, the JSON data in the stream can be converted to the value of the OpenTSDB data.

tags

Array of OpenTSDBTags objects

Schema configuration of the OpenTSDB data tags in the CloudTable cluster. After this parameter is set, the JSON data in the stream can be converted to the tags of the OpenTSDB data.

Table 16 OpenTSDBMetric

Parameter

Type

Description

type

String

  • Constant: The value of metric is the value of Value.

  • String: The value of metric is the value of the JSON attribute of the user data in the stream.

Enumeration values:

  • Constant

  • String

value

String

Constant or JSON attribute name of the user data in the stream

The value contains 1 to 32 characters. Only letters, digits, and periods (.) are allowed.

Table 17 OpenTSDBTimestamp

Parameter

Type

Description

type

String

  • Timestamp: The value type of the JSON attribute of the user data in the stream is Timestamp. The timestamp of OpenTSDB can be generated without converting the data format.

  • String: The value type of the JSON attribute of the user data in the stream is Date. The timestamp of OpenTSDB can be generated only after the data format is converted.

value

String

JSON attribute name of the user data in the stream

Value range: 1–32. The value can contain only letters, digits, and underscores (_).

format

String

This parameter is mandatory when type is set to String. When the value type of the JSON attribute of the user data in the stream is Date, format is required to convert the data format to generate the timestamp of OpenTSDB.

Enumeration values:

  • yyyy/MM/dd HH:mm:ss

  • MM/dd/yyyy HH:mm:ss

  • dd/MM/yyyy HH:mm:ss

  • yyyy-MM-dd HH:mm:ss

  • MM-dd-yyyy HH:mm:ss

  • dd-MM-yyyy HH:mm:ss

Table 18 OpenTSDBValue

Parameter

Type

Description

type

String

Type name of the JSON attribute of the user data in the stream

Options:

  • Bigint

  • Double

  • Boolean

  • Timestamp

  • String

  • Decimal

value

String

Constant or JSON attribute name of the user data in the stream

Value range: 1–32. The value can contain only letters, digits, and underscores (_).

Table 19 OpenTSDBTags

Parameter

Type

Description

name

String

Tag name of the OpenTSDB data that stores the data in the stream

Value range: 1–32. The value can contain only letters, digits, and underscores (_).

type

String

Type name of the JSON attribute of the user data in the stream

Options:

  • Bigint

  • Double

  • Boolean

  • Timestamp

  • String

  • Decimal

value

String

Constant or JSON attribute name of the user data in the stream

Value range: 1–32. The value can contain only letters, digits, and underscores (_).
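The OpenTSDB schema tables above (Tables 15 to 19) can be illustrated by converting one JSON record into an OpenTSDB-style data point. The field names (ts, load, hostname) and the conversion function itself are illustrative assumptions, not the service's documented behavior.

```python
def to_opentsdb_point(record, schema):
    # Illustrative conversion of one JSON record using an opentsdb_schema entry.
    metric_conf = schema["metric"][0]
    # Constant: use the configured value literally; String: look it up in the record.
    metric = metric_conf["value"] if metric_conf["type"] == "Constant" else record[metric_conf["value"]]
    return {
        "metric": metric,
        "timestamp": record[schema["timestamp"]["value"]],
        "value": record[schema["value"]["value"]],
        "tags": {t["name"]: record[t["value"]] for t in schema["tags"]},
    }

# Hypothetical schema and record:
schema = {
    "metric": [{"type": "Constant", "value": "cpu.load"}],
    "timestamp": {"type": "Timestamp", "value": "ts"},
    "value": {"type": "Bigint", "value": "load"},
    "tags": [{"name": "host", "type": "String", "value": "hostname"}],
}
point = to_opentsdb_point({"ts": 1606554932552, "load": 7, "hostname": "node-1"}, schema)
```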

Example Requests

Querying Dump Task Details

GET https://{Endpoint}/v2/{project_id}/streams/{stream_name}/transfer-tasks/{task_name}

Example Responses

Status code: 200

Normal response

{
  "stream_id" : "RdMFID6edQdf8eDzc9e",
  "stream_name" : "newstream",
  "task_name" : "newtask",
  "task_id" : "As805BudhcH1lDs6gbn",
  "destination_type" : "OBS",
  "state" : "RUNNING",
  "create_time" : 1606554932552,
  "last_transfer_timestamp" : 1606984428612,
  "obs_destination_description" : {
    "agency_name" : "dis_admin_agency",
    "file_prefix" : "",
    "partition_format" : "yyyy/MM/dd",
    "obs_bucket_path" : "obsbucket",
    "deliver_time_interval" : 60,
    "consumer_strategy" : "LATEST",
    "retry_duration" : 0,
    "destination_file_type" : "text",
    "record_delimiter" : ""
  },
  "partitions" : [ {
    "partitionId" : "shardId-0000000000",
    "discard" : 0,
    "state" : "RUNNING",
    "last_transfer_timestamp" : 1606984428612,
    "last_transfer_offset" : 289897
  } ]
}
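A caller might check task health from such a response as sketched below; the truncated body reuses fields from the example above, and treating only RUNNING as healthy is this sketch's assumption.

```python
import json

# Truncated copy of the example response body above.
body = json.loads("""{
  "task_name": "newtask",
  "destination_type": "OBS",
  "state": "RUNNING",
  "partitions": [{"partitionId": "shardId-0000000000", "state": "RUNNING"}]
}""")

# Consider the task healthy only if the task and every partition report RUNNING.
healthy = body["state"] == "RUNNING" and all(p["state"] == "RUNNING" for p in body["partitions"])
```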

SDK Sample Code

The SDK sample code is as follows.

Java

package com.huaweicloud.sdk.test;

import com.huaweicloud.sdk.core.auth.ICredential;
import com.huaweicloud.sdk.core.auth.BasicCredentials;
import com.huaweicloud.sdk.core.exception.ConnectionException;
import com.huaweicloud.sdk.core.exception.RequestTimeoutException;
import com.huaweicloud.sdk.core.exception.ServiceResponseException;
import com.huaweicloud.sdk.dis.v2.region.DisRegion;
import com.huaweicloud.sdk.dis.v2.*;
import com.huaweicloud.sdk.dis.v2.model.*;


public class ShowTransferTaskSolution {

    public static void main(String[] args) {
        // The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
        // In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
        String ak = System.getenv("CLOUD_SDK_AK");
        String sk = System.getenv("CLOUD_SDK_SK");

        ICredential auth = new BasicCredentials()
                .withAk(ak)
                .withSk(sk);

        DisClient client = DisClient.newBuilder()
                .withCredential(auth)
                .withRegion(DisRegion.valueOf("<YOUR REGION>"))
                .build();
        ShowTransferTaskRequest request = new ShowTransferTaskRequest();
        try {
            ShowTransferTaskResponse response = client.showTransferTask(request);
            System.out.println(response.toString());
        } catch (ConnectionException e) {
            e.printStackTrace();
        } catch (RequestTimeoutException e) {
            e.printStackTrace();
        } catch (ServiceResponseException e) {
            e.printStackTrace();
            System.out.println(e.getHttpStatusCode());
            System.out.println(e.getRequestId());
            System.out.println(e.getErrorCode());
            System.out.println(e.getErrorMsg());
        }
    }
}

Python

# coding: utf-8

import os
from huaweicloudsdkcore.auth.credentials import BasicCredentials
from huaweicloudsdkdis.v2.region.dis_region import DisRegion
from huaweicloudsdkcore.exceptions import exceptions
from huaweicloudsdkdis.v2 import *

if __name__ == "__main__":
    # The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
    # In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
    ak = os.environ["CLOUD_SDK_AK"]
    sk = os.environ["CLOUD_SDK_SK"]

    credentials = BasicCredentials(ak, sk)

    client = DisClient.new_builder() \
        .with_credentials(credentials) \
        .with_region(DisRegion.value_of("<YOUR REGION>")) \
        .build()

    try:
        request = ShowTransferTaskRequest()
        response = client.show_transfer_task(request)
        print(response)
    except exceptions.ClientRequestException as e:
        print(e.status_code)
        print(e.request_id)
        print(e.error_code)
        print(e.error_msg)

Go

package main

import (
	"fmt"
	"os"
	"github.com/huaweicloud/huaweicloud-sdk-go-v3/core/auth/basic"
	dis "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/dis/v2"
	"github.com/huaweicloud/huaweicloud-sdk-go-v3/services/dis/v2/model"
	region "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/dis/v2/region"
)

func main() {
    // The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
    // In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
    ak := os.Getenv("CLOUD_SDK_AK")
    sk := os.Getenv("CLOUD_SDK_SK")

    auth := basic.NewCredentialsBuilder().
        WithAk(ak).
        WithSk(sk).
        Build()

    client := dis.NewDisClient(
        dis.DisClientBuilder().
            WithRegion(region.ValueOf("<YOUR REGION>")).
            WithCredential(auth).
            Build())

    request := &model.ShowTransferTaskRequest{}
    response, err := client.ShowTransferTask(request)
    if err == nil {
        fmt.Printf("%+v\n", response)
    } else {
        fmt.Println(err)
    }
}

More

For SDK sample code of more programming languages, see the Sample Code tab in API Explorer. SDK sample code can be automatically generated.

Status Codes

Status Code

Description

200

Normal response

Error Codes

See Error Codes.