Querying Dump Task Details
Function
This API is used to query details of a dump task.
Calling Method
For details, see Calling APIs.
URI
GET /v2/{project_id}/streams/{stream_name}/transfer-tasks/{task_name}
| Parameter | Mandatory | Type | Description |
|---|---|---|---|
| project_id | Yes | String | Project ID |
| stream_name | Yes | String | Name of the stream |
| task_name | Yes | String | Name of the dump task to be queried |
Request Parameters
| Parameter | Mandatory | Type | Description |
|---|---|---|---|
| X-Auth-Token | Yes | String | User token. It can be obtained by calling the IAM API used to obtain a user token. The value of X-Subject-Token in the response header is the user token. |
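As a sketch, the request can also be issued with a plain HTTP client instead of the SDK. The endpoint, project ID, stream name, task name, and token below are illustrative placeholders, not values from this document:

```python
import urllib.request

# All values below are placeholders; substitute your own endpoint,
# project ID, stream name, task name, and IAM token.
def build_request(endpoint, project_id, stream_name, task_name, token):
    """Assemble the GET request for the dump-task query URI shown above."""
    url = (f"{endpoint}/v2/{project_id}/streams/"
           f"{stream_name}/transfer-tasks/{task_name}")
    return urllib.request.Request(url, headers={"X-Auth-Token": token})

req = build_request("https://dis.example.com", "my-project-id",
                    "newstream", "newtask", "<token>")
print(req.full_url)
# urllib.request.urlopen(req) would send the call once real values are supplied.
```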
Response Parameters
Status code: 200
| Parameter | Type | Description |
|---|---|---|
| stream_name | String | Name of the stream to which the dump task belongs |
| task_name | String | Name of the dump task |
| state | String | Dump task status. Enumeration values: |
| destination_type | String | Type of the dump task. Enumeration values: |
| create_time | Long | Time when the dump task was created |
| last_transfer_timestamp | Long | Latest dump time of the dump task |
| partitions | Array of PartitionResult objects | List of partition dump details |
| obs_destination_description | Object | Parameter list of the OBS destination to which data in the DIS stream will be dumped |
| dws_destination_descripton | Object | Parameter list of the DWS destination to which data in the DIS stream will be dumped |
| mrs_destination_description | Object | Parameter list of the MRS destination to which data in the DIS stream will be dumped |
| dli_destination_description | Object | Parameter list of the DLI destination to which data in the DIS stream will be dumped |
| cloudtable_destination_descripton | Object | Parameter list of the CloudTable destination to which data in the DIS stream will be dumped |
**PartitionResult**

| Parameter | Type | Description |
|---|---|---|
| status | String | Current status of the partition. Enumeration values: |
| partition_id | String | Unique identifier of the partition |
| hash_range | String | Possible value range of the hash key used by the partition |
| sequence_number_range | String | Sequence number range of the partition |
| parent_partitions | String | Parent partition |
**obs_destination_description**

| Parameter | Type | Description |
|---|---|---|
| task_name | String | Name of the dump task. The task name consists of letters, digits, hyphens (-), and underscores (_). It contains 1 to 64 characters. |
| agency_name | String | Name of the agency created on IAM. DIS uses an agency to access your specified resources. Agency parameter settings: Agency Type is Cloud service; Cloud Service is DIS; Validity Period is Unlimited; Policy is set to Tenant Administrator on the OBS project in the Global service region. If agencies are available, you can use an IAM API to obtain them. This parameter cannot be left unspecified, and the value cannot exceed 64 characters. If there are dump tasks on the console, the system displays a message indicating that an agency will be automatically created; the automatically created agency is named dis_admin_agency. Maximum: 64 |
| deliver_time_interval | Integer | User-defined interval at which data is imported from the current DIS stream into OBS. If no data is pushed to the DIS stream during the current interval, no dump file package will be generated. Unit: second. Minimum: 30. Maximum: 900. Default: 300 |
| consumer_strategy | String | Offset. Default: LATEST. Enumeration values: |
| file_prefix | String | Directory to store files that will be dumped to OBS. Different directory levels are separated by slashes (/); the value cannot start with a slash. The value can contain a maximum of 50 characters, including letters, digits, underscores (_), and slashes (/). This parameter is left blank by default. Maximum: 50 |
| partition_format | String | Directory structure of the object file written into OBS, in the format yyyy/MM/dd/HH/mm (time at which the dump task was created). N/A: if this parameter is left blank, the time directory format is not used. yyyy: year; yyyy/MM: year and month; yyyy/MM/dd: year, month, and day; yyyy/MM/dd/HH: year, month, day, and hour; yyyy/MM/dd/HH/mm: year, month, day, hour, and minute. For example, if the dump task was created at 14:49 on November 10, 2017, the directory structure is 2017 > 11 > 10 > 14 > 49. Default value: empty. Note: after the data is dumped successfully, the storage directory structure is obs_bucket_path/file_prefix/partition_format. Enumeration values: |
| obs_bucket_path | String | Name of the OBS bucket used to store the stream data |
| destination_file_type | String | Dump file format. Note: the parquet or carbon format can be selected only when Source Data Type is set to JSON and Dump Destination is set to OBS. Default: text. Enumeration values: |
| processing_schema | ProcessingSchema object | Dump time directory generated based on the timestamp of the source data and the configured partition_format. Directory structure of the object file written into OBS, in the format yyyy/MM/dd/HH/mm. |
| record_delimiter | String | Delimiter for the dump file, which is used to separate the user data that is written into the dump file. Options: Default: \n |
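The partition_format behavior described above can be illustrated with a short sketch. The token-to-strftime mapping below is an illustration of the documented directory layout, not DIS code:

```python
from datetime import datetime

def dump_directory(obs_bucket_path, file_prefix, partition_format, created_at):
    """Illustrative sketch: assemble the documented storage directory
    obs_bucket_path/file_prefix/partition_format for a task creation time."""
    # Map the documented Java-style date tokens to strftime equivalents.
    token_map = {"yyyy": "%Y", "MM": "%m", "dd": "%d", "HH": "%H", "mm": "%M"}
    fmt = "/".join(token_map[t] for t in partition_format.split("/"))
    parts = [obs_bucket_path, file_prefix, created_at.strftime(fmt)]
    return "/".join(p for p in parts if p)  # skip the empty file_prefix

# The document's own example: a task created at 14:49 on November 10, 2017.
print(dump_directory("obsbucket", "", "yyyy/MM/dd/HH/mm",
                     datetime(2017, 11, 10, 14, 49)))
# -> obsbucket/2017/11/10/14/49
```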
**ProcessingSchema**

| Parameter | Type | Description |
|---|---|---|
| timestamp_name | String | Attribute name of the source data timestamp |
| timestamp_type | String | Type of the source data timestamp |
| timestamp_format | String | OBS directory generated based on the timestamp format. This parameter is mandatory when the timestamp type of the source data is String. Enumeration values: |
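To illustrate the ProcessingSchema fields above: if timestamp_type is String and timestamp_format is yyyy-MM-dd HH:mm:ss, the source timestamp can be parsed and turned into a time directory. Python's strptime is used here only to mirror the Java-style pattern; the value is made up:

```python
from datetime import datetime

# Illustration only: "yyyy-MM-dd HH:mm:ss" expressed as a strptime pattern.
raw = "2017-11-10 14:49:00"                      # String-typed source timestamp
ts = datetime.strptime(raw, "%Y-%m-%d %H:%M:%S")
print(ts.strftime("%Y/%m/%d"))                   # yyyy/MM/dd time directory
```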
**dws_destination_descripton**

| Parameter | Type | Description |
|---|---|---|
| task_name | String | Name of the dump task. The task name consists of letters, digits, hyphens (-), and underscores (_). It contains 1 to 64 characters. |
| agency_name | String | Name of the agency created on IAM. DIS uses an agency to access your specified resources. Agency parameter settings: Agency Type is Cloud service; Cloud Service is DIS; Validity Period is Unlimited; Policy is set to Tenant Administrator on the OBS project in the Global service region. If agencies are available, you can use an IAM API to obtain them. This parameter cannot be left unspecified, and the value cannot exceed 64 characters. If there are dump tasks on the console, the system displays a message indicating that an agency will be automatically created; the automatically created agency is named dis_admin_agency. Maximum: 64 |
| deliver_time_interval | Integer | User-defined interval at which data is imported from the current DIS stream into OBS. If no data is pushed to the DIS stream during the current interval, no dump file package will be generated. Unit: second. Minimum: 30. Maximum: 900. Default: 300 |
| consumer_strategy | String | Offset. Default: LATEST. Enumeration values: |
| dws_cluster_name | String | Name of the DWS cluster that stores the data in the stream |
| dws_cluster_id | String | ID of the DWS cluster to which data will be dumped |
| dws_database_name | String | Name of the DWS database that stores the data in the stream |
| dws_schema | String | Schema of the DWS database to which data will be dumped |
| dws_table_name | String | Name of the DWS table that stores the data in the stream |
| dws_delimiter | String | Delimiter used to separate the columns in the DWS tables into which user data is inserted. The delimiter can be a comma (,), semicolon (;), or vertical bar (\|). |
| user_name | String | Username of the DWS database to which data will be dumped |
| user_password | String | Password of the DWS database to which data will be dumped |
| kms_user_key_name | String | Name of the key created in KMS and used to encrypt the password of the DWS database |
| kms_user_key_id | String | ID of the key created in KMS and used to encrypt the password of the DWS database |
| obs_bucket_path | String | Name of the OBS bucket used to temporarily store data in the DIS stream |
| file_prefix | String | Self-defined directory created in the OBS bucket and used to temporarily store data in the DIS stream. Directory levels are separated by slashes (/); the value cannot start with a slash. The value can contain a maximum of 50 characters, including letters, digits, underscores (_), and slashes (/). This parameter is left blank by default. |
| retry_duration | String | Duration for which you can constantly retry dumping data to DWS after the dump fails. If the duration expires but the dump still fails, the data will be backed up to the OBS bucket name/file_prefix/dws_error directory. Value range: 0–7200. Unit: second. Default value: 1800 |
| dws_table_columns | String | Column to be dumped to the DWS table. If the value is null or empty, all columns are dumped by default. For example, c1,c2 indicates that columns c1 and c2 in the schema are dumped to DWS. This parameter is left blank by default. |
| options | Options object | DWS fault tolerance option (used to specify various parameters of foreign table data) |
**Options**

| Parameter | Type | Description |
|---|---|---|
| fill_missing_fields | String | Whether to set the field to Null, or enable an error message to be displayed in the error table, when the last field in a row of the data source file is missing during database import. Options: Default value: false or off. Enumeration values: |
| ignore_extra_data | String | Whether to ignore the extra columns when the number of fields in the data source file is greater than the number of columns defined in the external table. This parameter is used only during data import. Options: Default value: false or off. Enumeration values: |
| compatible_illegal_chars | String | Whether to tolerate invalid characters during data import, that is, whether to convert invalid characters based on the conversion rule and import them to the database, or to report an error and stop the import. Options: Default value: false or off. Enumeration values: |
| reject_limit | String | Maximum number of data format errors allowed during the data import. If the number of data format errors does not reach the maximum, the data import is successful. Options: Default value: 0, indicating that an error message is returned immediately when a data format error occurs. |
| error_table_name | String | Name of the error table that records data format errors. After the parallel import is complete, you can query this table to obtain detailed error information. |
**mrs_destination_description**

| Parameter | Type | Description |
|---|---|---|
| task_name | String | Name of the dump task. The task name consists of letters, digits, hyphens (-), and underscores (_). It contains 1 to 64 characters. |
| agency_name | String | Name of the agency created on IAM. DIS uses an agency to access your specified resources. Agency parameter settings: Agency Type is Cloud service; Cloud Service is DIS; Validity Period is Unlimited; Policy is set to Tenant Administrator on the OBS project in the Global service region. If agencies are available, you can use an IAM API to obtain them. This parameter cannot be left unspecified, and the value cannot exceed 64 characters. If there are dump tasks on the console, the system displays a message indicating that an agency will be automatically created; the automatically created agency is named dis_admin_agency. Maximum: 64 |
| deliver_time_interval | Integer | User-defined interval at which data is imported from the current DIS stream into OBS. If no data is pushed to the DIS stream during the current interval, no dump file package will be generated. Unit: second. Minimum: 30. Maximum: 900. Default: 300 |
| consumer_strategy | String | Offset. Default: LATEST. Enumeration values: |
| mrs_cluster_name | String | Name of the MRS cluster to which data in the DIS stream will be dumped. Note: only MRS clusters with non-Kerberos authentication are supported. |
| mrs_cluster_id | String | ID of the MRS cluster to which data in the DIS stream will be dumped |
| mrs_hdfs_path | String | Hadoop Distributed File System (HDFS) path of the MRS cluster to which data in the DIS stream will be dumped |
| file_prefix | String | Self-defined directory created in the OBS bucket and used to temporarily store data in the DIS stream. Directory levels are separated by slashes (/); the value cannot start with a slash. The value can contain a maximum of 50 characters, including letters, digits, underscores (_), and slashes (/). This parameter is left blank by default. |
| hdfs_prefix_folder | String | Directory to store files that will be dumped to the chosen MRS cluster. Different directory levels are separated by slashes (/). The directory name contains 0 to 50 characters. This parameter is left blank by default. |
| obs_bucket_path | String | Name of the OBS bucket used to temporarily store data in the DIS stream |
| retry_duration | String | Time duration for DIS to retry if data fails to be dumped. If the retry time exceeds the value of this parameter, the data that fails to be dumped is backed up to the OBS bucket/file_prefix/mrs_error directory. Value range: 0–7200. Unit: second. Default value: 1800. If the value is set to 0, no retry is allowed. |
**dli_destination_description**

| Parameter | Type | Description |
|---|---|---|
| task_name | String | Name of the dump task. The task name consists of letters, digits, hyphens (-), and underscores (_). It contains 1 to 64 characters. |
| agency_name | String | Name of the agency created on IAM. DIS uses an agency to access your specified resources. Agency parameter settings: Agency Type is Cloud service; Cloud Service is DIS; Validity Period is Unlimited; Policy is set to Tenant Administrator on the OBS project in the Global service region. If agencies are available, you can use an IAM API to obtain them. This parameter cannot be left unspecified, and the value cannot exceed 64 characters. If there are dump tasks on the console, the system displays a message indicating that an agency will be automatically created; the automatically created agency is named dis_admin_agency. Maximum: 64 |
| deliver_time_interval | Integer | User-defined interval at which data is imported from the current DIS stream into OBS. If no data is pushed to the DIS stream during the current interval, no dump file package will be generated. Unit: second. Minimum: 30. Maximum: 900. Default: 300 |
| consumer_strategy | String | Offset. Default: LATEST. Enumeration values: |
| dli_database_name | String | Name of the DLI database to which data in the DIS stream will be dumped |
| dli_table_name | String | Name of the DLI table to which data in the DIS stream will be dumped. Note: only tables whose data location is DLI are supported, and you must have the permission to insert data into the tables. |
| obs_bucket_path | String | Name of the OBS bucket used to temporarily store data in the DIS stream |
| file_prefix | String | Self-defined directory created in the OBS bucket and used to temporarily store data in the DIS stream. Directory levels are separated by slashes (/); the value cannot start with a slash. The value can contain a maximum of 50 characters, including letters, digits, underscores (_), and slashes (/). This parameter is left blank by default. |
| retry_duration | String | Time duration for DIS to retry if data fails to be dumped to DLI. If the retry time exceeds the value of this parameter, the data that fails to be dumped is backed up to the OBS bucket/file_prefix/dli_error directory. Value range: 0–7200. Unit: second. Default value: 1800. If the value is set to 0, no retry is allowed. |
**cloudtable_destination_descripton**

| Parameter | Type | Description |
|---|---|---|
| task_name | String | Name of the dump task. The task name consists of letters, digits, hyphens (-), and underscores (_). It contains 1 to 64 characters. |
| agency_name | String | Name of the agency created on IAM. DIS uses an agency to access your specified resources. Agency parameter settings: Agency Type is Cloud service; Cloud Service is DIS; Validity Period is Unlimited; Policy is set to Tenant Administrator on the OBS project in the Global service region. If agencies are available, you can use an IAM API to obtain them. This parameter cannot be left unspecified, and the value cannot exceed 64 characters. If there are dump tasks on the console, the system displays a message indicating that an agency will be automatically created; the automatically created agency is named dis_admin_agency. Maximum: 64 |
| deliver_time_interval | Integer | User-defined interval at which data is imported from the current DIS stream into OBS. If no data is pushed to the DIS stream during the current interval, no dump file package will be generated. Unit: second. Minimum: 30. Maximum: 900. Default: 300 |
| consumer_strategy | String | Offset. Default: LATEST. Enumeration values: |
| cloudtable_cluster_name | String | Name of the CloudTable cluster to which data will be dumped. If you choose to dump data to OpenTSDB, OpenTSDB must be enabled for the cluster. |
| cloudtable_cluster_id | String | ID of the CloudTable cluster to which data will be dumped. If you choose to dump data to OpenTSDB, OpenTSDB must be enabled for the cluster. |
| cloudtable_table_name | String | HBase table name of the CloudTable cluster to which data will be dumped. The parameter is mandatory when data is dumped to the CloudTable HBase. |
| cloudtable_schema | CloudtableSchema object | Schema configuration of the CloudTable HBase data. You can set either this parameter or opentsdb_schema, but this parameter is mandatory when data will be dumped to HBase. After this parameter is set, the JSON data in the stream can be converted to another format and then be imported to the CloudTable HBase. |
| opentsdb_schema | Array of OpenTSDBSchema objects | Schema configuration of the CloudTable OpenTSDB data. You can set either this parameter or cloudtable_schema, but this parameter is mandatory when data will be dumped to OpenTSDB. After this parameter is set, the JSON data in the stream can be converted to another format and then be imported to the CloudTable OpenTSDB. |
| cloudtable_row_key_delimiter | String | Delimiter used to separate the user data that generates HBase row keys. Value range: comma (,), period (.), vertical bar (\|), semicolon (;), hyphen (-), underscore (_), and tilde (~). Default value: period (.) |
| obs_backup_bucket_path | String | Name of the OBS bucket used to back up data that failed to be dumped to CloudTable |
| backup_file_prefix | String | Self-defined directory created in the OBS bucket and used to back up data that failed to be dumped to CloudTable. Directory levels are separated by slashes (/); the value cannot start with a slash. Value range: a string of letters, digits, and underscores (_). Maximum length: 50 characters. This parameter is left blank by default. |
| retry_duration | String | Time duration for DIS to retry if data fails to be dumped to CloudTable. If the duration is exceeded but the dump still fails, the data will be backed up to OBS bucket name/backup_file_prefix/cloudtable_error or OBS bucket name/backup_file_prefix/opentsdb_error. Value range: 0–7200. Unit: second. Default value: 1800 |
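As a rough illustration of how cloudtable_row_key_delimiter combines the configured row-key attributes into an HBase row key: the record and schema below are made up, and the exact server-side assembly is not specified in this document.

```python
# Hypothetical record and row-key schema, for illustration only.
record = {"device_id": "sensor-01", "region": "cn-north-1"}
row_key_schema = [{"value": "device_id"}, {"value": "region"}]
delimiter = "."  # default cloudtable_row_key_delimiter

# Join the selected JSON attribute values with the configured delimiter.
row_key = delimiter.join(str(record[f["value"]]) for f in row_key_schema)
print(row_key)  # sensor-01.cn-north-1
```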
**CloudtableSchema**

| Parameter | Type | Description |
|---|---|---|
| row_key | Array of RowKey objects | HBase rowkey schema used by the CloudTable cluster to convert JSON data into HBase rowkeys. Value range: 1–64 |
| columns | Array of Column objects | HBase column schema used by the CloudTable cluster to convert JSON data into HBase columns. Value range: 1–4096 |
**RowKey**

| Parameter | Type | Description |
|---|---|---|
| value | String | JSON attribute name, which is used to generate HBase rowkeys for JSON data in the DIS stream |
| type | String | JSON attribute type of JSON data in the DIS stream. Enumeration values: |
**Column**

| Parameter | Type | Description |
|---|---|---|
| column_family_name | String | Name of the HBase column family to which data will be dumped |
| column_name | String | Name of the HBase column to which data will be dumped. Value range: 1–32, containing only letters, digits, and underscores (_) |
| value | String | JSON attribute name, which is used to generate HBase column values for JSON data in the DIS stream |
| type | String | JSON attribute type of JSON data in the DIS stream. Enumeration values: |
**OpenTSDBSchema**

| Parameter | Type | Description |
|---|---|---|
| metric | Array of OpenTSDBMetric objects | Schema configuration of the OpenTSDB data metric in the CloudTable cluster. After this parameter is set, the JSON data in the stream can be converted to the metric of the OpenTSDB data. |
| timestamp | OpenTSDBTimestamp object | Schema configuration of the OpenTSDB data timestamp in the CloudTable cluster. After this parameter is set, the JSON data in the stream can be converted to the timestamp of the OpenTSDB data. |
| value | OpenTSDBValue object | Schema configuration of the OpenTSDB data value in the CloudTable cluster. After this parameter is set, the JSON data in the stream can be converted to the value of the OpenTSDB data. |
| tags | Array of OpenTSDBTags objects | Schema configuration of the OpenTSDB data tags in the CloudTable cluster. After this parameter is set, the JSON data in the stream can be converted to the tags of the OpenTSDB data. |
**OpenTSDBMetric**

| Parameter | Type | Description |
|---|---|---|
| type | String | Enumeration values: |
| value | String | Constant or JSON attribute name of the user data in the stream. The value contains 1 to 32 characters. Only letters, digits, and periods (.) are allowed. |
**OpenTSDBTimestamp**

| Parameter | Type | Description |
|---|---|---|
| type | String | |
| value | String | JSON attribute name of the user data in the stream. Value range: 1–32, containing only letters, digits, and underscores (_) |
| format | String | This parameter is mandatory when type is set to String. When the value type of the JSON attribute of the user data in the stream is Date, format is required to convert the data format to generate the timestamp of OpenTSDB. Options: yyyy/MM/dd HH:mm:ss; MM/dd/yyyy HH:mm:ss; dd/MM/yyyy HH:mm:ss; yyyy-MM-dd HH:mm:ss; MM-dd-yyyy HH:mm:ss; dd-MM-yyyy HH:mm:ss. Enumeration values: |
**OpenTSDBValue**

| Parameter | Type | Description |
|---|---|---|
| type | String | Type name of the JSON attribute of the user data in the stream. Options: |
| value | String | Constant or JSON attribute name of the user data in the stream. Value range: 1–32, containing only letters, digits, and underscores (_) |
**OpenTSDBTags**

| Parameter | Type | Description |
|---|---|---|
| name | String | Tag name of the OpenTSDB data that stores the data in the stream. Value range: 1–32, containing only letters, digits, and underscores (_) |
| type | String | Type name of the JSON attribute of the user data in the stream. Options: |
| value | String | Constant or JSON attribute name of the user data in the stream. Value range: 1–32, containing only letters, digits, and underscores (_) |
Example Requests
Querying Dump Task Details
GET https://{Endpoint}/v2/{project_id}/streams/{stream_name}/transfer-tasks/{task_name}
Example Responses
Status code: 200
Normal response
{ "stream_id" : "RdMFID6edQdf8eDzc9e", "stream_name" : "newstream", "task_name" : "newtask", "task_id" : "As805BudhcH1lDs6gbn", "destination_type" : "OBS", "state" : "RUNNING", "create_time" : 1606554932552, "last_transfer_timestamp" : 1606984428612, "obs_destination_description" : { "agency_name" : "dis_admin_agency", "file_prefix\"" : "", "partition_format" : "yyyy/MM/dd", "obs_bucket_path" : "obsbucket", "deliver_time_interval" : 60, "consumer_strategy" : "LATEST", "retry_duration" : 0, "destination_file_type" : "text", "record_delimiter" : "" }, "partitions" : [ { "partitionId" : "shardId-0000000000", "discard" : 0, "state" : "RUNNING", "last_transfer_timestamp" : 1606984428612, "last_transfer_offset" : 289897 } ] }
SDK Sample Code
The SDK sample code is as follows.
Java
```java
package com.huaweicloud.sdk.test;

import com.huaweicloud.sdk.core.auth.ICredential;
import com.huaweicloud.sdk.core.auth.BasicCredentials;
import com.huaweicloud.sdk.core.exception.ConnectionException;
import com.huaweicloud.sdk.core.exception.RequestTimeoutException;
import com.huaweicloud.sdk.core.exception.ServiceResponseException;
import com.huaweicloud.sdk.dis.v2.region.DisRegion;
import com.huaweicloud.sdk.dis.v2.*;
import com.huaweicloud.sdk.dis.v2.model.*;

public class ShowTransferTaskSolution {

    public static void main(String[] args) {
        // Hard-coding the AK and SK or storing them in plaintext poses security risks.
        // Store them in ciphertext in configuration files or environment variables and
        // decrypt them when needed. Here they are read from the environment variables
        // CLOUD_SDK_AK and CLOUD_SDK_SK, which must be set before running this example.
        String ak = System.getenv("CLOUD_SDK_AK");
        String sk = System.getenv("CLOUD_SDK_SK");

        ICredential auth = new BasicCredentials()
                .withAk(ak)
                .withSk(sk);

        DisClient client = DisClient.newBuilder()
                .withCredential(auth)
                .withRegion(DisRegion.valueOf("<YOUR REGION>"))
                .build();
        ShowTransferTaskRequest request = new ShowTransferTaskRequest();
        try {
            ShowTransferTaskResponse response = client.showTransferTask(request);
            System.out.println(response.toString());
        } catch (ConnectionException e) {
            e.printStackTrace();
        } catch (RequestTimeoutException e) {
            e.printStackTrace();
        } catch (ServiceResponseException e) {
            e.printStackTrace();
            System.out.println(e.getHttpStatusCode());
            System.out.println(e.getRequestId());
            System.out.println(e.getErrorCode());
            System.out.println(e.getErrorMsg());
        }
    }
}
```
Python
```python
# coding: utf-8

import os
from huaweicloudsdkcore.auth.credentials import BasicCredentials
from huaweicloudsdkdis.v2.region.dis_region import DisRegion
from huaweicloudsdkcore.exceptions import exceptions
from huaweicloudsdkdis.v2 import *

if __name__ == "__main__":
    # Hard-coding the AK and SK or storing them in plaintext poses security risks.
    # Store them in ciphertext in configuration files or environment variables and
    # decrypt them when needed. Here they are read from the environment variables
    # CLOUD_SDK_AK and CLOUD_SDK_SK, which must be set before running this example.
    ak = os.environ["CLOUD_SDK_AK"]
    sk = os.environ["CLOUD_SDK_SK"]

    credentials = BasicCredentials(ak, sk)

    client = DisClient.new_builder() \
        .with_credentials(credentials) \
        .with_region(DisRegion.value_of("<YOUR REGION>")) \
        .build()

    try:
        request = ShowTransferTaskRequest()
        response = client.show_transfer_task(request)
        print(response)
    except exceptions.ClientRequestException as e:
        print(e.status_code)
        print(e.request_id)
        print(e.error_code)
        print(e.error_msg)
```
Go
```go
package main

import (
	"fmt"
	"os"

	"github.com/huaweicloud/huaweicloud-sdk-go-v3/core/auth/basic"
	dis "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/dis/v2"
	"github.com/huaweicloud/huaweicloud-sdk-go-v3/services/dis/v2/model"
	region "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/dis/v2/region"
)

func main() {
	// Hard-coding the AK and SK or storing them in plaintext poses security risks.
	// Store them in ciphertext in configuration files or environment variables and
	// decrypt them when needed. Here they are read from the environment variables
	// CLOUD_SDK_AK and CLOUD_SDK_SK, which must be set before running this example.
	ak := os.Getenv("CLOUD_SDK_AK")
	sk := os.Getenv("CLOUD_SDK_SK")

	auth := basic.NewCredentialsBuilder().
		WithAk(ak).
		WithSk(sk).
		Build()

	client := dis.NewDisClient(
		dis.DisClientBuilder().
			WithRegion(region.ValueOf("<YOUR REGION>")).
			WithCredential(auth).
			Build())

	request := &model.ShowTransferTaskRequest{}
	response, err := client.ShowTransferTask(request)
	if err == nil {
		fmt.Printf("%+v\n", response)
	} else {
		fmt.Println(err)
	}
}
```
More
For SDK sample code of more programming languages, see the Sample Code tab in API Explorer. SDK sample code can be automatically generated.
Status Codes
| Status Code | Description |
|---|---|
| 200 | Normal response |
Error Codes
See Error Codes.