Updated on 2024-03-18 GMT+08:00

Querying a Dumping Task

Function

This API is used to query a dumping task.

This API is outdated and may not be maintained in the future. Use the API described in Querying Smart Connect Tasks instead.

Calling Method

For details, see How to Call an API.

URI

GET /v2/{project_id}/connectors/{connector_id}/sink-tasks/{task_id}

Table 1 URI parameters

| Parameter | Mandatory | Type | Description |
| --- | --- | --- | --- |
| project_id | Yes | String | Project ID. For details, see Obtaining a Project ID. |
| connector_id | Yes | String | Instance dumping ID. For details, see Querying Instance IDs. |
| task_id | Yes | String | Dumping task ID. |

Table 2 Query parameters

| Parameter | Mandatory | Type | Description |
| --- | --- | --- | --- |
| topic-info | No | String | Whether to include topic information in the response. Default value: false. |

Request Parameters

None

Response Parameters

Status code: 200

Table 3 Response body parameters

| Parameter | Type | Description |
| --- | --- | --- |
| task_name | String | Name of the dumping task. |
| destination_type | String | Type of the dumping task. |
| create_time | Long | Time when the dumping task was created. |
| status | String | Dumping task status. |
| topics | String | Topic list or topic regular expression of the dumping task. |
| obs_destination_descriptor | obs_destination_descriptor object | Description of the dumping destination. |
| topics_info | Array of topics_info objects | Topic information. |

Table 4 obs_destination_descriptor

| Parameter | Type | Description |
| --- | --- | --- |
| consumer_strategy | String | Message consumption policy. latest: messages are consumed from the end of the topic. earliest: messages are consumed from the start of the topic. Default value: latest. |
| destination_file_type | String | Dump file format. Only text is supported. |
| obs_bucket_name | String | Name of the OBS bucket used to store the data. |
| obs_path | String | OBS path. |
| partition_format | String | Directory structure of the object file written into OBS, based on the time the dumping task was created: yyyy (year), yyyy/MM (year and month), yyyy/MM/dd (year, month, and day), yyyy/MM/dd/HH (year, month, day, and hour), or yyyy/MM/dd/HH/mm (year, month, day, hour, and minute). For example, 2017/11/10/14/49 produces the directory path 2017 > 11 > 10 > 14 > 49, with 2017 as the outermost folder. Note: after the data is dumped successfully, the storage directory structure is obs_bucket_path/file_prefix/partition_format. The default time zone is GMT+08:00. |
| record_delimiter | String | Delimiter used to separate the user data written into the dump file. Value range: comma (,), semicolon (;), vertical bar (\|), newline (\n), or NULL. Default value: newline (\n). |
| deliver_time_interval | Integer | Interval (in seconds) at which data is dumped. If there is no data within a time segment, no package file is generated for it. Value range: 30-900. Default value: 300. Note: this parameter is mandatory when streaming data is dumped to OBS. |
| obs_part_size | Long | Size (in bytes) of each file to be uploaded. Default value: 5242880. |
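To illustrate the note on partition_format, the sketch below (illustrative only, not part of the API) shows how a dump directory path could be derived from obs_path and the yyyy/MM/dd/HH/mm pattern in the default GMT+08:00 time zone, using Python's strftime equivalents.

# Illustrative only: derive a dump directory path from obs_path and
# partition_format, assuming the default GMT+08:00 time zone.
from datetime import datetime, timedelta, timezone

obs_path = "obsTransfer-56997523"
strftime_format = "%Y/%m/%d/%H/%M"   # strftime equivalent of yyyy/MM/dd/HH/mm

now = datetime.now(timezone(timedelta(hours=8)))
print(obs_path + "/" + now.strftime(strftime_format))
# For example: obsTransfer-56997523/2017/11/10/14/49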

Table 5 topics_info

| Parameter | Type | Description |
| --- | --- | --- |
| topic | String | Topic name. |
| partitions | Array of partitions objects | Partition list. |

Table 6 partitions

| Parameter | Type | Description |
| --- | --- | --- |
| partition_id | String | Partition ID. |
| status | String | Running status. |
| last_transfer_offset | String | Offset of the last dumped message. |
| log_end_offset | String | Offset of the latest message in the partition. |
| lag | String | Number of accumulated messages that have not been dumped. |

Example Request

Querying details about a specified dumping task

GET https://{endpoint}/v2/{project_id}/connectors/{connector_id}/sink-tasks/{task_id}?topic-info=true
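The request can also be issued directly over HTTPS. The following sketch uses the Python requests library with token-based authentication (X-Auth-Token); the endpoint, IDs, and token are placeholders you must replace with real values.

# A minimal sketch of calling this API directly with the requests library.
# The endpoint, project ID, connector ID, task ID, and token below are
# placeholders, not real values.
import requests

endpoint = "https://{endpoint}"
url = (endpoint + "/v2/{project_id}/connectors/{connector_id}"
       "/sink-tasks/{task_id}")
headers = {"X-Auth-Token": "<token>"}   # token obtained from IAM
params = {"topic-info": "true"}         # include per-topic dumping progress

response = requests.get(url, headers=headers, params=params)
print(response.status_code)
print(response.json())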

Example Response

Status code: 200

The dumping task is queried successfully.

{
  "task_name" : "obsTransfer-56997523",
  "destination_type" : "OBS",
  "create_time" : 1628126621283,
  "status" : "RUNNING",
  "topics" : "topic-sdk-no-delete",
  "obs_destination_descriptor" : {
    "consumer_strategy" : "earliest",
    "destination_file_type" : "TEXT",
    "obs_bucket_name" : "testobs",
    "obs_path" : "obsTransfer-56997523",
    "partition_format" : "yyyy/MM/dd/HH/mm",
    "record_delimiter" : "",
    "deliver_time_interval" : 300,
    "obs_part_size" : 5242880
  },
  "topics_info" : [ {
    "topic" : "topic-sdk-no-delete",
    "partitions" : [ {
      "partition_id" : "2",
      "status" : "RUNNING",
      "last_transfer_offset" : "3",
      "log_end_offset" : "3",
      "lag" : "0"
    }, {
      "partition_id" : "1",
      "status" : "RUNNING",
      "last_transfer_offset" : "3",
      "log_end_offset" : "3",
      "lag" : "0"
    }, {
      "partition_id" : "0",
      "status" : "RUNNING",
      "last_transfer_offset" : "3",
      "log_end_offset" : "3",
      "lag" : "0"
    } ]
  } ]
}
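When topic-info=true, the per-partition offsets in topics_info can be used to gauge dumping progress. A minimal sketch, assuming the response body above has been parsed into a Python dict named body:

# Minimal sketch: sum the per-partition lag (messages not yet dumped)
# across all topics in a parsed response body.
def total_lag(body):
    return sum(int(p["lag"])
               for t in body.get("topics_info", [])
               for p in t["partitions"])

# For each partition, lag equals log_end_offset - last_transfer_offset;
# total_lag(body) returns 0 for the example response above.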

SDK Sample Code

The SDK sample code is as follows.

Java

package com.huaweicloud.sdk.test;

import com.huaweicloud.sdk.core.auth.ICredential;
import com.huaweicloud.sdk.core.auth.BasicCredentials;
import com.huaweicloud.sdk.core.exception.ConnectionException;
import com.huaweicloud.sdk.core.exception.RequestTimeoutException;
import com.huaweicloud.sdk.core.exception.ServiceResponseException;
import com.huaweicloud.sdk.kafka.v2.region.KafkaRegion;
import com.huaweicloud.sdk.kafka.v2.*;
import com.huaweicloud.sdk.kafka.v2.model.*;


public class ShowSinkTaskDetailSolution {

    public static void main(String[] args) {
        // Hard-coding the AK and SK or storing them in plaintext poses significant security risks.
        // Store them in ciphertext in configuration files or environment variables and decrypt them when needed.
        // In this example, the AK and SK are read from the environment variables CLOUD_SDK_AK and
        // CLOUD_SDK_SK, which must be set before running the sample.
        String ak = System.getenv("CLOUD_SDK_AK");
        String sk = System.getenv("CLOUD_SDK_SK");

        ICredential auth = new BasicCredentials()
                .withAk(ak)
                .withSk(sk);

        KafkaClient client = KafkaClient.newBuilder()
                .withCredential(auth)
                .withRegion(KafkaRegion.valueOf("<YOUR REGION>"))
                .build();
        ShowSinkTaskDetailRequest request = new ShowSinkTaskDetailRequest();
        // Path parameters: replace with the actual connector ID and task ID.
        request.withConnectorId("{connector_id}");
        request.withTaskId("{task_id}");
        request.withTopicInfo(ShowSinkTaskDetailRequest.TopicInfoEnum.fromValue("<topic-info>"));
        try {
            ShowSinkTaskDetailResponse response = client.showSinkTaskDetail(request);
            System.out.println(response.toString());
        } catch (ConnectionException e) {
            e.printStackTrace();
        } catch (RequestTimeoutException e) {
            e.printStackTrace();
        } catch (ServiceResponseException e) {
            e.printStackTrace();
            System.out.println(e.getHttpStatusCode());
            System.out.println(e.getRequestId());
            System.out.println(e.getErrorCode());
            System.out.println(e.getErrorMsg());
        }
    }
}

Python

# coding: utf-8

import os

from huaweicloudsdkcore.auth.credentials import BasicCredentials
from huaweicloudsdkkafka.v2.region.kafka_region import KafkaRegion
from huaweicloudsdkcore.exceptions import exceptions
from huaweicloudsdkkafka.v2 import *

if __name__ == "__main__":
    # Hard-coding the AK and SK or storing them in plaintext poses significant security risks.
    # Store them in ciphertext in configuration files or environment variables and decrypt them when needed.
    # In this example, the AK and SK are read from the environment variables CLOUD_SDK_AK and
    # CLOUD_SDK_SK, which must be set before running the sample.
    ak = os.getenv("CLOUD_SDK_AK")
    sk = os.getenv("CLOUD_SDK_SK")

    credentials = BasicCredentials(ak, sk)

    client = KafkaClient.new_builder() \
        .with_credentials(credentials) \
        .with_region(KafkaRegion.value_of("<YOUR REGION>")) \
        .build()

    try:
        request = ShowSinkTaskDetailRequest()
        # Path parameters: replace with the actual connector ID and task ID.
        request.connector_id = "{connector_id}"
        request.task_id = "{task_id}"
        request.topic_info = "<topic-info>"
        response = client.show_sink_task_detail(request)
        print(response)
    except exceptions.ClientRequestException as e:
        print(e.status_code)
        print(e.request_id)
        print(e.error_code)
        print(e.error_msg)

Go

package main

import (
    "fmt"
    "os"

    "github.com/huaweicloud/huaweicloud-sdk-go-v3/core/auth/basic"
    kafka "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2"
    "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/model"
    region "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/region"
)

func main() {
    // Hard-coding the AK and SK or storing them in plaintext poses significant security risks.
    // Store them in ciphertext in configuration files or environment variables and decrypt them when needed.
    // In this example, the AK and SK are read from the environment variables CLOUD_SDK_AK and
    // CLOUD_SDK_SK, which must be set before running the sample.
    ak := os.Getenv("CLOUD_SDK_AK")
    sk := os.Getenv("CLOUD_SDK_SK")

    auth := basic.NewCredentialsBuilder().
        WithAk(ak).
        WithSk(sk).
        Build()

    client := kafka.NewKafkaClient(
        kafka.KafkaClientBuilder().
            WithRegion(region.ValueOf("<YOUR REGION>")).
            WithCredential(auth).
            Build())

    request := &model.ShowSinkTaskDetailRequest{}
    // Path parameters: replace with the actual connector ID and task ID.
    request.ConnectorId = "{connector_id}"
    request.TaskId = "{task_id}"
    topicInfoRequest := model.GetShowSinkTaskDetailRequestTopicInfoEnum().<TOPIC_INFO>
    request.TopicInfo = &topicInfoRequest
    response, err := client.ShowSinkTaskDetail(request)
    if err == nil {
        fmt.Printf("%+v\n", response)
    } else {
        fmt.Println(err)
    }
}

Status Code

| Status Code | Description |
| --- | --- |
| 200 | The dumping task is queried successfully. |

Error Codes

See Error Codes.