Updated on 2024-06-07 GMT+08:00

Querying Smart Connect Tasks

Function

This API is used to query Smart Connect tasks.

Calling Method

For details, see Calling APIs.

URI

GET /v2/{project_id}/instances/{instance_id}/connector/tasks
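
With the optional query parameters from Table 2 appended, a full request line looks like this:

GET /v2/{project_id}/instances/{instance_id}/connector/tasks?offset=0&limit=10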

Table 1 Path Parameters

| Parameter | Mandatory | Type | Description |
| --- | --- | --- | --- |
| project_id | Yes | String | Project ID. For details, see Obtaining a Project ID. |
| instance_id | Yes | String | Instance ID. |

Table 2 Query Parameters

| Parameter | Mandatory | Type | Description |
| --- | --- | --- | --- |
| offset | No | Integer | Offset, which is the position where the query starts. The value must be greater than or equal to 0. |
| limit | No | Integer | Maximum number of records returned in the current query. The default value is 10. The value ranges from 1 to 50. |
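
The two query parameters implement offset-based pagination: to list every task on an instance, advance offset by limit until a page returns fewer than limit records. The following is a minimal sketch using the Python SDK shown later in this topic; it assumes the ListConnectorTasksRequest object exposes the offset and limit query parameters as attributes of the same names.

# Sketch: page through all Smart Connect tasks on one instance.
# Assumption: the request object exposes the offset/limit query
# parameters as attributes named offset and limit.
offset, limit, all_tasks = 0, 50, []
while True:
    request = ListConnectorTasksRequest()
    request.instance_id = "{instance_id}"
    request.offset = offset   # position where the query starts (>= 0)
    request.limit = limit     # 1 to 50 records per page; default is 10
    response = client.list_connector_tasks(request)
    page = response.tasks or []
    all_tasks.extend(page)
    if len(page) < limit:     # a short page means the last page
        break
    offset += limit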

Request Parameters

None

Response Parameters

Status code: 200

Table 3 Response body parameters

| Parameter | Type | Description |
| --- | --- | --- |
| tasks | Array of SmartConnectTaskEntity objects | Smart Connect task details. |
| total_number | Integer | Number of Smart Connect tasks. |
| max_tasks | Integer | Maximum number of Smart Connect tasks. |
| quota_tasks | Integer | Smart Connect task quota. |
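
Reading total_number together with quota_tasks tells you whether another task can still be created. A minimal interpretation sketch in Python, assuming response is the parsed result of this API and that quota_tasks is the ceiling on concurrently existing tasks:

# Sketch: derive the remaining Smart Connect task quota.
# Assumption: quota_tasks is the ceiling on existing tasks.
remaining = response.quota_tasks - response.total_number
if remaining > 0:
    print(f"{remaining} more task(s) can be created")
else:
    print("Smart Connect task quota exhausted")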

Table 4 SmartConnectTaskEntity

| Parameter | Type | Description |
| --- | --- | --- |
| task_name | String | Smart Connect task name. |
| topics | String | Topic of a Smart Connect task. |
| topics_regex | String | Regular expression of the topic of a Smart Connect task. |
| source_type | String | Source type of a Smart Connect task. |
| source_task | SmartConnectTaskRespSourceConfig object | Source configuration of a Smart Connect task. |
| sink_type | String | Target type of a Smart Connect task. |
| sink_task | SmartConnectTaskRespSinkConfig object | Target configuration of a Smart Connect task. |
| id | String | ID of a Smart Connect task. |
| status | String | Smart Connect task status. |
| create_time | Long | Time when the Smart Connect task was created. |

Table 5 SmartConnectTaskRespSourceConfig

| Parameter | Type | Description |
| --- | --- | --- |
| redis_address | String | Redis instance address. (Displayed only when the source type is Redis.) |
| redis_type | String | Redis instance type. (Displayed only when the source type is Redis.) |
| dcs_instance_id | String | DCS instance ID. (Displayed only when the source type is Redis.) |
| sync_mode | String | Synchronization type: RDB_ONLY indicates full synchronization; CUSTOM_OFFSET indicates full and incremental synchronization. (Displayed only when the source type is Redis.) |
| full_sync_wait_ms | Integer | Interval of full synchronization retries, in ms. (Displayed only when the source type is Redis.) |
| full_sync_max_retry | Integer | Maximum number of full synchronization retries. (Displayed only when the source type is Redis.) |
| ratelimit | Integer | Rate limit, in KB/s. The value -1 indicates that the rate limit is disabled. (Displayed only when the source type is Redis.) |
| current_cluster_name | String | Current Kafka instance name. (Displayed only when the source type is Kafka.) |
| cluster_name | String | Target Kafka instance name. (Displayed only when the source type is Kafka.) |
| user_name | String | Username of the target Kafka instance. (Displayed only when the source type is Kafka.) |
| sasl_mechanism | String | Authentication mechanism of the target Kafka instance. (Displayed only when the source type is Kafka.) |
| instance_id | String | Target Kafka instance ID. (Displayed only when the source type is Kafka.) |
| bootstrap_servers | String | Target Kafka instance address. (Displayed only when the source type is Kafka.) |
| security_protocol | String | Security protocol of the target Kafka instance. (Displayed only when the source type is Kafka.) |
| direction | String | Synchronization direction. (Displayed only when the source type is Kafka.) |
| sync_consumer_offsets_enabled | Boolean | Indicates whether to synchronize the consumption progress. (Displayed only when the source type is Kafka.) |
| replication_factor | Integer | Number of replicas. (Displayed only when the source type is Kafka.) |
| task_num | Integer | Number of tasks. (Displayed only when the source type is Kafka.) |
| rename_topic_enabled | Boolean | Indicates whether to rename a topic. (Displayed only when the source type is Kafka.) |
| provenance_header_enabled | Boolean | Indicates whether to add the source header. (Displayed only when the source type is Kafka.) |
| consumer_strategy | String | Start offset. latest: obtain the latest data; earliest: obtain the earliest data. (Displayed only when the source type is Kafka.) |
| compression_type | String | Compression algorithm. (Displayed only when the source type is Kafka.) |
| topics_mapping | String | Topic mapping. (Displayed only when the source type is Kafka.) |

Table 6 SmartConnectTaskRespSinkConfig

| Parameter | Type | Description |
| --- | --- | --- |
| redis_address | String | Redis instance address. (Displayed only when the target type is Redis.) |
| redis_type | String | Redis instance type. (Displayed only when the target type is Redis.) |
| dcs_instance_id | String | DCS instance ID. (Displayed only when the target type is Redis.) |
| target_db | Integer | Target database. The default value is -1. (Displayed only when the target type is Redis.) |
| consumer_strategy | String | Start offset. latest: obtain the latest data; earliest: obtain the earliest data. (Displayed only when the target type is OBS.) |
| destination_file_type | String | Dump file format. Only TEXT is supported. (Displayed only when the target type is OBS.) |
| deliver_time_interval | Integer | Dumping period, in seconds. (Displayed only when the target type is OBS.) |
| obs_bucket_name | String | Dumping address (OBS bucket name). (Displayed only when the target type is OBS.) |
| obs_path | String | Dump directory. (Displayed only when the target type is OBS.) |
| partition_format | String | Time directory format. (Displayed only when the target type is OBS.) |
| record_delimiter | String | Record delimiter (line break). (Displayed only when the target type is OBS.) |
| store_keys | Boolean | Indicates whether to store message keys. (Displayed only when the target type is OBS.) |
| obs_part_size | Integer | Size (in bytes) of each file to be uploaded. The default value is 5242880. (Displayed only when the target type is OBS.) |
| flush_size | Integer | Number of records accumulated before a dump file is committed. (Displayed only when the target type is OBS.) |
| timezone | String | Time zone. (Displayed only when the target type is OBS.) |
| schema_generator_class | String | Schema generator class. The default value is io.confluent.connect.storage.hive.schema.DefaultSchemaGenerator. (Displayed only when the target type is OBS.) |
| partitioner_class | String | Partitioner class. The default value is io.confluent.connect.storage.partitioner.TimeBasedPartitioner. (Displayed only when the target type is OBS.) |
| value_converter | String | Value converter. The default value is org.apache.kafka.connect.converters.ByteArrayConverter. (Displayed only when the target type is OBS.) |
| key_converter | String | Key converter. The default value is org.apache.kafka.connect.converters.ByteArrayConverter. (Displayed only when the target type is OBS.) |
| kv_delimiter | String | Key-value delimiter. The default value is :. (Displayed only when the target type is OBS.) |
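
Because the fields in Table 5 and Table 6 are populated only for the matching connector type, client code should branch on source_type and sink_type before reading the nested configuration; for other types the nested object is null, as the example response below shows. A minimal sketch in Python, assuming the SDK exposes attributes named after the parameters above:

# Sketch: read only the configuration that matches each task's declared types.
# Assumption: SDK attribute names mirror the response parameter names.
for task in response.tasks:
    if task.source_type == "KAFKA_REPLICATOR_SOURCE":
        print(task.task_name, "replicates from", task.source_task.bootstrap_servers)
    elif task.source_type == "REDIS_REPLICATOR_SOURCE":
        print(task.task_name, "syncs Redis at", task.source_task.redis_address)
    if task.sink_type == "OBS_SINK":
        print("  dumps to OBS bucket", task.sink_task.obs_bucket_name)
    elif task.sink_type == "REDIS_REPLICATOR_SINK":
        print("  writes to Redis at", task.sink_task.redis_address)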

Example Requests

None

Example Responses

Status code: 200

Successful.

{
  "tasks" : [ {
    "task_name" : "smart-connect-1571576841",
    "topics" : "topic-1643449744",
    "source_task" : {
      "current_cluster_name" : "A",
      "cluster_name" : "B",
      "direction" : "pull",
      "bootstrap_servers" : "192.168.45.58:9092,192.168.44.1:9092,192.168.41.230:9092,192.168.43.112:9092",
      "instance_id" : "59f6d088-****-****-****-********",
      "consumer_strategy" : "earliest",
      "sync_consumer_offsets_enabled" : false,
      "rename_topic_enabled" : false,
      "provenance_header_enabled" : false,
      "security_protocol" : "PLAINTEXT",
      "sasl_mechanism" : "PLAIN",
      "user_name" : "",
      "topics_mapping" : "",
      "compression_type" : "none",
      "task_num" : 2,
      "replication_factor" : 3
    },
    "source_type" : "KAFKA_REPLICATOR_SOURCE",
    "sink_task" : null,
    "sink_type" : "NONE",
    "id" : "194917d0-****-****-****-********",
    "status" : "RUNNING",
    "create_time" : 1708427753133
  }, {
    "task_name" : "smart-connect-1",
    "topics_regex" : "topic-obs*",
    "source_task" : null,
    "source_type" : "NONE",
    "sink_task" : {
      "consumer_strategy" : "earliest",
      "destination_file_type" : "TEXT",
      "obs_bucket_name" : "abcabc",
      "obs_path" : "obsTransfer-1810125534",
      "partition_format" : "yyyy/MM/dd/HH/mm",
      "record_delimiter" : "\\n",
      "deliver_time_interva" : 300,
      "obs_part_size" : 5242880,
      "flush_size" : 1000000,
      "timezone" : "Asia/Chongqing",
      "schema_generator_class" : "io.confluent.connect.storage.hive.schema.DefaultSchemaGenerator",
      "partitioner_class" : "io.confluent.connect.storage.partitioner.TimeBasedPartitioner",
      "value_converter" : "org.apache.kafka.connect.converters.ByteArrayConverter",
      "key_converter" : "org.apache.kafka.connect.converters.ByteArrayConverter",
      "store_keys" : false,
      "kv_delimiter" : ":"
    },
    "sink_type" : "OBS_SINK",
    "id" : "3c0ac4d1-****-****-****-********",
    "status" : "RUNNING",
    "create_time" : 1708565483911
  }, {
    "task_name" : "smart-connect-121248117",
    "topics" : "topic-sc",
    "source_task" : {
      "redis_address" : "192.168.91.179:6379",
      "redis_type" : "standalone",
      "dcs_instance_id" : "949190a2-598a-4afd-99a8-dad3cae1e7cd",
      "sync_mode" : "RDB_ONLY",
      "full_sync_wait_ms" : 13000,
      "full_sync_max_retry" : 4,
      "ratelimit" : -1
    },
    "source_type" : "REDIS_REPLICATOR_SOURCE",
    "sink_task" : {
      "redis_address" : "192.168.119.51:6379",
      "redis_type" : "standalone",
      "dcs_instance_id" : "9b981368-a8e3-416a-87d9-1581a968b41b",
      "target_db" : -1
    },
    "sink_type" : "REDIS_REPLICATOR_SINK",
    "id" : "8a205bbd-****-****-****-********",
    "status" : "RUNNING",
    "create_time" : 1708427753133
  } ],
  "total_number" : 3,
  "max_tasks" : 18,
  "quota_tasks" : 18
}

SDK Sample Code

The SDK sample code is as follows.

Java

package com.huaweicloud.sdk.test;

import com.huaweicloud.sdk.core.auth.ICredential;
import com.huaweicloud.sdk.core.auth.BasicCredentials;
import com.huaweicloud.sdk.core.exception.ConnectionException;
import com.huaweicloud.sdk.core.exception.RequestTimeoutException;
import com.huaweicloud.sdk.core.exception.ServiceResponseException;
import com.huaweicloud.sdk.kafka.v2.region.KafkaRegion;
import com.huaweicloud.sdk.kafka.v2.*;
import com.huaweicloud.sdk.kafka.v2.model.*;


public class ListConnectorTasksSolution {

    public static void main(String[] args) {
        // The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
        // In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
        String ak = System.getenv("CLOUD_SDK_AK");
        String sk = System.getenv("CLOUD_SDK_SK");
        String projectId = "{project_id}";

        ICredential auth = new BasicCredentials()
                .withProjectId(projectId)
                .withAk(ak)
                .withSk(sk);

        KafkaClient client = KafkaClient.newBuilder()
                .withCredential(auth)
                .withRegion(KafkaRegion.valueOf("<YOUR REGION>"))
                .build();
        ListConnectorTasksRequest request = new ListConnectorTasksRequest();
        request.withInstanceId("{instance_id}");
        try {
            ListConnectorTasksResponse response = client.listConnectorTasks(request);
            System.out.println(response.toString());
        } catch (ConnectionException e) {
            e.printStackTrace();
        } catch (RequestTimeoutException e) {
            e.printStackTrace();
        } catch (ServiceResponseException e) {
            e.printStackTrace();
            System.out.println(e.getHttpStatusCode());
            System.out.println(e.getRequestId());
            System.out.println(e.getErrorCode());
            System.out.println(e.getErrorMsg());
        }
    }
}

Python

# coding: utf-8

import os
from huaweicloudsdkcore.auth.credentials import BasicCredentials
from huaweicloudsdkkafka.v2.region.kafka_region import KafkaRegion
from huaweicloudsdkcore.exceptions import exceptions
from huaweicloudsdkkafka.v2 import *

if __name__ == "__main__":
    # The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
    # In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
    ak = os.environ["CLOUD_SDK_AK"]
    sk = os.environ["CLOUD_SDK_SK"]
    projectId = "{project_id}"

    credentials = BasicCredentials(ak, sk, projectId)

    client = KafkaClient.new_builder() \
        .with_credentials(credentials) \
        .with_region(KafkaRegion.value_of("<YOUR REGION>")) \
        .build()

    try:
        request = ListConnectorTasksRequest()
        request.instance_id = "{instance_id}"
        response = client.list_connector_tasks(request)
        print(response)
    except exceptions.ClientRequestException as e:
        print(e.status_code)
        print(e.request_id)
        print(e.error_code)
        print(e.error_msg)

Go

package main

import (
	"fmt"
	"os"

	"github.com/huaweicloud/huaweicloud-sdk-go-v3/core/auth/basic"
	kafka "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2"
	"github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/model"
	region "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/region"
)

func main() {
	// The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
	// In this example, AK and SK are stored in environment variables for authentication. Before running this example, set the environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment.
	ak := os.Getenv("CLOUD_SDK_AK")
	sk := os.Getenv("CLOUD_SDK_SK")
	projectId := "{project_id}"

	auth := basic.NewCredentialsBuilder().
		WithAk(ak).
		WithSk(sk).
		WithProjectId(projectId).
		Build()

	client := kafka.NewKafkaClient(
		kafka.KafkaClientBuilder().
			WithRegion(region.ValueOf("<YOUR REGION>")).
			WithCredential(auth).
			Build())

	request := &model.ListConnectorTasksRequest{}
	request.InstanceId = "{instance_id}"
	response, err := client.ListConnectorTasks(request)
	if err == nil {
		fmt.Printf("%+v\n", response)
	} else {
		fmt.Println(err)
	}
}

For SDK sample code in more programming languages, see the Sample Code tab in API Explorer, which can generate SDK sample code automatically.

Status Codes

| Status Code | Description |
| --- | --- |
| 200 | Successful. |

Error Codes

See Error Codes.