Updated: 2024-12-16 GMT+08:00

Querying the Smart Connect Task List

Function

This API is used to query the Smart Connect task list.

Calling Method

For details, see Calling APIs.

URI

GET /v2/{project_id}/instances/{instance_id}/connector/tasks

Table 1 Path parameters

| Parameter | Mandatory | Type | Description |
| --- | --- | --- | --- |
| project_id | Yes | String | Project ID. For details about how to obtain it, see Obtaining a Project ID. |
| instance_id | Yes | String | Instance ID. |

Table 2 Query parameters

| Parameter | Mandatory | Type | Description |
| --- | --- | --- | --- |
| offset | No | Integer | Offset from which the query starts. The value must be greater than or equal to 0. |
| limit | No | Integer | Maximum number of records returned in a single query. Default: 10. Value range: 1-50. |
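The `offset` and `limit` query parameters page through the task list. The loop below is a minimal Python sketch of client-side pagination; `fetch_page` is a hypothetical stand-in for the actual API call (for example, an SDK request with `offset`/`limit` set), and only the list semantics described above are assumed:

```python
def list_all_tasks(fetch_page, limit=10):
    """Collect every task by advancing offset until a page comes back short.

    fetch_page(offset, limit) is a hypothetical stand-in for the real API
    call; it must return the decoded response body as a dict containing
    "tasks" and "total_number".
    """
    tasks = []
    offset = 0
    while True:
        page = fetch_page(offset=offset, limit=limit)
        batch = page.get("tasks", [])
        tasks.extend(batch)
        offset += len(batch)
        # Stop on a short page or once we have reached the reported total.
        if len(batch) < limit or offset >= page.get("total_number", offset):
            break
    return tasks
```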

Request Parameters

None

Response Parameters

Status code: 200

Table 3 Response body parameters

| Parameter | Type | Description |
| --- | --- | --- |
| tasks | Array of SmartConnectTaskEntity objects | Smart Connect task details. |
| total_number | Integer | Number of Smart Connect tasks. |
| max_tasks | Integer | Maximum number of Smart Connect tasks allowed. |
| quota_tasks | Integer | Smart Connect task quota. |

Table 4 SmartConnectTaskEntity

| Parameter | Type | Description |
| --- | --- | --- |
| task_name | String | Name of the Smart Connect task. |
| topics | String | Topics configured for the Smart Connect task. |
| topics_regex | String | Regular expression of the topics configured for the Smart Connect task. |
| source_type | String | Source type of the Smart Connect task. |
| source_task | SmartConnectTaskRespSourceConfig object | Source configuration of the Smart Connect task. |
| sink_type | String | Target (sink) type of the Smart Connect task. |
| sink_task | SmartConnectTaskRespSinkConfig object | Target (sink) configuration of the Smart Connect task. |
| id | String | ID of the Smart Connect task. |
| status | String | Status of the Smart Connect task. |
| create_time | Long | Time when the Smart Connect task was created. |

Table 5 SmartConnectTaskRespSourceConfig

| Parameter | Type | Description |
| --- | --- | --- |
| redis_address | String | Redis instance address. (Displayed only when the source type is Redis.) |
| redis_type | String | Redis instance type. (Displayed only when the source type is Redis.) |
| dcs_instance_id | String | DCS instance ID. (Displayed only when the source type is Redis.) |
| sync_mode | String | Synchronization mode: RDB_ONLY for full synchronization; CUSTOM_OFFSET for full plus incremental synchronization. (Displayed only when the source type is Redis.) |
| full_sync_wait_ms | Integer | Retry interval of full synchronization, in milliseconds. (Displayed only when the source type is Redis.) |
| full_sync_max_retry | Integer | Maximum number of full synchronization retries. (Displayed only when the source type is Redis.) |
| ratelimit | Integer | Rate limit, in KB/s. -1 indicates no limit. (Displayed only when the source type is Redis.) |
| current_cluster_name | String | Alias of the current Kafka instance. (Displayed only when the source type is Kafka.) |
| cluster_name | String | Alias of the peer Kafka instance. (Displayed only when the source type is Kafka.) |
| user_name | String | Username of the peer Kafka instance. (Displayed only when the source type is Kafka.) |
| sasl_mechanism | String | Authentication mechanism of the peer Kafka instance. (Displayed only when the source type is Kafka.) |
| instance_id | String | ID of the peer Kafka instance. (Displayed only when the source type is Kafka.) |
| bootstrap_servers | String | Address of the peer Kafka instance. (Displayed only when the source type is Kafka.) |
| security_protocol | String | Authentication method of the peer Kafka instance. (Displayed only when the source type is Kafka.) |
| direction | String | Synchronization direction. (Displayed only when the source type is Kafka.) |
| sync_consumer_offsets_enabled | Boolean | Whether consumer offsets are synchronized. (Displayed only when the source type is Kafka.) |
| replication_factor | Integer | Number of replicas. (Displayed only when the source type is Kafka.) |
| task_num | Integer | Number of tasks. (Displayed only when the source type is Kafka.) |
| rename_topic_enabled | Boolean | Whether topics are renamed. (Displayed only when the source type is Kafka.) |
| provenance_header_enabled | Boolean | Whether a provenance header is added. (Displayed only when the source type is Kafka.) |
| consumer_strategy | String | Start offset: latest to obtain the latest data; earliest to obtain the earliest data. (Displayed only when the source type is Kafka.) |
| compression_type | String | Compression algorithm. (Displayed only when the source type is Kafka.) |
| topics_mapping | String | Topic mapping. (Displayed only when the source type is Kafka.) |

Table 6 SmartConnectTaskRespSinkConfig

| Parameter | Type | Description |
| --- | --- | --- |
| redis_address | String | Redis instance address. (Displayed only when the target type is Redis.) |
| redis_type | String | Redis instance type. (Displayed only when the target type is Redis.) |
| dcs_instance_id | String | DCS instance ID. (Displayed only when the target type is Redis.) |
| target_db | Integer | Target database. Default: -1. (Displayed only when the target type is Redis.) |
| consumer_strategy | String | Dump start offset: latest to obtain the latest data; earliest to obtain the earliest data. (Displayed only when the target type is OBS.) |
| destination_file_type | String | Dump file format. Only TEXT is supported. (Displayed only when the target type is OBS.) |
| deliver_time_interval | Integer | Data dumping period, in seconds. (Displayed only when the target type is OBS.) |
| obs_bucket_name | String | Dump address (OBS bucket name). (Displayed only when the target type is OBS.) |
| obs_path | String | Dump directory. (Displayed only when the target type is OBS.) |
| partition_format | String | Time directory format. (Displayed only when the target type is OBS.) |
| record_delimiter | String | Record (line) delimiter. (Displayed only when the target type is OBS.) |
| store_keys | Boolean | Whether keys are stored. (Displayed only when the target type is OBS.) |
| obs_part_size | Integer | File size, in bytes, at which each transfer file starts to be uploaded. Default: 5242880. (Displayed only when the target type is OBS.) |
| flush_size | Integer | flush_size. (Displayed only when the target type is OBS.) |
| timezone | String | Time zone. (Displayed only when the target type is OBS.) |
| schema_generator_class | String | Schema generator class. Default: "io.confluent.connect.storage.hive.schema.DefaultSchemaGenerator". (Displayed only when the target type is OBS.) |
| partitioner_class | String | Partitioner class. Default: "io.confluent.connect.storage.partitioner.TimeBasedPartitioner". (Displayed only when the target type is OBS.) |
| value_converter | String | Value converter. Default: "org.apache.kafka.connect.converters.ByteArrayConverter". (Displayed only when the target type is OBS.) |
| key_converter | String | Key converter. Default: "org.apache.kafka.connect.converters.ByteArrayConverter". (Displayed only when the target type is OBS.) |
| kv_delimiter | String | Key-value delimiter. Default: ":". (Displayed only when the target type is OBS.) |
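For OBS dumps, `partition_format` determines the time-based directory under `obs_path` into which files are written. The default `yyyy/MM/dd/HH/mm` uses Java-style date tokens; the Python sketch below illustrates how such a pattern expands into a concrete directory. The token mapping is an illustrative assumption for this example only, not the service's implementation:

```python
from datetime import datetime, timezone

# Java-style date tokens (as used by partition_format) mapped to strftime.
# This mapping covers only the tokens in the documented default
# "yyyy/MM/dd/HH/mm"; it is an assumption made for illustration.
_TOKEN_MAP = {"yyyy": "%Y", "MM": "%m", "dd": "%d", "HH": "%H", "mm": "%M"}

def time_directory(partition_format: str, when: datetime) -> str:
    """Expand a Java-style time pattern into a concrete directory path."""
    fmt = partition_format
    for java_token, strftime_token in _TOKEN_MAP.items():
        fmt = fmt.replace(java_token, strftime_token)
    return when.strftime(fmt)
```

For example, a record dumped at 09:05 UTC on 2024-02-20 with the default pattern would land under the `2024/02/20/09/05` time directory.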

Example Requests
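The page gives no sample request body (this API takes none). Based on the URI shown above, a minimal request would look like the following, with the path placeholders replaced by real IDs:

```
GET /v2/{project_id}/instances/{instance_id}/connector/tasks
```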

Example Responses

Status code: 200

The Smart Connect task list is queried successfully.

{
  "tasks" : [ {
    "task_name" : "smart-connect-1571576841",
    "topics" : "topic-1643449744",
    "source_task" : {
      "current_cluster_name" : "A",
      "cluster_name" : "B",
      "direction" : "pull",
      "bootstrap_servers" : "192.168.45.58:9092,192.168.44.1:9092,192.168.41.230:9092,192.168.43.112:9092",
      "instance_id" : "59f6d088-****-****-****-********",
      "consumer_strategy" : "earliest",
      "sync_consumer_offsets_enabled" : false,
      "rename_topic_enabled" : false,
      "provenance_header_enabled" : false,
      "security_protocol" : "PLAINTEXT",
      "sasl_mechanism" : "PLAIN",
      "user_name" : "",
      "topics_mapping" : "",
      "compression_type" : "none",
      "task_num" : 2,
      "replication_factor" : 3
    },
    "source_type" : "KAFKA_REPLICATOR_SOURCE",
    "sink_task" : null,
    "sink_type" : "NONE",
    "id" : "194917d0-****-****-****-********",
    "status" : "RUNNING",
    "create_time" : 1708427753133
  }, {
    "task_name" : "smart-connect-1",
    "topics_regex" : "topic-obs*",
    "source_task" : null,
    "source_type" : "NONE",
    "sink_task" : {
      "consumer_strategy" : "earliest",
      "destination_file_type" : "TEXT",
      "obs_bucket_name" : "abcabc",
      "obs_path" : "obsTransfer-1810125534",
      "partition_format" : "yyyy/MM/dd/HH/mm",
      "record_delimiter" : "\\n",
      "deliver_time_interval" : 300,
      "obs_part_size" : 5242880,
      "flush_size" : 1000000,
      "timezone" : "Asia/Chongqing",
      "schema_generator_class" : "io.confluent.connect.storage.hive.schema.DefaultSchemaGenerator",
      "partitioner_class" : "io.confluent.connect.storage.partitioner.TimeBasedPartitioner",
      "value_converter" : "org.apache.kafka.connect.converters.ByteArrayConverter",
      "key_converter" : "org.apache.kafka.connect.converters.ByteArrayConverter",
      "store_keys" : false,
      "kv_delimiter" : ":"
    },
    "sink_type" : "OBS_SINK",
    "id" : "3c0ac4d1-****-****-****-********",
    "status" : "RUNNING",
    "create_time" : 1708565483911
  }, {
    "task_name" : "smart-connect-121248117",
    "topics" : "topic-sc",
    "source_task" : {
      "redis_address" : "192.168.91.179:6379",
      "redis_type" : "standalone",
      "dcs_instance_id" : "949190a2-598a-4afd-99a8-dad3cae1e7cd",
      "sync_mode" : "RDB_ONLY",
      "full_sync_wait_ms" : 13000,
      "full_sync_max_retry" : 4,
      "ratelimit" : -1
    },
    "source_type" : "REDIS_REPLICATOR_SOURCE",
    "sink_task" : {
      "redis_address" : "192.168.119.51:6379",
      "redis_type" : "standalone",
      "dcs_instance_id" : "9b981368-a8e3-416a-87d9-1581a968b41b",
      "target_db" : -1
    },
    "sink_type" : "REDIS_REPLICATOR_SINK",
    "id" : "8a205bbd-****-****-****-********",
    "status" : "RUNNING",
    "create_time" : 1708427753133
  } ],
  "total_number" : 3,
  "max_tasks" : 18,
  "quota_tasks" : 18
}
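A quick way to sanity-check a response like the one above is to group tasks by source type, sink type, and status. The Python sketch below assumes only the response fields documented in Tables 3 and 4:

```python
import json
from collections import Counter

def summarize(body: str) -> dict:
    """Count tasks per (source_type, sink_type, status) in a list response."""
    payload = json.loads(body)
    counts = Counter(
        (t.get("source_type"), t.get("sink_type"), t.get("status"))
        for t in payload.get("tasks", [])
    )
    return {
        "total_number": payload.get("total_number"),
        "by_kind": dict(counts),
    }
```

Run against the example response above, this reports 3 tasks total: one Kafka-replicator source, one OBS sink, and one Redis-to-Redis replication, all RUNNING.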

SDK Sample Code

The following shows the SDK sample code.

Java
package com.huaweicloud.sdk.test;

import com.huaweicloud.sdk.core.auth.ICredential;
import com.huaweicloud.sdk.core.auth.BasicCredentials;
import com.huaweicloud.sdk.core.exception.ConnectionException;
import com.huaweicloud.sdk.core.exception.RequestTimeoutException;
import com.huaweicloud.sdk.core.exception.ServiceResponseException;
import com.huaweicloud.sdk.kafka.v2.region.KafkaRegion;
import com.huaweicloud.sdk.kafka.v2.*;
import com.huaweicloud.sdk.kafka.v2.model.*;


public class ListConnectorTasksSolution {

    public static void main(String[] args) {
        // The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
        // In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
        String ak = System.getenv("CLOUD_SDK_AK");
        String sk = System.getenv("CLOUD_SDK_SK");
        String projectId = "{project_id}";

        ICredential auth = new BasicCredentials()
                .withProjectId(projectId)
                .withAk(ak)
                .withSk(sk);

        KafkaClient client = KafkaClient.newBuilder()
                .withCredential(auth)
                .withRegion(KafkaRegion.valueOf("<YOUR REGION>"))
                .build();
        ListConnectorTasksRequest request = new ListConnectorTasksRequest();
        request.withInstanceId("{instance_id}");
        try {
            ListConnectorTasksResponse response = client.listConnectorTasks(request);
            System.out.println(response.toString());
        } catch (ConnectionException e) {
            e.printStackTrace();
        } catch (RequestTimeoutException e) {
            e.printStackTrace();
        } catch (ServiceResponseException e) {
            e.printStackTrace();
            System.out.println(e.getHttpStatusCode());
            System.out.println(e.getRequestId());
            System.out.println(e.getErrorCode());
            System.out.println(e.getErrorMsg());
        }
    }
}
Python
# coding: utf-8

import os
from huaweicloudsdkcore.auth.credentials import BasicCredentials
from huaweicloudsdkkafka.v2.region.kafka_region import KafkaRegion
from huaweicloudsdkcore.exceptions import exceptions
from huaweicloudsdkkafka.v2 import *

if __name__ == "__main__":
    # The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
    # In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
    ak = os.environ["CLOUD_SDK_AK"]
    sk = os.environ["CLOUD_SDK_SK"]
    projectId = "{project_id}"

    credentials = BasicCredentials(ak, sk, projectId)

    client = KafkaClient.new_builder() \
        .with_credentials(credentials) \
        .with_region(KafkaRegion.value_of("<YOUR REGION>")) \
        .build()

    try:
        request = ListConnectorTasksRequest()
        request.instance_id = "{instance_id}"
        response = client.list_connector_tasks(request)
        print(response)
    except exceptions.ClientRequestException as e:
        print(e.status_code)
        print(e.request_id)
        print(e.error_code)
        print(e.error_msg)
Go
package main

import (
    "fmt"
    "os"
    "github.com/huaweicloud/huaweicloud-sdk-go-v3/core/auth/basic"
    kafka "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2"
    "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/model"
    region "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/region"
)

func main() {
    // The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
    // In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
    ak := os.Getenv("CLOUD_SDK_AK")
    sk := os.Getenv("CLOUD_SDK_SK")
    projectId := "{project_id}"

    auth := basic.NewCredentialsBuilder().
        WithAk(ak).
        WithSk(sk).
        WithProjectId(projectId).
        Build()

    client := kafka.NewKafkaClient(
        kafka.KafkaClientBuilder().
            WithRegion(region.ValueOf("<YOUR REGION>")).
            WithCredential(auth).
            Build())

    request := &model.ListConnectorTasksRequest{}
    request.InstanceId = "{instance_id}"
    response, err := client.ListConnectorTasks(request)
    if err == nil {
        fmt.Printf("%+v\n", response)
    } else {
        fmt.Println(err)
    }
}

For SDK sample code in more programming languages, see the Sample Code tab in API Explorer, which can automatically generate the corresponding SDK sample code.

Status Codes

| Status Code | Description |
| --- | --- |
| 200 | The Smart Connect task list is queried successfully. |

Error Codes

See Error Codes.

Related Documents