
Creating a Smart Connect Task

Function

This API is used to create a Smart Connect task.

Calling Method

For details, see How to Call APIs.

URI

POST /v2/{project_id}/instances/{instance_id}/connector/tasks

Table 1 Path parameters

  • project_id: String. Project ID. For details about how to obtain it, see Obtaining a Project ID.
  • instance_id: String. Instance ID.

Request Parameters

Table 2 Request body parameters

  • task_name: String. Smart Connect task name.
  • start_later: Boolean. Whether to start the task later. Set it to false to start the task immediately after creation, or to true to start the task manually from the task list later.
  • topics: String. Topics configured for the Smart Connect task.
  • topics_regex: String. Regular expression of the topics configured for the Smart Connect task.
  • source_type: String. Source type of the Smart Connect task.
  • source_task: SmartConnectTaskReqSourceConfig object. Source configuration of the Smart Connect task.
  • sink_type: String. Sink (target) type of the Smart Connect task.
  • sink_task: SmartConnectTaskReqSinkConfig object. Sink (target) configuration of the Smart Connect task.

Table 3 SmartConnectTaskReqSourceConfig

  • current_cluster_name: String. Alias of the current Kafka instance. (Required only when the source type is Kafka.)
  • cluster_name: String. Alias of the peer Kafka instance. (Required only when the source type is Kafka.)
  • user_name: String. Username set when SASL_SSL was enabled for the peer Kafka instance, or the username of a SASL_SSL user created on it. (Required only when the source type is Kafka and the peer Kafka instance uses SASL_SSL authentication.)
  • password: String. Password set when SASL_SSL was enabled for the peer Kafka instance, or the password of a SASL_SSL user created on it. (Required only when the source type is Kafka and the peer Kafka instance uses SASL_SSL authentication.)
  • sasl_mechanism: String. Authentication mechanism of the peer Kafka instance. (Required only when the source type is Kafka and the authentication method is SASL_SSL.)
  • instance_id: String. ID of the peer Kafka instance. (Required only when the source type is Kafka. Specify only one of instance_id and bootstrap_servers.)
  • bootstrap_servers: String. Address of the peer Kafka instance. (Required only when the source type is Kafka. Specify only one of instance_id and bootstrap_servers.)
  • security_protocol: String. Authentication method of the peer Kafka instance. (Required only when the source type is Kafka.) The following two methods are supported:
      • SASL_SSL: SASL_SSL is enabled for the instance.
      • PLAINTEXT: SASL_SSL is disabled for the instance.
  • direction: String. Replication direction. pull copies data from the peer Kafka instance to the current instance, push copies data from the current Kafka instance to the peer instance, and two-way replicates data between the two instances in both directions. (Required only when the source type is Kafka.)
  • sync_consumer_offsets_enabled: Boolean. Whether to synchronize consumer offsets. (Required only when the source type is Kafka.)
  • replication_factor: Integer. Number of replicas for topics that are automatically created in the peer instance. The value cannot exceed the number of brokers of the peer instance. This parameter takes precedence over the default.replication.factor setting of the peer instance. (Required only when the source type is Kafka.)
  • task_num: Integer. Number of data replication tasks. The default value is 2 and keeping the default is recommended. If the replication direction is two-way, the actual number of tasks is twice the configured value. (Required only when the source type is Kafka.)
  • rename_topic_enabled: Boolean. Whether to rename topics, that is, prefix the target topic name with the alias of the source Kafka instance to form the new target topic name. (Required only when the source type is Kafka.)
  • provenance_header_enabled: Boolean. Whether replicated messages in the target topic carry a header that records the message source. Enable this for two-way replication to prevent circular replication. (Required only when the source type is Kafka.)
  • consumer_strategy: String. Start offset. latest consumes the latest data and earliest consumes the earliest data. (Required only when the source type is Kafka.)
  • compression_type: String. Compression algorithm used for replicated messages. (Required only when the source type is Kafka.) Options:
      • none
      • gzip
      • snappy
      • lz4
      • zstd
  • topics_mapping: String. Topic mapping used to customize target topic names. Topic renaming and topic mapping cannot be configured at the same time. Use the format "source topic:target topic" and separate multiple mappings with commas, for example topic-sc-1:topic-sc-2,topic-sc-3:topic-sc-4 (see the sketch after this table). (Required only when the source type is Kafka.)
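
The topics_mapping value is a plain string in the format described above. As a small illustration only (a hypothetical helper, not part of the API), the mapping string can be assembled from a dictionary of source-to-target topic pairs:

    # Illustration only: build a topics_mapping string in the
    # "source topic:target topic,source topic:target topic" format described above.
    mapping = {"topic-sc-1": "topic-sc-2", "topic-sc-3": "topic-sc-4"}
    topics_mapping = ",".join(f"{src}:{dst}" for src, dst in mapping.items())
    print(topics_mapping)  # topic-sc-1:topic-sc-2,topic-sc-3:topic-sc-4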

Table 4 SmartConnectTaskReqSinkConfig

  • consumer_strategy: String. Dumping start offset. latest dumps the latest data and earliest dumps the earliest data. (Required only when the sink type is OBS.)
  • destination_file_type: String. Dump file format. Only TEXT is currently supported. (Required only when the sink type is OBS.)
  • deliver_time_interval: Integer. Dumping interval in seconds. The default value is 300. (Required only when the sink type is OBS.)
  • access_key: String. Access key ID (AK). (Required only when the sink type is OBS.)
  • secret_key: String. Secret access key (SK) used together with the access key ID. (Required only when the sink type is OBS.)
  • obs_bucket_name: String. Dumping destination, that is, the name of the OBS bucket that stores the topic data. (Required only when the sink type is OBS.)
  • obs_path: String. Dumping directory, that is, the OBS directory that stores the topic data. Separate directory levels with "/". (Required only when the sink type is OBS.)
  • partition_format: String. Time-based directory format. (Required only when the sink type is OBS.)
      • yyyy: year
      • yyyy/MM: year/month
      • yyyy/MM/dd: year/month/day
      • yyyy/MM/dd/HH: year/month/day/hour
      • yyyy/MM/dd/HH/mm: year/month/day/hour/minute
  • record_delimiter: String. Record delimiter used to separate the user data written to dump files. (Required only when the sink type is OBS.) Options:
      • Comma ","
      • Semicolon ";"
      • Vertical bar "|"
      • Newline "\n"
      • NULL
  • store_keys: Boolean. Whether to dump message keys. (Required only when the sink type is OBS.)

Response Parameters

Status code: 200

Table 5 Response body parameters

  • task_name: String. Smart Connect task name.
  • topics: String. Topics configured for the Smart Connect task.
  • topics_regex: String. Regular expression of the topics configured for the Smart Connect task.
  • source_type: String. Source type of the Smart Connect task.
  • source_task: SmartConnectTaskRespSourceConfig object. Source configuration of the Smart Connect task.
  • sink_type: String. Sink (target) type of the Smart Connect task.
  • sink_task: SmartConnectTaskRespSinkConfig object. Sink (target) configuration of the Smart Connect task.
  • id: String. ID of the Smart Connect task.
  • status: String. Status of the Smart Connect task.
  • create_time: Long. Time when the Smart Connect task was created.

Table 6 SmartConnectTaskRespSourceConfig

  • current_cluster_name: String. Alias of the current Kafka instance. (Displayed only when the source type is Kafka.)
  • cluster_name: String. Alias of the peer Kafka instance. (Displayed only when the source type is Kafka.)
  • user_name: String. Username of the peer Kafka instance. (Displayed only when the source type is Kafka.)
  • sasl_mechanism: String. Authentication mechanism of the peer Kafka instance. (Displayed only when the source type is Kafka.)
  • instance_id: String. ID of the peer Kafka instance. (Displayed only when the source type is Kafka.)
  • bootstrap_servers: String. Address of the peer Kafka instance. (Displayed only when the source type is Kafka.)
  • security_protocol: String. Authentication method of the peer Kafka instance. (Displayed only when the source type is Kafka.)
  • direction: String. Replication direction. (Displayed only when the source type is Kafka.)
  • sync_consumer_offsets_enabled: Boolean. Whether consumer offsets are synchronized. (Displayed only when the source type is Kafka.)
  • replication_factor: Integer. Number of replicas. (Displayed only when the source type is Kafka.)
  • task_num: Integer. Number of tasks. (Displayed only when the source type is Kafka.)
  • rename_topic_enabled: Boolean. Whether topics are renamed. (Displayed only when the source type is Kafka.)
  • provenance_header_enabled: Boolean. Whether a provenance header is added. (Displayed only when the source type is Kafka.)
  • consumer_strategy: String. Start offset. latest consumes the latest data and earliest consumes the earliest data. (Displayed only when the source type is Kafka.)
  • compression_type: String. Compression algorithm. (Displayed only when the source type is Kafka.)
  • topics_mapping: String. Topic mapping. (Displayed only when the source type is Kafka.)

Table 7 SmartConnectTaskRespSinkConfig

  • consumer_strategy: String. Dumping start offset. latest dumps the latest data and earliest dumps the earliest data. (Displayed only when the sink type is OBS.)
  • destination_file_type: String. Dump file format. Only TEXT is currently supported. (Displayed only when the sink type is OBS.)
  • deliver_time_interval: Integer. Dumping interval in seconds. (Displayed only when the sink type is OBS.)
  • obs_bucket_name: String. Dumping destination (OBS bucket name). (Displayed only when the sink type is OBS.)
  • obs_path: String. Dumping directory. (Displayed only when the sink type is OBS.)
  • partition_format: String. Time-based directory format. (Displayed only when the sink type is OBS.)
  • record_delimiter: String. Record delimiter. (Displayed only when the sink type is OBS.)
  • store_keys: Boolean. Whether message keys are dumped. (Displayed only when the sink type is OBS.)
  • obs_part_size: Integer. Size, in bytes, that each file must reach before it is uploaded. The default value is 5242880. (Displayed only when the sink type is OBS.)
  • flush_size: Integer. flush_size. (Displayed only when the sink type is OBS.)
  • timezone: String. Time zone. (Displayed only when the sink type is OBS.)
  • schema_generator_class: String. Schema generator class. The default value is "io.confluent.connect.storage.hive.schema.DefaultSchemaGenerator". (Displayed only when the sink type is OBS.)
  • partitioner_class: String. Partitioner class. The default value is "io.confluent.connect.storage.partitioner.TimeBasedPartitioner". (Displayed only when the sink type is OBS.)
  • value_converter: String. Value converter. The default value is "org.apache.kafka.connect.converters.ByteArrayConverter". (Displayed only when the sink type is OBS.)
  • key_converter: String. Key converter. The default value is "org.apache.kafka.connect.converters.ByteArrayConverter". (Displayed only when the sink type is OBS.)
  • kv_delimiter: String. Key-value delimiter. The default value is ":". (Displayed only when the sink type is OBS.)

Example Requests

  • Creating a dumping task that starts immediately.

    POST https://{endpoint}/v2/{project_id}/instances/{instance_id}/connector/tasks
    
    {
      "task_name" : "smart-connect-1",
      "start_later" : false,
      "source_type" : "NONE",
      "topics_regex" : "topic-obs*",
      "sink_type" : "OBS_SINK",
      "sink_task" : {
        "consumer_strategy" : "earliest",
        "destination_file_type" : "TEXT",
        "deliver_time_interval" : 300,
        "access_key" : "********",
        "secret_key" : "********",
        "obs_bucket_name" : "obs_bucket",
        "obs_path" : "obsTransfer-1810125534",
        "partition_format" : "yyyy/MM/dd/HH/mm",
        "record_delimiter" : "\\n",
        "store_keys" : false
      }
    }
  • Creating a Kafka data replication task that starts later.

    POST https://{endpoint}/v2/{project_id}/instances/{instance_id}/connector/tasks
    
    {
      "task_name" : "smart-connect-2",
      "start_later" : true,
      "source_type" : "KAFKA_REPLICATOR_SOURCE",
      "source_task" : {
        "current_cluster_name" : "A",
        "cluster_name" : "B",
        "user_name" : "user1",
        "password" : "********",
        "sasl_mechanism" : "SCRAM-SHA-512",
        "instance_id" : "b54c9dd8-********-********",
        "direction" : "two-way",
        "sync_consumer_offsets_enabled" : false,
        "replication_factor" : 3,
        "task_num" : 2,
        "rename_topic_enabled" : false,
        "provenance_header_enabled" : true,
        "consumer_strategy" : "latest",
        "compression_type" : "snappy",
        "topics_mapping" : "topic-sc-1:topic-sc-3,topic-sc-2:topic-sc-4"
      }
    }
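
The example requests above can also be sent with any HTTP client instead of the SDKs. The following is a minimal sketch, not an official example, assuming token-based authentication with an X-Auth-Token header as described in How to Call APIs; the endpoint, project ID, instance ID, and token are placeholders.

    # Minimal sketch: send the first example request with the Python "requests"
    # library, assuming X-Auth-Token (token) authentication. Replace the
    # placeholders with real values before running.
    import requests

    endpoint = "https://{endpoint}"      # Kafka API endpoint of your region
    project_id = "{project_id}"
    instance_id = "{instance_id}"
    token = "<your IAM token>"           # obtained as described in How to Call APIs

    url = f"{endpoint}/v2/{project_id}/instances/{instance_id}/connector/tasks"
    body = {
        "task_name": "smart-connect-1",
        "start_later": False,
        "source_type": "NONE",
        "topics_regex": "topic-obs*",
        "sink_type": "OBS_SINK",
        "sink_task": {
            "consumer_strategy": "earliest",
            "destination_file_type": "TEXT",
            "deliver_time_interval": 300,
            "access_key": "********",
            "secret_key": "********",
            "obs_bucket_name": "obs_bucket",
            "obs_path": "obsTransfer-1810125534",
            "partition_format": "yyyy/MM/dd/HH/mm",
            "record_delimiter": "\n",
            "store_keys": False,
        },
    }

    resp = requests.post(url, json=body, headers={"X-Auth-Token": token})
    print(resp.status_code)
    print(resp.json())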

Example Responses

Status code: 200

The Smart Connect task is created successfully.

{
  "task_name" : "smart-connect-121248117",
  "topics" : "topic-1643449744",
  "source_task" : {
    "current_cluster_name" : "A",
    "cluster_name" : "B",
    "direction" : "pull",
    "bootstrap_servers" : "192.168.45.58:9092,192.168.44.1:9092,192.168.41.230:9092,192.168.43.112:9092",
    "instance_id" : "59f6d088-****-****-****-********",
    "consumer_strategy" : "earliest",
    "sync_consumer_offsets_enabled" : false,
    "rename_topic_enabled" : false,
    "provenance_header_enabled" : false,
    "security_protocol" : "PLAINTEXT",
    "sasl_mechanism" : "PLAIN",
    "user_name" : "",
    "topics_mapping" : "",
    "compression_type" : "none",
    "task_num" : 2,
    "replication_factor" : 3
  },
  "source_type" : "KAFKA_REPLICATOR_SOURCE",
  "sink_task" : null,
  "sink_type" : "NONE",
  "id" : "194917d0-****-****-****-********",
  "status" : "RUNNING",
  "create_time" : 1708427753133
}

SDK Code Examples

The following are SDK code examples.

  • Creating a dumping task that starts immediately (Java).

    package com.huaweicloud.sdk.test;
    
    import com.huaweicloud.sdk.core.auth.ICredential;
    import com.huaweicloud.sdk.core.auth.BasicCredentials;
    import com.huaweicloud.sdk.core.exception.ConnectionException;
    import com.huaweicloud.sdk.core.exception.RequestTimeoutException;
    import com.huaweicloud.sdk.core.exception.ServiceResponseException;
    import com.huaweicloud.sdk.kafka.v2.region.KafkaRegion;
    import com.huaweicloud.sdk.kafka.v2.*;
    import com.huaweicloud.sdk.kafka.v2.model.*;
    
    
    public class CreateConnectorTaskSolution {
    
        public static void main(String[] args) {
            // The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
            // In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
            String ak = System.getenv("CLOUD_SDK_AK");
            String sk = System.getenv("CLOUD_SDK_SK");
            String projectId = "{project_id}";
    
            ICredential auth = new BasicCredentials()
                    .withProjectId(projectId)
                    .withAk(ak)
                    .withSk(sk);
    
            KafkaClient client = KafkaClient.newBuilder()
                    .withCredential(auth)
                    .withRegion(KafkaRegion.valueOf("<YOUR REGION>"))
                    .build();
            CreateConnectorTaskRequest request = new CreateConnectorTaskRequest();
            request.withInstanceId("{instance_id}");
            CreateSmartConnectTaskReq body = new CreateSmartConnectTaskReq();
            SmartConnectTaskReqSinkConfig sinkTaskbody = new SmartConnectTaskReqSinkConfig();
            sinkTaskbody.withConsumerStrategy("earliest")
                .withDestinationFileType("TEXT")
                .withDeliverTimeInterval(300)
                .withAccessKey("********")
                .withSecretKey("********")
                .withObsBucketName("obs_bucket")
                .withObsPath("obsTransfer-1810125534")
                .withPartitionFormat("yyyy/MM/dd/HH/mm")
                .withRecordDelimiter("\n")
                .withStoreKeys(false);
            body.withSinkTask(sinkTaskbody);
            body.withSinkType(CreateSmartConnectTaskReq.SinkTypeEnum.fromValue("OBS_SINK"));
            body.withSourceType(CreateSmartConnectTaskReq.SourceTypeEnum.fromValue("NONE"));
            body.withTopicsRegex("topic-obs*");
            body.withStartLater(false);
            body.withTaskName("smart-connect-1");
            request.withBody(body);
            try {
                CreateConnectorTaskResponse response = client.createConnectorTask(request);
                System.out.println(response.toString());
            } catch (ConnectionException e) {
                e.printStackTrace();
            } catch (RequestTimeoutException e) {
                e.printStackTrace();
            } catch (ServiceResponseException e) {
                e.printStackTrace();
                System.out.println(e.getHttpStatusCode());
                System.out.println(e.getRequestId());
                System.out.println(e.getErrorCode());
                System.out.println(e.getErrorMsg());
            }
        }
    }
    
  • Creating a Kafka data replication task that starts later (Java).

    package com.huaweicloud.sdk.test;
    
    import com.huaweicloud.sdk.core.auth.ICredential;
    import com.huaweicloud.sdk.core.auth.BasicCredentials;
    import com.huaweicloud.sdk.core.exception.ConnectionException;
    import com.huaweicloud.sdk.core.exception.RequestTimeoutException;
    import com.huaweicloud.sdk.core.exception.ServiceResponseException;
    import com.huaweicloud.sdk.kafka.v2.region.KafkaRegion;
    import com.huaweicloud.sdk.kafka.v2.*;
    import com.huaweicloud.sdk.kafka.v2.model.*;
    
    
    public class CreateConnectorTaskSolution {
    
        public static void main(String[] args) {
            // The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
            // In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
            String ak = System.getenv("CLOUD_SDK_AK");
            String sk = System.getenv("CLOUD_SDK_SK");
            String projectId = "{project_id}";
    
            ICredential auth = new BasicCredentials()
                    .withProjectId(projectId)
                    .withAk(ak)
                    .withSk(sk);
    
            KafkaClient client = KafkaClient.newBuilder()
                    .withCredential(auth)
                    .withRegion(KafkaRegion.valueOf("<YOUR REGION>"))
                    .build();
            CreateConnectorTaskRequest request = new CreateConnectorTaskRequest();
            request.withInstanceId("{instance_id}");
            CreateSmartConnectTaskReq body = new CreateSmartConnectTaskReq();
            SmartConnectTaskReqSourceConfig sourceTaskbody = new SmartConnectTaskReqSourceConfig();
            sourceTaskbody.withCurrentClusterName("A")
                .withClusterName("B")
                .withUserName("user1")
                .withPassword("********")
                .withSaslMechanism("SCRAM-SHA-512")
                .withInstanceId("b54c9dd8-********-********")
                .withDirection("two-way")
                .withSyncConsumerOffsetsEnabled(false)
                .withReplicationFactor(3)
                .withTaskNum(2)
                .withRenameTopicEnabled(false)
                .withProvenanceHeaderEnabled(true)
                .withConsumerStrategy("latest")
                .withCompressionType("snappy")
                .withTopicsMapping("topic-sc-1:topic-sc-3,topic-sc-2:topic-sc-4");
            body.withSourceTask(sourceTaskbody);
            body.withSourceType(CreateSmartConnectTaskReq.SourceTypeEnum.fromValue("KAFKA_REPLICATOR_SOURCE"));
            body.withStartLater(true);
            body.withTaskName("smart-connect-2");
            request.withBody(body);
            try {
                CreateConnectorTaskResponse response = client.createConnectorTask(request);
                System.out.println(response.toString());
            } catch (ConnectionException e) {
                e.printStackTrace();
            } catch (RequestTimeoutException e) {
                e.printStackTrace();
            } catch (ServiceResponseException e) {
                e.printStackTrace();
                System.out.println(e.getHttpStatusCode());
                System.out.println(e.getRequestId());
                System.out.println(e.getErrorCode());
                System.out.println(e.getErrorMsg());
            }
        }
    }
    
  • Creating a dumping task that starts immediately (Python).

    # coding: utf-8
    
    import os
    from huaweicloudsdkcore.auth.credentials import BasicCredentials
    from huaweicloudsdkkafka.v2.region.kafka_region import KafkaRegion
    from huaweicloudsdkcore.exceptions import exceptions
    from huaweicloudsdkkafka.v2 import *
    
    if __name__ == "__main__":
        # The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
        # In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
        ak = os.environ["CLOUD_SDK_AK"]
        sk = os.environ["CLOUD_SDK_SK"]
        projectId = "{project_id}"
    
        credentials = BasicCredentials(ak, sk, projectId)
    
        client = KafkaClient.new_builder() \
            .with_credentials(credentials) \
            .with_region(KafkaRegion.value_of("<YOUR REGION>")) \
            .build()
    
        try:
            request = CreateConnectorTaskRequest()
            request.instance_id = "{instance_id}"
            sinkTaskbody = SmartConnectTaskReqSinkConfig(
                consumer_strategy="earliest",
                destination_file_type="TEXT",
                deliver_time_interval=300,
                access_key="********",
                secret_key="********",
                obs_bucket_name="obs_bucket",
                obs_path="obsTransfer-1810125534",
                partition_format="yyyy/MM/dd/HH/mm",
                record_delimiter="\n",
                store_keys=False
            )
            request.body = CreateSmartConnectTaskReq(
                sink_task=sinkTaskbody,
                sink_type="OBS_SINK",
                source_type="NONE",
                topics_regex="topic-obs*",
                start_later=False,
                task_name="smart-connect-1"
            )
            response = client.create_connector_task(request)
            print(response)
        except exceptions.ClientRequestException as e:
            print(e.status_code)
            print(e.request_id)
            print(e.error_code)
            print(e.error_msg)
    
  • Creating a Kafka data replication task that starts later (Python).

    # coding: utf-8
    
    import os
    from huaweicloudsdkcore.auth.credentials import BasicCredentials
    from huaweicloudsdkkafka.v2.region.kafka_region import KafkaRegion
    from huaweicloudsdkcore.exceptions import exceptions
    from huaweicloudsdkkafka.v2 import *
    
    if __name__ == "__main__":
        # The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
        # In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
        ak = os.environ["CLOUD_SDK_AK"]
        sk = os.environ["CLOUD_SDK_SK"]
        projectId = "{project_id}"
    
        credentials = BasicCredentials(ak, sk, projectId)
    
        client = KafkaClient.new_builder() \
            .with_credentials(credentials) \
            .with_region(KafkaRegion.value_of("<YOUR REGION>")) \
            .build()
    
        try:
            request = CreateConnectorTaskRequest()
            request.instance_id = "{instance_id}"
            sourceTaskbody = SmartConnectTaskReqSourceConfig(
                current_cluster_name="A",
                cluster_name="B",
                user_name="user1",
                password="********",
                sasl_mechanism="SCRAM-SHA-512",
                instance_id="b54c9dd8-********-********",
                direction="two-way",
                sync_consumer_offsets_enabled=False,
                replication_factor=3,
                task_num=2,
                rename_topic_enabled=False,
                provenance_header_enabled=True,
                consumer_strategy="latest",
                compression_type="snappy",
                topics_mapping="topic-sc-1:topic-sc-3,topic-sc-2:topic-sc-4"
            )
            request.body = CreateSmartConnectTaskReq(
                source_task=sourceTaskbody,
                source_type="KAFKA_REPLICATOR_SOURCE",
                start_later=True,
                task_name="smart-connect-2"
            )
            response = client.create_connector_task(request)
            print(response)
        except exceptions.ClientRequestException as e:
            print(e.status_code)
            print(e.request_id)
            print(e.error_code)
            print(e.error_msg)
    
  • Creating a dumping task that starts immediately (Go).

    package main
    
    import (
    	"fmt"
    	"os" // required by os.Getenv below
    	"github.com/huaweicloud/huaweicloud-sdk-go-v3/core/auth/basic"
    	kafka "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2"
    	"github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/model"
    	region "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/region"
    )
    
    func main() {
        // The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
        // In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
        ak := os.Getenv("CLOUD_SDK_AK")
        sk := os.Getenv("CLOUD_SDK_SK")
        projectId := "{project_id}"
    
        auth := basic.NewCredentialsBuilder().
            WithAk(ak).
            WithSk(sk).
            WithProjectId(projectId).
            Build()
    
        client := kafka.NewKafkaClient(
            kafka.KafkaClientBuilder().
                WithRegion(region.ValueOf("<YOUR REGION>")).
                WithCredential(auth).
                Build())
    
        request := &model.CreateConnectorTaskRequest{}
    	request.InstanceId = "{instance_id}"
    	consumerStrategySinkTask:= "earliest"
    	destinationFileTypeSinkTask:= "TEXT"
    	deliverTimeIntervalSinkTask:= int32(300)
    	accessKeySinkTask:= "********"
    	secretKeySinkTask:= "********"
    	obsBucketNameSinkTask:= "obs_bucket"
    	obsPathSinkTask:= "obsTransfer-1810125534"
    	partitionFormatSinkTask:= "yyyy/MM/dd/HH/mm"
    	recordDelimiterSinkTask:= "\n"
    	storeKeysSinkTask:= false
    	sinkTaskbody := &model.SmartConnectTaskReqSinkConfig{
    		ConsumerStrategy: &consumerStrategySinkTask,
    		DestinationFileType: &destinationFileTypeSinkTask,
    		DeliverTimeInterval: &deliverTimeIntervalSinkTask,
    		AccessKey: &accessKeySinkTask,
    		SecretKey: &secretKeySinkTask,
    		ObsBucketName: &obsBucketNameSinkTask,
    		ObsPath: &obsPathSinkTask,
    		PartitionFormat: &partitionFormatSinkTask,
    		RecordDelimiter: &recordDelimiterSinkTask,
    		StoreKeys: &storeKeysSinkTask,
    	}
    	sinkTypeCreateSmartConnectTaskReq:= model.GetCreateSmartConnectTaskReqSinkTypeEnum().OBS_SINK
    	sourceTypeCreateSmartConnectTaskReq:= model.GetCreateSmartConnectTaskReqSourceTypeEnum().NONE
    	topicsRegexCreateSmartConnectTaskReq:= "topic-obs*"
    	startLaterCreateSmartConnectTaskReq:= false
    	taskNameCreateSmartConnectTaskReq:= "smart-connect-1"
    	request.Body = &model.CreateSmartConnectTaskReq{
    		SinkTask: sinkTaskbody,
    		SinkType: &sinkTypeCreateSmartConnectTaskReq,
    		SourceType: &sourceTypeCreateSmartConnectTaskReq,
    		TopicsRegex: &topicsRegexCreateSmartConnectTaskReq,
    		StartLater: &startLaterCreateSmartConnectTaskReq,
    		TaskName: &taskNameCreateSmartConnectTaskReq,
    	}
    	response, err := client.CreateConnectorTask(request)
    	if err == nil {
            fmt.Printf("%+v\n", response)
        } else {
            fmt.Println(err)
        }
    }
    
  • Creating a Kafka data replication task that starts later (Go).

    package main
    
    import (
    	"fmt"
    	"os" // required by os.Getenv below
    	"github.com/huaweicloud/huaweicloud-sdk-go-v3/core/auth/basic"
    	kafka "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2"
    	"github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/model"
    	region "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/region"
    )
    
    func main() {
        // The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
        // In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
        ak := os.Getenv("CLOUD_SDK_AK")
        sk := os.Getenv("CLOUD_SDK_SK")
        projectId := "{project_id}"
    
        auth := basic.NewCredentialsBuilder().
            WithAk(ak).
            WithSk(sk).
            WithProjectId(projectId).
            Build()
    
        client := kafka.NewKafkaClient(
            kafka.KafkaClientBuilder().
                WithRegion(region.ValueOf("<YOUR REGION>")).
                WithCredential(auth).
                Build())
    
        request := &model.CreateConnectorTaskRequest{}
    	request.InstanceId = "{instance_id}"
    	currentClusterNameSourceTask:= "A"
    	clusterNameSourceTask:= "B"
    	userNameSourceTask:= "user1"
    	passwordSourceTask:= "********"
    	saslMechanismSourceTask:= "SCRAM-SHA-512"
    	instanceIdSourceTask:= "b54c9dd8-********-********"
    	directionSourceTask:= "two-way"
    	syncConsumerOffsetsEnabledSourceTask:= false
    	replicationFactorSourceTask:= int32(3)
    	taskNumSourceTask:= int32(2)
    	renameTopicEnabledSourceTask:= false
    	provenanceHeaderEnabledSourceTask:= true
    	consumerStrategySourceTask:= "latest"
    	compressionTypeSourceTask:= "snappy"
    	topicsMappingSourceTask:= "topic-sc-1:topic-sc-3,topic-sc-2:topic-sc-4"
    	sourceTaskbody := &model.SmartConnectTaskReqSourceConfig{
    		CurrentClusterName: &currentClusterNameSourceTask,
    		ClusterName: &clusterNameSourceTask,
    		UserName: &userNameSourceTask,
    		Password: &passwordSourceTask,
    		SaslMechanism: &saslMechanismSourceTask,
    		InstanceId: &instanceIdSourceTask,
    		Direction: &directionSourceTask,
    		SyncConsumerOffsetsEnabled: &syncConsumerOffsetsEnabledSourceTask,
    		ReplicationFactor: &replicationFactorSourceTask,
    		TaskNum: &taskNumSourceTask,
    		RenameTopicEnabled: &renameTopicEnabledSourceTask,
    		ProvenanceHeaderEnabled: &provenanceHeaderEnabledSourceTask,
    		ConsumerStrategy: &consumerStrategySourceTask,
    		CompressionType: &compressionTypeSourceTask,
    		TopicsMapping: &topicsMappingSourceTask,
    	}
    	sourceTypeCreateSmartConnectTaskReq:= model.GetCreateSmartConnectTaskReqSourceTypeEnum().KAFKA_REPLICATOR_SOURCE
    	startLaterCreateSmartConnectTaskReq:= true
    	taskNameCreateSmartConnectTaskReq:= "smart-connect-2"
    	request.Body = &model.CreateSmartConnectTaskReq{
    		SourceTask: sourceTaskbody,
    		SourceType: &sourceTypeCreateSmartConnectTaskReq,
    		StartLater: &startLaterCreateSmartConnectTaskReq,
    		TaskName: &taskNameCreateSmartConnectTaskReq,
    	}
    	response, err := client.CreateConnectorTask(request)
    	if err == nil {
            fmt.Printf("%+v\n", response)
        } else {
            fmt.Println(err)
        }
    }
    

For SDK code examples in more programming languages, see the Code Examples tab in API Explorer, where SDK code examples can be generated automatically.


Status Codes

  • 200: The Smart Connect task is created successfully.

Error Codes

See Error Codes.
