Updated on 2025-06-19 GMT+08:00

Creating a Smart Connect Task

Function

This API is used to create a Smart Connect task.

Calling Method

For details, see Calling APIs.

URI

POST /v2/{project_id}/instances/{instance_id}/connector/tasks

Table 1 Path Parameters

  • project_id (String, mandatory): Project ID. For details, see Obtaining a Project ID.

  • instance_id (String, mandatory): Instance ID.
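
If you want to sanity-check the endpoint without an SDK, the request can be issued directly over HTTPS. The following is a minimal Python sketch, assuming token-based authentication via the X-Auth-Token header; the endpoint, project ID, instance ID, token, and request body are placeholders to substitute with real values.

    # Minimal raw REST sketch (assumption: a valid IAM token is passed in
    # X-Auth-Token; endpoint/project_id/instance_id/token are placeholders).
    import json
    import urllib.request

    endpoint = "https://{endpoint}"      # replace with the actual endpoint
    project_id = "{project_id}"
    instance_id = "{instance_id}"
    token = "{token}"

    url = f"{endpoint}/v2/{project_id}/instances/{instance_id}/connector/tasks"
    body = {"task_name": "smart-connect-1", "start_later": True}

    req = urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json", "X-Auth-Token": token},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.status, resp.read().decode("utf-8"))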

Request Parameters

Table 2 Request body parameters

  • task_name (String, optional): Smart Connect task name.

  • start_later (Boolean, optional): Whether to start the task later. false: create the task and start it immediately; true: create the task now and start it manually later in the task list.

  • topics (String, optional): Topic of a Smart Connect task.

  • topics_regex (String, optional): Regular expression matching the topics of a Smart Connect task.

  • source_type (String, optional): Source type of a Smart Connect task.

  • source_task (SmartConnectTaskReqSourceConfig object, optional): Source configuration of a Smart Connect task.

  • sink_type (String, optional): Target type of a Smart Connect task.

  • sink_task (SmartConnectTaskReqSinkConfig object, optional): Target configuration of a Smart Connect task.

Table 3 SmartConnectTaskReqSourceConfig

  • current_cluster_name (String): Current Kafka instance name. (Mandatory only when the source type is Kafka.)

  • cluster_name (String): Target Kafka instance name. (Mandatory only when the source type is Kafka.)

  • user_name (String): Username set when SASL_SSL was enabled for the target Kafka instance, or when a SASL_SSL user was created. (Mandatory only when the source type is Kafka and the target Kafka authentication mode is SASL_SSL.)

  • password (String): Password set when SASL_SSL was enabled for the target Kafka instance, or when a SASL_SSL user was created. (Mandatory only when the source type is Kafka and the target Kafka authentication mode is SASL_SSL.)

  • sasl_mechanism (String): Target Kafka authentication mechanism. (Mandatory only when the source type is Kafka and the authentication mode is SASL_SSL.)

  • instance_id (String): Target Kafka instance ID. (Mandatory only when the source type is Kafka; specify either instance_id or bootstrap_servers.)

  • bootstrap_servers (String): Target Kafka instance address. (Mandatory only when the source type is Kafka; specify either instance_id or bootstrap_servers.)

  • security_protocol (String): Target Kafka authentication mode. (Mandatory only when the source type is Kafka.) There are two modes: SASL_SSL (SASL_SSL is enabled) and PLAINTEXT (SASL_SSL is disabled).

  • direction (String): Sync direction. pull replicates data from the target Kafka instance to the current one; push replicates data from the source Kafka instance to the target one; two-way replicates data between the source and target Kafka instances in both directions. (Mandatory only when the source type is Kafka.)

  • sync_consumer_offsets_enabled (Boolean): Whether to sync the consumption progress. (Mandatory only when the source type is Kafka.)

  • replication_factor (Integer): Number of replicas of a topic that is automatically created in the peer instance. The value cannot exceed the number of brokers in the peer instance. This parameter overrides the default.replication.factor setting of the target instance. (Mandatory only when the source type is Kafka.)

  • task_num (Integer): Number of data replication tasks. The default value is 2, and using the default is recommended. If the sync direction is two-way, the actual number of tasks is twice the number configured here. (Mandatory only when the source type is Kafka.)

  • rename_topic_enabled (Boolean): Whether to rename topics. If enabled, the alias of the source Kafka instance is prepended to the target topic name to form the new name of the target topic. (Mandatory only when the source type is Kafka.)

  • provenance_header_enabled (Boolean): Whether to add a source header. If enabled, the header of each message replicated into the target topic records the message source. Enable this when the sync direction is two-way to prevent infinite replication. (Mandatory only when the source type is Kafka.)

  • consumer_strategy (String): Start offset. latest: obtain the latest data; earliest: obtain the earliest data. (Mandatory only when the source type is Kafka.)

  • compression_type (String): Compression algorithm used for replicated messages: none, gzip, snappy, lz4, or zstd. (Mandatory only when the source type is Kafka.)

  • topics_mapping (String): Topic mapping, used to customize target topic names. Rename Topic and Topic Mapping cannot be configured at the same time. Format: source topic:target topic. Separate multiple mappings with commas (,), for example, topic-sc-1:topic-sc-2,topic-sc-3:topic-sc-4; a parsing sketch follows this table. (Mandatory only when the source type is Kafka.)
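
The topics_mapping format is compact, so a few lines of code remove any ambiguity. The helper below is illustrative only (it is not part of the API or the SDKs); it parses a mapping string into a dict and rejects malformed or duplicate entries.

    def parse_topics_mapping(mapping: str) -> dict:
        """Parse 'src1:dst1,src2:dst2' into {'src1': 'dst1', 'src2': 'dst2'}."""
        result = {}
        for pair in mapping.split(","):
            source, sep, target = pair.partition(":")
            if not sep or not source or not target:
                raise ValueError(f"malformed mapping entry: {pair!r}")
            if source in result:
                raise ValueError(f"duplicate source topic: {source}")
            result[source] = target
        return result

    print(parse_topics_mapping("topic-sc-1:topic-sc-2,topic-sc-3:topic-sc-4"))
    # {'topic-sc-1': 'topic-sc-2', 'topic-sc-3': 'topic-sc-4'}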

Table 4 SmartConnectTaskReqSinkConfig

  • consumer_strategy (String): Start offset. latest: obtain the latest data; earliest: obtain the earliest data. (Mandatory only when the target type is OBS.)

  • destination_file_type (String): Dump file format. Only TEXT is supported. (Mandatory only when the target type is OBS.)

  • deliver_time_interval (Integer): Data dumping interval, in seconds. The default interval is 300s. (Mandatory only when the target type is OBS.)

  • access_key (String): AK, the access key ID. (Mandatory only when the target type is OBS.)

  • secret_key (String): SK, the secret access key used together with the access key ID. (Mandatory only when the target type is OBS.)

  • obs_bucket_name (String): Dumping address, that is, the OBS bucket used to store the topic data. (Mandatory only when the target type is OBS.)

  • obs_path (String): Dumping directory, the directory for storing topic files dumped to OBS. Use slashes (/) to separate directory levels. (Mandatory only when the target type is OBS.)

  • partition_format (String): Time directory format; see the sketch after this table. (Mandatory only when the target type is OBS.) Valid values: yyyy (year); yyyy/MM (year and month); yyyy/MM/dd (year, month, and day); yyyy/MM/dd/HH (year, month, day, and hour); yyyy/MM/dd/HH/mm (year, month, day, hour, and minute).

  • record_delimiter (String): Line break used to separate the user data written into the dump file. (Mandatory only when the target type is OBS.) Valid values: comma (,), semicolon (;), vertical bar (|), newline (\n), and NULL.

  • store_keys (Boolean): Whether to dump keys. (Mandatory only when the target type is OBS.)
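
To make partition_format concrete: each value expands to a date-based directory path under obs_path. A minimal sketch, assuming the obvious strftime equivalents (the mapping table below is an illustration, not taken from the service):

    # Illustration only: how a partition_format value could map to the time
    # directory a record lands in. The strftime table is an assumption.
    from datetime import datetime, timezone

    STRFTIME_EQUIVALENTS = {
        "yyyy": "%Y",
        "yyyy/MM": "%Y/%m",
        "yyyy/MM/dd": "%Y/%m/%d",
        "yyyy/MM/dd/HH": "%Y/%m/%d/%H",
        "yyyy/MM/dd/HH/mm": "%Y/%m/%d/%H/%M",
    }

    def time_directory(obs_path, partition_format, now=None):
        now = now or datetime.now(timezone.utc)
        return obs_path + "/" + now.strftime(STRFTIME_EQUIVALENTS[partition_format])

    print(time_directory("obsTransfer-1810125534", "yyyy/MM/dd/HH/mm"))
    # e.g. obsTransfer-1810125534/2025/06/19/08/30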

Response Parameters

Status code: 200

Table 5 Response body parameters

  • task_name (String): Smart Connect task name.

  • topics (String): Topic of a Smart Connect task.

  • topics_regex (String): Topic regular expression of a Smart Connect task.

  • source_type (String): Source type of a Smart Connect task. Range: NONE (none) or KAFKA_REPLICATOR_SOURCE (Kafka data replication).

  • source_task (SmartConnectTaskRespSourceConfig object): Source configuration of a Smart Connect task.

  • sink_type (String): Target type of a Smart Connect task. Range: NONE (none) or OBS_SINK (dump).

  • sink_task (SmartConnectTaskRespSinkConfig object): Target configuration of a Smart Connect task.

  • id (String): ID of a Smart Connect task.

  • status (String): Status of a Smart Connect task.

  • create_time (Long): Time when a Smart Connect task was created.

Table 6 SmartConnectTaskRespSourceConfig

All of the following fields are displayed only when the source type is Kafka.

  • current_cluster_name (String): Current Kafka instance name.

  • cluster_name (String): Target Kafka instance name.

  • user_name (String): Username of the target Kafka instance.

  • sasl_mechanism (String): Target Kafka authentication mechanism. Range: PLAIN or SCRAM-SHA-512.

  • instance_id (String): Target Kafka instance ID.

  • bootstrap_servers (String): Target Kafka instance address.

  • security_protocol (String): Target Kafka authentication mode. Range: PLAINTEXT (SSL is disabled and data is transmitted in plaintext), SASL_SSL (SASL is used for authentication and data is encrypted with an SSL certificate for high-security transmission), or SASL_PLAINTEXT (SASL is used for authentication and data is transmitted in plaintext for better performance).

  • direction (String): Synchronization direction. Range: pull, push, or two-way.

  • sync_consumer_offsets_enabled (Boolean): Whether the consumer offset is synchronized. Range: true (yes) or false (no).

  • replication_factor (Integer): Number of replicas.

  • task_num (Integer): Number of tasks.

  • rename_topic_enabled (Boolean): Whether topics are renamed. Range: true (yes) or false (no).

  • provenance_header_enabled (Boolean): Whether the source header is added. Range: true (yes) or false (no).

  • consumer_strategy (String): Start offset. Range: latest (obtains the latest data) or earliest (obtains the earliest data).

  • compression_type (String): Compression algorithm. Range: none, gzip, snappy, lz4, or zstd.

  • topics_mapping (String): Topic mapping.

Table 7 SmartConnectTaskRespSinkConfig

All of the following fields are displayed only when the target type is OBS.

  • consumer_strategy (String): Dump start offset. Range: latest (obtains the latest data) or earliest (obtains the earliest data).

  • destination_file_type (String): Dump file format. Only TEXT is supported.

  • deliver_time_interval (Integer): Dumping period, in seconds.

  • obs_bucket_name (String): Dumping address.

  • obs_path (String): Dump directory.

  • partition_format (String): Time directory format.

  • record_delimiter (String): Line break.

  • store_keys (Boolean): Whether keys are dumped.

  • obs_part_size (Integer): Size (in bytes) of each file to be uploaded. The default value is 5242880.

  • flush_size (Integer): Size of flushed data.

  • timezone (String): Time zone.

  • schema_generator_class (String): Schema generator class. The default value is io.confluent.connect.storage.hive.schema.DefaultSchemaGenerator.

  • partitioner_class (String): Partitioner class. The default value is io.confluent.connect.storage.partitioner.TimeBasedPartitioner.

  • value_converter (String): Value converter. The default value is org.apache.kafka.connect.converters.ByteArrayConverter.

  • key_converter (String): Key converter. The default value is org.apache.kafka.connect.converters.ByteArrayConverter.

  • kv_delimiter (String): Key-value delimiter. The default value is :.
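
Several of the fields above (store_keys, record_delimiter, kv_delimiter) jointly describe how records are laid out in the dumped TEXT file. The sketch below shows one plausible reading of that layout; the real file is produced by the service, so treat this strictly as an illustration.

    # Illustration only: a plausible combination of store_keys, kv_delimiter,
    # and record_delimiter into one dumped TEXT record. Not the service's code.
    def format_record(key, value, store_keys=True, kv_delimiter=":",
                      record_delimiter="\n"):
        payload = f"{key}{kv_delimiter}{value}" if store_keys else value
        return payload + record_delimiter

    print(repr(format_record("order-1", '{"amount": 42}')))
    # 'order-1:{"amount": 42}\n'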

Example Requests

  • Creating a dumping task that starts immediately.

    POST https://{endpoint}/v2/{project_id}/instances/{instance_id}/connector/tasks
    
    {
      "task_name" : "smart-connect-1",
      "start_later" : false,
      "source_type" : "NONE",
      "topics_regex" : "topic-obs*",
      "sink_type" : "OBS_SINK",
      "sink_task" : {
        "consumer_strategy" : "earliest",
        "destination_file_type" : "TEXT",
        "deliver_time_interval" : 300,
        "access_key" : "********",
        "secret_key" : "********",
        "obs_bucket_name" : "obs_bucket",
        "obs_path" : "obsTransfer-1810125534",
        "partition_format" : "yyyy/MM/dd/HH/mm",
        "record_delimiter" : "\\n",
        "store_keys" : false
      }
    }
  • Creating a Kafka data replication task that starts later.

    POST https://{endpoint}/v2/{project_id}/instances/{instance_id}/connector/tasks
    
    {
      "task_name" : "smart-connect-2",
      "start_later" : true,
      "source_type" : "KAFKA_REPLICATOR_SOURCE",
      "source_task" : {
        "current_cluster_name" : "A",
        "cluster_name" : "B",
        "user_name" : "user1",
        "password" : "********",
        "sasl_mechanism" : "SCRAM-SHA-512",
        "instance_id" : "b54c9dd8-********-********",
        "direction" : "two-way",
        "sync_consumer_offsets_enabled" : false,
        "replication_factor" : 3,
        "task_num" : 2,
        "rename_topic_enabled" : false,
        "provenance_header_enabled" : true,
        "consumer_strategy" : "latest",
        "compression_type" : "snappy",
        "topics_mapping" : "topic-sc-1:topic-sc-3,topic-sc-2:topic-sc-4"
      }
    }

Example Responses

Status code: 200

Successful.

{
  "task_name" : "smart-connect-121248117",
  "topics" : "topic-1643449744",
  "source_task" : {
    "current_cluster_name" : "A",
    "cluster_name" : "B",
    "direction" : "pull",
    "bootstrap_servers" : "192.168.45.58:9092,192.168.44.1:9092,192.168.41.230:9092,192.168.43.112:9092",
    "instance_id" : "59f6d088-****-****-****-********",
    "consumer_strategy" : "earliest",
    "sync_consumer_offsets_enabled" : false,
    "rename_topic_enabled" : false,
    "provenance_header_enabled" : false,
    "security_protocol" : "PLAINTEXT",
    "sasl_mechanism" : "PLAIN",
    "user_name" : "",
    "topics_mapping" : "",
    "compression_type" : "none",
    "task_num" : 2,
    "replication_factor" : 3
  },
  "source_type" : "KAFKA_REPLICATOR_SOURCE",
  "sink_task" : null,
  "sink_type" : "NONE",
  "id" : "194917d0-****-****-****-********",
  "status" : "RUNNING",
  "create_time" : 1708427753133
}
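
Callers usually keep id and status from this response: id identifies the task in later operations, and status shows whether the task is already RUNNING or still waiting to be started. A minimal sketch of extracting them from a parsed response body:

    import json

    # response_text stands in for the body returned by this API.
    response_text = '{"id": "194917d0-****", "status": "RUNNING"}'
    task = json.loads(response_text)
    print(task["id"], task["status"])  # 194917d0-**** RUNNING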

SDK Sample Code

The SDK sample code is as follows.

  • Creating a dumping task that starts immediately.

    package com.huaweicloud.sdk.test;
    
    import com.huaweicloud.sdk.core.auth.ICredential;
    import com.huaweicloud.sdk.core.auth.BasicCredentials;
    import com.huaweicloud.sdk.core.exception.ConnectionException;
    import com.huaweicloud.sdk.core.exception.RequestTimeoutException;
    import com.huaweicloud.sdk.core.exception.ServiceResponseException;
    import com.huaweicloud.sdk.kafka.v2.region.KafkaRegion;
    import com.huaweicloud.sdk.kafka.v2.*;
    import com.huaweicloud.sdk.kafka.v2.model.*;
    
    
    public class CreateConnectorTaskSolution {
    
        public static void main(String[] args) {
            // The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
            // In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
            String ak = System.getenv("CLOUD_SDK_AK");
            String sk = System.getenv("CLOUD_SDK_SK");
            String projectId = "{project_id}";
    
            ICredential auth = new BasicCredentials()
                    .withProjectId(projectId)
                    .withAk(ak)
                    .withSk(sk);
    
            KafkaClient client = KafkaClient.newBuilder()
                    .withCredential(auth)
                    .withRegion(KafkaRegion.valueOf("<YOUR REGION>"))
                    .build();
            CreateConnectorTaskRequest request = new CreateConnectorTaskRequest();
            request.withInstanceId("{instance_id}");
            CreateSmartConnectTaskReq body = new CreateSmartConnectTaskReq();
            SmartConnectTaskReqSinkConfig sinkTaskbody = new SmartConnectTaskReqSinkConfig();
            sinkTaskbody.withConsumerStrategy("earliest")
                .withDestinationFileType("TEXT")
                .withDeliverTimeInterval(300)
                .withAccessKey("********")
                .withSecretKey("********")
                .withObsBucketName("obs_bucket")
                .withObsPath("obsTransfer-1810125534")
                .withPartitionFormat("yyyy/MM/dd/HH/mm")
                .withRecordDelimiter("\n")
                .withStoreKeys(false);
            body.withSinkTask(sinkTaskbody);
            body.withSinkType(CreateSmartConnectTaskReq.SinkTypeEnum.fromValue("OBS_SINK"));
            body.withSourceType(CreateSmartConnectTaskReq.SourceTypeEnum.fromValue("NONE"));
            body.withTopicsRegex("topic-obs*");
            body.withStartLater(false);
            body.withTaskName("smart-connect-1");
            request.withBody(body);
            try {
                CreateConnectorTaskResponse response = client.createConnectorTask(request);
                System.out.println(response.toString());
            } catch (ConnectionException e) {
                e.printStackTrace();
            } catch (RequestTimeoutException e) {
                e.printStackTrace();
            } catch (ServiceResponseException e) {
                e.printStackTrace();
                System.out.println(e.getHttpStatusCode());
                System.out.println(e.getRequestId());
                System.out.println(e.getErrorCode());
                System.out.println(e.getErrorMsg());
            }
        }
    }
    
  • Creating a Kafka data replication task that starts later.

    package com.huaweicloud.sdk.test;
    
    import com.huaweicloud.sdk.core.auth.ICredential;
    import com.huaweicloud.sdk.core.auth.BasicCredentials;
    import com.huaweicloud.sdk.core.exception.ConnectionException;
    import com.huaweicloud.sdk.core.exception.RequestTimeoutException;
    import com.huaweicloud.sdk.core.exception.ServiceResponseException;
    import com.huaweicloud.sdk.kafka.v2.region.KafkaRegion;
    import com.huaweicloud.sdk.kafka.v2.*;
    import com.huaweicloud.sdk.kafka.v2.model.*;
    
    
    public class CreateConnectorTaskSolution {
    
        public static void main(String[] args) {
            // The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
            // In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
            String ak = System.getenv("CLOUD_SDK_AK");
            String sk = System.getenv("CLOUD_SDK_SK");
            String projectId = "{project_id}";
    
            ICredential auth = new BasicCredentials()
                    .withProjectId(projectId)
                    .withAk(ak)
                    .withSk(sk);
    
            KafkaClient client = KafkaClient.newBuilder()
                    .withCredential(auth)
                    .withRegion(KafkaRegion.valueOf("<YOUR REGION>"))
                    .build();
            CreateConnectorTaskRequest request = new CreateConnectorTaskRequest();
            request.withInstanceId("{instance_id}");
            CreateSmartConnectTaskReq body = new CreateSmartConnectTaskReq();
            SmartConnectTaskReqSourceConfig sourceTaskbody = new SmartConnectTaskReqSourceConfig();
            sourceTaskbody.withCurrentClusterName("A")
                .withClusterName("B")
                .withUserName("user1")
                .withPassword("********")
                .withSaslMechanism("SCRAM-SHA-512")
                .withInstanceId("b54c9dd8-********-********")
                .withDirection("two-way")
                .withSyncConsumerOffsetsEnabled(false)
                .withReplicationFactor(3)
                .withTaskNum(2)
                .withRenameTopicEnabled(false)
                .withProvenanceHeaderEnabled(true)
                .withConsumerStrategy("latest")
                .withCompressionType("snappy")
                .withTopicsMapping("topic-sc-1:topic-sc-3,topic-sc-2:topic-sc-4");
            body.withSourceTask(sourceTaskbody);
            body.withSourceType(CreateSmartConnectTaskReq.SourceTypeEnum.fromValue("KAFKA_REPLICATOR_SOURCE"));
            body.withStartLater(true);
            body.withTaskName("smart-connect-2");
            request.withBody(body);
            try {
                CreateConnectorTaskResponse response = client.createConnectorTask(request);
                System.out.println(response.toString());
            } catch (ConnectionException e) {
                e.printStackTrace();
            } catch (RequestTimeoutException e) {
                e.printStackTrace();
            } catch (ServiceResponseException e) {
                e.printStackTrace();
                System.out.println(e.getHttpStatusCode());
                System.out.println(e.getRequestId());
                System.out.println(e.getErrorCode());
                System.out.println(e.getErrorMsg());
            }
        }
    }
    
  • Creating a dumping task that starts immediately.

    # coding: utf-8
    
    import os
    from huaweicloudsdkcore.auth.credentials import BasicCredentials
    from huaweicloudsdkkafka.v2.region.kafka_region import KafkaRegion
    from huaweicloudsdkcore.exceptions import exceptions
    from huaweicloudsdkkafka.v2 import *
    
    if __name__ == "__main__":
        # The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
        # In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
        ak = os.environ["CLOUD_SDK_AK"]
        sk = os.environ["CLOUD_SDK_SK"]
        projectId = "{project_id}"
    
        credentials = BasicCredentials(ak, sk, projectId)
    
        client = KafkaClient.new_builder() \
            .with_credentials(credentials) \
            .with_region(KafkaRegion.value_of("<YOUR REGION>")) \
            .build()
    
        try:
            request = CreateConnectorTaskRequest()
            request.instance_id = "{instance_id}"
            sinkTaskbody = SmartConnectTaskReqSinkConfig(
                consumer_strategy="earliest",
                destination_file_type="TEXT",
                deliver_time_interval=300,
                access_key="********",
                secret_key="********",
                obs_bucket_name="obs_bucket",
                obs_path="obsTransfer-1810125534",
                partition_format="yyyy/MM/dd/HH/mm",
                record_delimiter="\n",
                store_keys=False
            )
            request.body = CreateSmartConnectTaskReq(
                sink_task=sinkTaskbody,
                sink_type="OBS_SINK",
                source_type="NONE",
                topics_regex="topic-obs*",
                start_later=False,
                task_name="smart-connect-1"
            )
            response = client.create_connector_task(request)
            print(response)
        except exceptions.ClientRequestException as e:
            print(e.status_code)
            print(e.request_id)
            print(e.error_code)
            print(e.error_msg)
    
  • Creating a Kafka data replication task that starts later.

    # coding: utf-8
    
    import os
    from huaweicloudsdkcore.auth.credentials import BasicCredentials
    from huaweicloudsdkkafka.v2.region.kafka_region import KafkaRegion
    from huaweicloudsdkcore.exceptions import exceptions
    from huaweicloudsdkkafka.v2 import *
    
    if __name__ == "__main__":
        # The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
        # In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
        ak = os.environ["CLOUD_SDK_AK"]
        sk = os.environ["CLOUD_SDK_SK"]
        projectId = "{project_id}"
    
        credentials = BasicCredentials(ak, sk, projectId)
    
        client = KafkaClient.new_builder() \
            .with_credentials(credentials) \
            .with_region(KafkaRegion.value_of("<YOUR REGION>")) \
            .build()
    
        try:
            request = CreateConnectorTaskRequest()
            request.instance_id = "{instance_id}"
            sourceTaskbody = SmartConnectTaskReqSourceConfig(
                current_cluster_name="A",
                cluster_name="B",
                user_name="user1",
                password="********",
                sasl_mechanism="SCRAM-SHA-512",
                instance_id="b54c9dd8-********-********",
                direction="two-way",
                sync_consumer_offsets_enabled=False,
                replication_factor=3,
                task_num=2,
                rename_topic_enabled=False,
                provenance_header_enabled=True,
                consumer_strategy="latest",
                compression_type="snappy",
                topics_mapping="topic-sc-1:topic-sc-3,topic-sc-2:topic-sc-4"
            )
            request.body = CreateSmartConnectTaskReq(
                source_task=sourceTaskbody,
                source_type="KAFKA_REPLICATOR_SOURCE",
                start_later=True,
                task_name="smart-connect-2"
            )
            response = client.create_connector_task(request)
            print(response)
        except exceptions.ClientRequestException as e:
            print(e.status_code)
            print(e.request_id)
            print(e.error_code)
            print(e.error_msg)
    
  • Creating a dumping task that starts immediately.

    package main
    
    import (
    	"fmt"
    	"os"

    	"github.com/huaweicloud/huaweicloud-sdk-go-v3/core/auth/basic"
    	kafka "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2"
    	"github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/model"
    	region "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/region"
    )
    
    func main() {
        // The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
        // In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
        ak := os.Getenv("CLOUD_SDK_AK")
        sk := os.Getenv("CLOUD_SDK_SK")
        projectId := "{project_id}"
    
        auth := basic.NewCredentialsBuilder().
            WithAk(ak).
            WithSk(sk).
            WithProjectId(projectId).
            Build()
    
        client := kafka.NewKafkaClient(
            kafka.KafkaClientBuilder().
                WithRegion(region.ValueOf("<YOUR REGION>")).
                WithCredential(auth).
                Build())
    
        request := &model.CreateConnectorTaskRequest{}
    	request.InstanceId = "{instance_id}"
    	consumerStrategySinkTask:= "earliest"
    	destinationFileTypeSinkTask:= "TEXT"
    	deliverTimeIntervalSinkTask:= int32(300)
    	accessKeySinkTask:= "********"
    	secretKeySinkTask:= "********"
    	obsBucketNameSinkTask:= "obs_bucket"
    	obsPathSinkTask:= "obsTransfer-1810125534"
    	partitionFormatSinkTask:= "yyyy/MM/dd/HH/mm"
    	recordDelimiterSinkTask:= "\n"
    	storeKeysSinkTask:= false
    	sinkTaskbody := &model.SmartConnectTaskReqSinkConfig{
    		ConsumerStrategy: &consumerStrategySinkTask,
    		DestinationFileType: &destinationFileTypeSinkTask,
    		DeliverTimeInterval: &deliverTimeIntervalSinkTask,
    		AccessKey: &accessKeySinkTask,
    		SecretKey: &secretKeySinkTask,
    		ObsBucketName: &obsBucketNameSinkTask,
    		ObsPath: &obsPathSinkTask,
    		PartitionFormat: &partitionFormatSinkTask,
    		RecordDelimiter: &recordDelimiterSinkTask,
    		StoreKeys: &storeKeysSinkTask,
    	}
    	sinkTypeCreateSmartConnectTaskReq:= model.GetCreateSmartConnectTaskReqSinkTypeEnum().OBS_SINK
    	sourceTypeCreateSmartConnectTaskReq:= model.GetCreateSmartConnectTaskReqSourceTypeEnum().NONE
    	topicsRegexCreateSmartConnectTaskReq:= "topic-obs*"
    	startLaterCreateSmartConnectTaskReq:= false
    	taskNameCreateSmartConnectTaskReq:= "smart-connect-1"
    	request.Body = &model.CreateSmartConnectTaskReq{
    		SinkTask: sinkTaskbody,
    		SinkType: &sinkTypeCreateSmartConnectTaskReq,
    		SourceType: &sourceTypeCreateSmartConnectTaskReq,
    		TopicsRegex: &topicsRegexCreateSmartConnectTaskReq,
    		StartLater: &startLaterCreateSmartConnectTaskReq,
    		TaskName: &taskNameCreateSmartConnectTaskReq,
    	}
    	response, err := client.CreateConnectorTask(request)
    	if err == nil {
            fmt.Printf("%+v\n", response)
        } else {
            fmt.Println(err)
        }
    }
    
  • Creating a Kafka data replication task that starts later.

    package main
    
    import (
    	"fmt"
    	"os"

    	"github.com/huaweicloud/huaweicloud-sdk-go-v3/core/auth/basic"
    	kafka "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2"
    	"github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/model"
    	region "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/region"
    )
    
    func main() {
        // The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
        // In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
        ak := os.Getenv("CLOUD_SDK_AK")
        sk := os.Getenv("CLOUD_SDK_SK")
        projectId := "{project_id}"
    
        auth := basic.NewCredentialsBuilder().
            WithAk(ak).
            WithSk(sk).
            WithProjectId(projectId).
            Build()
    
        client := kafka.NewKafkaClient(
            kafka.KafkaClientBuilder().
                WithRegion(region.ValueOf("<YOUR REGION>")).
                WithCredential(auth).
                Build())
    
        request := &model.CreateConnectorTaskRequest{}
    	request.InstanceId = "{instance_id}"
    	currentClusterNameSourceTask:= "A"
    	clusterNameSourceTask:= "B"
    	userNameSourceTask:= "user1"
    	passwordSourceTask:= "********"
    	saslMechanismSourceTask:= "SCRAM-SHA-512"
    	instanceIdSourceTask:= "b54c9dd8-********-********"
    	directionSourceTask:= "two-way"
    	syncConsumerOffsetsEnabledSourceTask:= false
    	replicationFactorSourceTask:= int32(3)
    	taskNumSourceTask:= int32(2)
    	renameTopicEnabledSourceTask:= false
    	provenanceHeaderEnabledSourceTask:= true
    	consumerStrategySourceTask:= "latest"
    	compressionTypeSourceTask:= "snappy"
    	topicsMappingSourceTask:= "topic-sc-1:topic-sc-3,topic-sc-2:topic-sc-4"
    	sourceTaskbody := &model.SmartConnectTaskReqSourceConfig{
    		CurrentClusterName: &currentClusterNameSourceTask,
    		ClusterName: &clusterNameSourceTask,
    		UserName: &userNameSourceTask,
    		Password: &passwordSourceTask,
    		SaslMechanism: &saslMechanismSourceTask,
    		InstanceId: &instanceIdSourceTask,
    		Direction: &directionSourceTask,
    		SyncConsumerOffsetsEnabled: &syncConsumerOffsetsEnabledSourceTask,
    		ReplicationFactor: &replicationFactorSourceTask,
    		TaskNum: &taskNumSourceTask,
    		RenameTopicEnabled: &renameTopicEnabledSourceTask,
    		ProvenanceHeaderEnabled: &provenanceHeaderEnabledSourceTask,
    		ConsumerStrategy: &consumerStrategySourceTask,
    		CompressionType: &compressionTypeSourceTask,
    		TopicsMapping: &topicsMappingSourceTask,
    	}
    	sourceTypeCreateSmartConnectTaskReq:= model.GetCreateSmartConnectTaskReqSourceTypeEnum().KAFKA_REPLICATOR_SOURCE
    	startLaterCreateSmartConnectTaskReq:= true
    	taskNameCreateSmartConnectTaskReq:= "smart-connect-2"
    	request.Body = &model.CreateSmartConnectTaskReq{
    		SourceTask: sourceTaskbody,
    		SourceType: &sourceTypeCreateSmartConnectTaskReq,
    		StartLater: &startLaterCreateSmartConnectTaskReq,
    		TaskName: &taskNameCreateSmartConnectTaskReq,
    	}
    	response, err := client.CreateConnectorTask(request)
    	if err == nil {
            fmt.Printf("%+v\n", response)
        } else {
            fmt.Println(err)
        }
    }
    

For SDK sample code of more programming languages, see the Sample Code tab in API Explorer. SDK sample code can be automatically generated.

Status Codes

  • 200: Successful.

Error Codes

See Error Codes.