Updated on 2025-06-19 GMT+08:00

Modifying the Smart Connect Task Configuration

Function

This API is used to modify the Smart Connect task configuration.

Calling Method

For details, see Calling APIs.

URI

PUT /v2/{project_id}/instances/{instance_id}/connector/tasks/{task_id}

Table 1 Path Parameters

Parameter

Mandatory

Type

Description

project_id

Yes

String

Definition:

Project ID. For details, see Obtaining a Project ID.

Constraints:

N/A

Range:

N/A

Default Value:

N/A

instance_id

Yes

String

Definition:

Instance ID. To obtain it, log in to the Kafka console and find the instance ID on the Kafka instance details page.

Constraints:

N/A

Range:

N/A

Default Value:

N/A

task_id

Yes

String

Definition:

Task ID. To obtain it, log in to the Kafka console and search for the task ID on the Smart Connect page.

Constraints:

N/A

Range:

N/A

Default Value:

N/A
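The three path parameters above are substituted directly into the URI. A minimal sketch in Python (the endpoint host and parameter values are placeholders, not taken from this document):

```python
# Build the request URL for modifying a Smart Connect task.
# "dms.example.com" is a placeholder endpoint; use your region's DMS endpoint.

def build_task_url(endpoint: str, project_id: str, instance_id: str, task_id: str) -> str:
    """Substitute the Table 1 path parameters into the URI."""
    return (f"https://{endpoint}/v2/{project_id}"
            f"/instances/{instance_id}/connector/tasks/{task_id}")

print(build_task_url("dms.example.com", "my-project-id", "my-instance-id", "my-task-id"))
```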

Request Parameters

Table 2 Request body parameters

Parameter

Mandatory

Type

Description

task_name

No

String

Definition:

Smart Connect task name.

Range:

N/A

topics

No

String

Definition:

Topic of a Smart Connect task.

Range:

N/A

topics_regex

No

String

Definition:

Topic regular expression of a Smart Connect task.

Range:

N/A

source_type

No

String

Definition:

Source type of a Smart Connect task.

Range:

  • NONE: none

  • KAFKA_REPLICATOR_SOURCE: Kafka data replication

source_task

No

SmartConnectTaskRespSourceConfig object

Definition:

Source configuration of a Smart Connect task.

sink_type

No

String

Definition:

Target type of a Smart Connect task.

Range:

  • NONE: none

  • OBS_SINK: dump

sink_task

No

SmartConnectTaskRespSinkConfig object

Definition:

Target configuration of a Smart Connect task.

id

No

String

Definition:

ID of a Smart Connect task.

Range:

N/A

status

No

String

Definition:

Status of a Smart Connect task.

Range:

N/A

create_time

No

Long

Definition:

Time when a Smart Connect task was created.

Range:

N/A

Table 3 SmartConnectTaskRespSourceConfig

Parameter

Mandatory

Type

Description

current_cluster_name

No

String

Definition:

Current Kafka instance name. (Displayed only when the source type is Kafka.)

Range:

N/A

cluster_name

No

String

Definition:

Target Kafka instance name. (Displayed only when the source type is Kafka.)

Range:

N/A

user_name

No

String

Definition:

Username of a target Kafka instance. (Displayed only when the source type is Kafka.)

Range:

N/A

sasl_mechanism

No

String

Definition:

Target Kafka SASL authentication mechanism. (Displayed only when the source type is Kafka.)

Range:

  • PLAIN

  • SCRAM-SHA-512

instance_id

No

String

Definition:

Target Kafka instance ID. (Displayed only when the source type is Kafka.)

Range:

N/A

bootstrap_servers

No

String

Definition:

Target Kafka instance address. (Displayed only when the source type is Kafka.)

Range:

N/A

security_protocol

No

String

Definition:

Target Kafka security protocol. (Displayed only when the source type is Kafka.)

Range:

  • PLAINTEXT: SSL is disabled and data is transmitted in plaintext.

  • SASL_SSL: SASL is used for authentication. Data is encrypted with an SSL certificate for high-security transmission.

  • SASL_PLAINTEXT: SASL is used for authentication. Data is transmitted in plaintext for better performance.

direction

No

String

Definition:

Synchronization direction. (Displayed only when the source type is Kafka.)

Range:

  • pull

  • push

  • two-way

sync_consumer_offsets_enabled

No

Boolean

Definition:

Whether to synchronize the consumer offset. (Displayed only when the source type is Kafka.)

Range:

  • true: Yes

  • false: No

replication_factor

No

Integer

Definition:

Number of replicas. (Displayed only when the source type is Kafka.)

Range:

N/A

task_num

No

Integer

Definition:

Number of tasks. (Displayed only when the source type is Kafka.)

Range:

N/A

rename_topic_enabled

No

Boolean

Definition:

Whether to rename a topic. (Displayed only when the source type is Kafka.)

Range:

  • true: Yes

  • false: No

provenance_header_enabled

No

Boolean

Definition:

Whether to add the source header. (Displayed only when the source type is Kafka.)

Range:

  • true: Yes

  • false: No

consumer_strategy

No

String

Definition:

Start offset. (Displayed only when the source type is Kafka.)

Range:

  • latest: obtains the latest data.

  • earliest: obtains the earliest data.

compression_type

No

String

Definition:

Compression algorithm. (Displayed only when the source type is Kafka.)

Range:

  • none

  • gzip

  • snappy

  • lz4

  • zstd

topics_mapping

No

String

Definition:

Topic mapping. (Displayed only when the source type is Kafka.)

Range:

N/A

Table 4 SmartConnectTaskRespSinkConfig

Parameter

Mandatory

Type

Description

consumer_strategy

No

String

Definition:

Dump start offset. (Displayed only when the target type is OBS.)

Range:

  • latest: obtains the latest data.

  • earliest: obtains the earliest data.

destination_file_type

No

String

Definition:

Dump file format. Only TEXT is supported. (Displayed only when the target type is OBS.)

Range:

N/A

deliver_time_interval

No

Integer

Definition:

Dumping period (s). (Displayed only when the target type is OBS.)

Range:

N/A

obs_bucket_name

No

String

Definition:

Name of the OBS bucket that data is dumped to. (Displayed only when the target type is OBS.)

Range:

N/A

obs_path

No

String

Definition:

Dump directory. (Displayed only when the target type is OBS.)

Range:

N/A

partition_format

No

String

Definition:

Time directory format. (Displayed only when the target type is OBS.)

Range:

N/A

record_delimiter

No

String

Definition:

Line break. (Displayed only when the target type is OBS.)

Range:

N/A

store_keys

No

Boolean

Definition:

Whether to store message keys. (Displayed only when the target type is OBS.)

Range:

  • true: Yes

  • false: No

obs_part_size

No

Integer

Definition:

Size (in bytes) of each file to be uploaded. The default value is 5242880. (Displayed only when the target type is OBS.)

Range:

N/A

flush_size

No

Integer

Definition:

Size of flushed data. (Displayed only when the target type is OBS.)

Range:

N/A

timezone

No

String

Definition:

Time zone. (Displayed only when the target type is OBS.)

Range:

N/A

schema_generator_class

No

String

Definition:

schema_generator class. The default value is io.confluent.connect.storage.hive.schema.DefaultSchemaGenerator. (Displayed only when the target type is OBS.)

Range:

N/A

partitioner_class

No

String

Definition:

partitioner class. The default value is io.confluent.connect.storage.partitioner.TimeBasedPartitioner. (Displayed only when the target type is OBS.)

Range:

N/A

value_converter

No

String

Definition:

Value converter. The default value is org.apache.kafka.connect.converters.ByteArrayConverter. (Displayed only when the target type is OBS.)

Range:

N/A

key_converter

No

String

Definition:

Key converter. The default value is org.apache.kafka.connect.converters.ByteArrayConverter. (Displayed only when the target type is OBS.)

Range:

N/A

kv_delimiter

No

String

Definition:

Key-value delimiter. The default value is :. (Displayed only when the target type is OBS.)

Range:

N/A
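For illustration only, an OBS dump configuration assembled from the Table 2 and Table 4 field names might look like the fragment below. All values are hypothetical; which field combinations the API accepts is not specified here.

```json
{
  "sink_type" : "OBS_SINK",
  "sink_task" : {
    "consumer_strategy" : "latest",
    "destination_file_type" : "TEXT",
    "deliver_time_interval" : 300,
    "obs_bucket_name" : "my-obs-bucket",
    "obs_path" : "kafka-dump",
    "record_delimiter" : "\n",
    "store_keys" : false
  }
}
```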

Response Parameters

Status code: 200

Table 5 Response body parameters

Parameter

Type

Description

task_name

String

Definition:

Smart Connect task name.

Range:

N/A

topics

String

Definition:

Topic of a Smart Connect task.

Range:

N/A

topics_regex

String

Definition:

Topic regular expression of a Smart Connect task.

Range:

N/A

source_type

String

Definition:

Source type of a Smart Connect task.

Range:

  • NONE: none

  • KAFKA_REPLICATOR_SOURCE: Kafka data replication

source_task

SmartConnectTaskRespSourceConfig object

Definition:

Source configuration of a Smart Connect task.

sink_type

String

Definition:

Target type of a Smart Connect task.

Range:

  • NONE: none

  • OBS_SINK: dump

sink_task

SmartConnectTaskRespSinkConfig object

Definition:

Target configuration of a Smart Connect task.

id

String

Definition:

ID of a Smart Connect task.

Range:

N/A

status

String

Definition:

Status of a Smart Connect task.

Range:

N/A

create_time

Long

Definition:

Time when a Smart Connect task was created.

Range:

N/A

Table 6 SmartConnectTaskRespSourceConfig

Parameter

Type

Description

current_cluster_name

String

Definition:

Current Kafka instance name. (Displayed only when the source type is Kafka.)

Range:

N/A

cluster_name

String

Definition:

Target Kafka instance name. (Displayed only when the source type is Kafka.)

Range:

N/A

user_name

String

Definition:

Username of a target Kafka instance. (Displayed only when the source type is Kafka.)

Range:

N/A

sasl_mechanism

String

Definition:

Target Kafka SASL authentication mechanism. (Displayed only when the source type is Kafka.)

Range:

  • PLAIN

  • SCRAM-SHA-512

instance_id

String

Definition:

Target Kafka instance ID. (Displayed only when the source type is Kafka.)

Range:

N/A

bootstrap_servers

String

Definition:

Target Kafka instance address. (Displayed only when the source type is Kafka.)

Range:

N/A

security_protocol

String

Definition:

Target Kafka security protocol. (Displayed only when the source type is Kafka.)

Range:

  • PLAINTEXT: SSL is disabled and data is transmitted in plaintext.

  • SASL_SSL: SASL is used for authentication. Data is encrypted with an SSL certificate for high-security transmission.

  • SASL_PLAINTEXT: SASL is used for authentication. Data is transmitted in plaintext for better performance.

direction

String

Definition:

Synchronization direction. (Displayed only when the source type is Kafka.)

Range:

  • pull

  • push

  • two-way

sync_consumer_offsets_enabled

Boolean

Definition:

Whether to synchronize the consumer offset. (Displayed only when the source type is Kafka.)

Range:

  • true: Yes

  • false: No

replication_factor

Integer

Definition:

Number of replicas. (Displayed only when the source type is Kafka.)

Range:

N/A

task_num

Integer

Definition:

Number of tasks. (Displayed only when the source type is Kafka.)

Range:

N/A

rename_topic_enabled

Boolean

Definition:

Whether to rename a topic. (Displayed only when the source type is Kafka.)

Range:

  • true: Yes

  • false: No

provenance_header_enabled

Boolean

Definition:

Whether to add the source header. (Displayed only when the source type is Kafka.)

Range:

  • true: Yes

  • false: No

consumer_strategy

String

Definition:

Start offset. (Displayed only when the source type is Kafka.)

Range:

  • latest: obtains the latest data.

  • earliest: obtains the earliest data.

compression_type

String

Definition:

Compression algorithm. (Displayed only when the source type is Kafka.)

Range:

  • none

  • gzip

  • snappy

  • lz4

  • zstd

topics_mapping

String

Definition:

Topic mapping. (Displayed only when the source type is Kafka.)

Range:

N/A

Table 7 SmartConnectTaskRespSinkConfig

Parameter

Type

Description

consumer_strategy

String

Definition:

Dump start offset. (Displayed only when the target type is OBS.)

Range:

  • latest: obtains the latest data.

  • earliest: obtains the earliest data.

destination_file_type

String

Definition:

Dump file format. Only TEXT is supported. (Displayed only when the target type is OBS.)

Range:

N/A

deliver_time_interval

Integer

Definition:

Dumping period (s). (Displayed only when the target type is OBS.)

Range:

N/A

obs_bucket_name

String

Definition:

Name of the OBS bucket that data is dumped to. (Displayed only when the target type is OBS.)

Range:

N/A

obs_path

String

Definition:

Dump directory. (Displayed only when the target type is OBS.)

Range:

N/A

partition_format

String

Definition:

Time directory format. (Displayed only when the target type is OBS.)

Range:

N/A

record_delimiter

String

Definition:

Line break. (Displayed only when the target type is OBS.)

Range:

N/A

store_keys

Boolean

Definition:

Whether to store message keys. (Displayed only when the target type is OBS.)

Range:

  • true: Yes

  • false: No

obs_part_size

Integer

Definition:

Size (in bytes) of each file to be uploaded. The default value is 5242880. (Displayed only when the target type is OBS.)

Range:

N/A

flush_size

Integer

Definition:

Size of flushed data. (Displayed only when the target type is OBS.)

Range:

N/A

timezone

String

Definition:

Time zone. (Displayed only when the target type is OBS.)

Range:

N/A

schema_generator_class

String

Definition:

schema_generator class. The default value is io.confluent.connect.storage.hive.schema.DefaultSchemaGenerator. (Displayed only when the target type is OBS.)

Range:

N/A

partitioner_class

String

Definition:

partitioner class. The default value is io.confluent.connect.storage.partitioner.TimeBasedPartitioner. (Displayed only when the target type is OBS.)

Range:

N/A

value_converter

String

Definition:

Value converter. The default value is org.apache.kafka.connect.converters.ByteArrayConverter. (Displayed only when the target type is OBS.)

Range:

N/A

key_converter

String

Definition:

Key converter. The default value is org.apache.kafka.connect.converters.ByteArrayConverter. (Displayed only when the target type is OBS.)

Range:

N/A

kv_delimiter

String

Definition:

Key-value delimiter. The default value is :. (Displayed only when the target type is OBS.)

Range:

N/A

Example Requests

PUT https://{endpoint}/v2/{project_id}/instances/{instance_id}/connector/tasks/{task_id}
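The request body carries the Table 2 fields to be changed. A hypothetical body is shown below (field names come from Table 2; the values are illustrative, and which fields the API accepts on update is not specified here):

```json
{
  "task_name" : "smart-connect-121248117",
  "topics" : "topic-1643449744"
}
```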

Example Responses

Status code: 200

Successful

{
  "task_name" : "smart-connect-121248117",
  "topics" : "topic-1643449744",
  "source_task" : {
    "current_cluster_name" : "A",
    "cluster_name" : "B",
    "direction" : "pull",
    "bootstrap_servers" : "192.168.45.58:9092,192.168.44.1:9092,192.168.41.230:9092,192.168.43.112:9092",
    "instance_id" : "59f6d088-****-****-****-********",
    "consumer_strategy" : "earliest",
    "sync_consumer_offsets_enabled" : false,
    "rename_topic_enabled" : false,
    "provenance_header_enabled" : false,
    "security_protocol" : "PLAINTEXT",
    "sasl_mechanism" : "PLAIN",
    "user_name" : "",
    "topics_mapping" : "",
    "compression_type" : "none",
    "task_num" : 2,
    "replication_factor" : 3
  },
  "source_type" : "KAFKA_REPLICATOR_SOURCE",
  "sink_task" : null,
  "sink_type" : "NONE",
  "id" : "194917d0-****-****-****-********",
  "status" : "RUNNING",
  "create_time" : 1708427753133
}
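To show how a client might consume the response, a short Python sketch that parses a few fields from the example above. The `create_time` value appears to be epoch milliseconds, judging by the sample value; that interpretation is an assumption.

```python
import json
from datetime import datetime, timezone

# A few fields taken from the example response above.
payload = '''{
  "task_name": "smart-connect-121248117",
  "source_type": "KAFKA_REPLICATOR_SOURCE",
  "status": "RUNNING",
  "create_time": 1708427753133
}'''

task = json.loads(payload)
print(task["task_name"], task["status"])

# Assumption: create_time is epoch milliseconds (a Long per Table 5).
created = datetime.fromtimestamp(task["create_time"] / 1000, tz=timezone.utc)
print(created.isoformat())
```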

SDK Sample Code

The SDK sample code is as follows (Java, Python, and Go).

package com.huaweicloud.sdk.test;

import com.huaweicloud.sdk.core.auth.ICredential;
import com.huaweicloud.sdk.core.auth.BasicCredentials;
import com.huaweicloud.sdk.core.exception.ConnectionException;
import com.huaweicloud.sdk.core.exception.RequestTimeoutException;
import com.huaweicloud.sdk.core.exception.ServiceResponseException;
import com.huaweicloud.sdk.kafka.v2.region.KafkaRegion;
import com.huaweicloud.sdk.kafka.v2.*;
import com.huaweicloud.sdk.kafka.v2.model.*;


public class ModifyConnectorTaskSolution {

    public static void main(String[] args) {
        // Hard-coding the AK and SK or storing them in plaintext poses significant security risks. Store them in ciphertext in configuration files or environment variables and decrypt them at runtime.
        // In this example, the AK and SK are read from environment variables. Before running the sample, set CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment.
        String ak = System.getenv("CLOUD_SDK_AK");
        String sk = System.getenv("CLOUD_SDK_SK");
        String projectId = "{project_id}";

        ICredential auth = new BasicCredentials()
                .withProjectId(projectId)
                .withAk(ak)
                .withSk(sk);

        KafkaClient client = KafkaClient.newBuilder()
                .withCredential(auth)
                .withRegion(KafkaRegion.valueOf("<YOUR REGION>"))
                .build();
        ModifyConnectorTaskRequest request = new ModifyConnectorTaskRequest();
        request.withInstanceId("{instance_id}");
        request.withTaskId("{task_id}");
        SmartConnectTaskEntity body = new SmartConnectTaskEntity();
        request.withBody(body);
        try {
            ModifyConnectorTaskResponse response = client.modifyConnectorTask(request);
            System.out.println(response.toString());
        } catch (ConnectionException e) {
            e.printStackTrace();
        } catch (RequestTimeoutException e) {
            e.printStackTrace();
        } catch (ServiceResponseException e) {
            e.printStackTrace();
            System.out.println(e.getHttpStatusCode());
            System.out.println(e.getRequestId());
            System.out.println(e.getErrorCode());
            System.out.println(e.getErrorMsg());
        }
    }
}

# coding: utf-8

import os
from huaweicloudsdkcore.auth.credentials import BasicCredentials
from huaweicloudsdkkafka.v2.region.kafka_region import KafkaRegion
from huaweicloudsdkcore.exceptions import exceptions
from huaweicloudsdkkafka.v2 import *

if __name__ == "__main__":
    # Hard-coding the AK and SK or storing them in plaintext poses significant security risks. Store them in ciphertext in configuration files or environment variables and decrypt them at runtime.
    # In this example, the AK and SK are read from environment variables. Before running the sample, set CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment.
    ak = os.environ["CLOUD_SDK_AK"]
    sk = os.environ["CLOUD_SDK_SK"]
    projectId = "{project_id}"

    credentials = BasicCredentials(ak, sk, projectId)

    client = KafkaClient.new_builder() \
        .with_credentials(credentials) \
        .with_region(KafkaRegion.value_of("<YOUR REGION>")) \
        .build()

    try:
        request = ModifyConnectorTaskRequest()
        request.instance_id = "{instance_id}"
        request.task_id = "{task_id}"
        request.body = SmartConnectTaskEntity(
        )
        response = client.modify_connector_task(request)
        print(response)
    except exceptions.ClientRequestException as e:
        print(e.status_code)
        print(e.request_id)
        print(e.error_code)
        print(e.error_msg)

package main

import (
	"fmt"
	"os"

	"github.com/huaweicloud/huaweicloud-sdk-go-v3/core/auth/basic"
	kafka "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2"
	"github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/model"
	region "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/region"
)

func main() {
    // Hard-coding the AK and SK or storing them in plaintext poses significant security risks. Store them in ciphertext in configuration files or environment variables and decrypt them at runtime.
    // In this example, the AK and SK are read from environment variables. Before running the sample, set CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment.
    ak := os.Getenv("CLOUD_SDK_AK")
    sk := os.Getenv("CLOUD_SDK_SK")
    projectId := "{project_id}"

    auth := basic.NewCredentialsBuilder().
        WithAk(ak).
        WithSk(sk).
        WithProjectId(projectId).
        Build()

    client := kafka.NewKafkaClient(
        kafka.KafkaClientBuilder().
            WithRegion(region.ValueOf("<YOUR REGION>")).
            WithCredential(auth).
            Build())

    request := &model.ModifyConnectorTaskRequest{}
	request.InstanceId = "{instance_id}"
	request.TaskId = "{task_id}"
	request.Body = &model.SmartConnectTaskEntity{
	}
	response, err := client.ModifyConnectorTask(request)
	if err == nil {
        fmt.Printf("%+v\n", response)
    } else {
        fmt.Println(err)
    }
}

For SDK sample code in more programming languages, see the Sample Code tab in API Explorer, which can generate SDK sample code automatically.

Status Codes

Status Code

Description

200

Successful

Error Codes

See Error Codes.
