Querying Topics of a Kafka Instance
Function
This API is used to query details about the topics of a specified Kafka instance.
Calling Method
For details, see How to Call an API.
URI
GET /v2/{project_id}/instances/{instance_id}/topics
Path parameters

| Parameter | Mandatory | Type | Description |
| --- | --- | --- | --- |
| project_id | Yes | String | Project ID. For details about how to obtain it, see Obtaining a Project ID. |
| instance_id | Yes | String | Instance ID. |
Query parameters

| Parameter | Mandatory | Type | Description |
| --- | --- | --- | --- |
| offset | No | String | Offset. Records after this offset are queried. The value must be greater than or equal to 0. |
| limit | No | String | Maximum number of topics returned in a single query. Default value: 10. Value range: 1-50. |
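As a non-authoritative illustration of these constraints, the following Python sketch (the helper name is hypothetical) builds the query string while enforcing the documented offset and limit ranges:

```python
from urllib.parse import urlencode

def build_topics_query(offset: int = 0, limit: int = 10) -> str:
    """Build the ?offset=...&limit=... query string within the documented ranges.

    offset must be >= 0; limit defaults to 10 and must be between 1 and 50.
    """
    if offset < 0:
        raise ValueError("offset must be >= 0")
    if not 1 <= limit <= 50:
        raise ValueError("limit must be in the range 1-50")
    return urlencode({"offset": offset, "limit": limit})

# Example: build_topics_query(0, 10) -> "offset=0&limit=10"
```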
Request Parameters
None
Response Parameters
Status code: 200
| Parameter | Type | Description |
| --- | --- | --- |
| total | Integer | Total number of topics. |
| size | Integer | Page size of the query. |
| remain_partitions | Integer | Number of remaining partitions. |
| max_partitions | Integer | Total number of partitions. |
| topic_max_partitions | Integer | Maximum number of partitions a single topic can occupy. |
| topics | Array of TopicEntity objects | Topic list. |
TopicEntity

| Parameter | Type | Description |
| --- | --- | --- |
| policiesOnly | Boolean | Whether this is the default policy. |
| name | String | Topic name. |
| replication | Integer | Number of replicas, which is configured to ensure data reliability. |
| partition | Integer | Number of topic partitions, which determines the consumption concurrency. |
| retention_time | Integer | Message retention period (aging time). |
| sync_replication | Boolean | Whether synchronous replication is enabled. If it is enabled, the client must also set acks to -1 when producing messages; otherwise this setting does not take effect. Disabled by default. |
| sync_message_flush | Boolean | Whether synchronous flushing is used. Default value: false. Synchronous flushing degrades performance. |
| external_configs | Object | Extended configurations. |
| topic_type | Integer | Topic type. 0: common topic; 1: system (internal) topic. |
| topic_other_configs | Array of topic_other_configs objects | Other topic configurations. |
| topic_desc | String | Topic description. |
| created_at | Long | Topic creation time. |
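As a rough, non-authoritative sketch of how the structure above can be consumed in client code, the following Python snippet (class and function names are hypothetical, and only the fields listed in this table are mapped) models one element of the topics array:

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List, Optional

@dataclass
class TopicEntry:
    """Plain-Python view of one element of the "topics" array described above."""
    name: str
    replication: int
    partition: int
    retention_time: int
    sync_replication: bool
    sync_message_flush: bool
    topic_type: int
    policiesOnly: bool = False
    external_configs: Dict[str, Any] = field(default_factory=dict)
    topic_other_configs: List[Dict[str, Any]] = field(default_factory=list)
    topic_desc: Optional[str] = None
    created_at: Optional[int] = None  # epoch milliseconds in the example response

def parse_topics(body: Dict[str, Any]) -> List[TopicEntry]:
    """Extract the "topics" array from a decoded 200 response body."""
    known = TopicEntry.__dataclass_fields__
    return [TopicEntry(**{k: v for k, v in t.items() if k in known})
            for t in body.get("topics", [])]
```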
Example Requests
Querying the topic list.
GET https://{endpoint}/v2/{project_id}/instances/{instance_id}/topics?offset=0&limit=10
Example Responses
Status code: 200
The query is successful.
{ "total" : 1, "size" : 0, "topics" : [ { "policiesOnly" : false, "name" : "Topic-test01", "replication" : 3, "partition" : 3, "retention_time" : 72, "sync_replication" : "false", "sync_message_flush" : "false", "topic_other_configs" : [ { "name" : "max.message.bytes", "valid_values" : "[0...10485760]", "default_value" : "10485760", "config_type" : "dynamic", "value" : "10485760", "value_type" : "int" }, { "name" : "message.timestamp.type", "valid_values" : "[CreateTime, LogAppendTime]", "default_value" : "LogAppendTime", "config_type" : "dynamic", "value" : "LogAppendTime", "value_type" : "string" } ], "external_configs" : { }, "topic_type" : 0, "topic_desc" : "This is a test topic", "created_at" : 1688112779916 } ], "remain_partitions" : 294, "max_partitions" : 300, "topic_max_partitions" : 200 }
SDK Sample Code
The SDK sample code is as follows.
Java
```java
package com.huaweicloud.sdk.test;

import com.huaweicloud.sdk.core.auth.ICredential;
import com.huaweicloud.sdk.core.auth.BasicCredentials;
import com.huaweicloud.sdk.core.exception.ConnectionException;
import com.huaweicloud.sdk.core.exception.RequestTimeoutException;
import com.huaweicloud.sdk.core.exception.ServiceResponseException;
import com.huaweicloud.sdk.kafka.v2.region.KafkaRegion;
import com.huaweicloud.sdk.kafka.v2.*;
import com.huaweicloud.sdk.kafka.v2.model.*;

public class ListInstanceTopicsSolution {

    public static void main(String[] args) {
        // The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks.
        // It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
        // In this example, AK and SK are stored in environment variables for authentication.
        // Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment.
        String ak = System.getenv("CLOUD_SDK_AK");
        String sk = System.getenv("CLOUD_SDK_SK");
        String projectId = "{project_id}";

        ICredential auth = new BasicCredentials()
                .withProjectId(projectId)
                .withAk(ak)
                .withSk(sk);

        KafkaClient client = KafkaClient.newBuilder()
                .withCredential(auth)
                .withRegion(KafkaRegion.valueOf("<YOUR REGION>"))
                .build();
        ListInstanceTopicsRequest request = new ListInstanceTopicsRequest();
        request.withInstanceId("{instance_id}");
        try {
            ListInstanceTopicsResponse response = client.listInstanceTopics(request);
            System.out.println(response.toString());
        } catch (ConnectionException e) {
            e.printStackTrace();
        } catch (RequestTimeoutException e) {
            e.printStackTrace();
        } catch (ServiceResponseException e) {
            e.printStackTrace();
            System.out.println(e.getHttpStatusCode());
            System.out.println(e.getRequestId());
            System.out.println(e.getErrorCode());
            System.out.println(e.getErrorMsg());
        }
    }
}
```
Python
```python
# coding: utf-8

import os
from huaweicloudsdkcore.auth.credentials import BasicCredentials
from huaweicloudsdkkafka.v2.region.kafka_region import KafkaRegion
from huaweicloudsdkcore.exceptions import exceptions
from huaweicloudsdkkafka.v2 import *

if __name__ == "__main__":
    # The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks.
    # It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
    # In this example, AK and SK are stored in environment variables for authentication.
    # Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment.
    ak = os.environ["CLOUD_SDK_AK"]
    sk = os.environ["CLOUD_SDK_SK"]
    projectId = "{project_id}"

    credentials = BasicCredentials(ak, sk, projectId)

    client = KafkaClient.new_builder() \
        .with_credentials(credentials) \
        .with_region(KafkaRegion.value_of("<YOUR REGION>")) \
        .build()

    try:
        request = ListInstanceTopicsRequest()
        request.instance_id = "{instance_id}"
        response = client.list_instance_topics(request)
        print(response)
    except exceptions.ClientRequestException as e:
        print(e.status_code)
        print(e.request_id)
        print(e.error_code)
        print(e.error_msg)
```
Go
```go
package main

import (
	"fmt"
	"os"

	"github.com/huaweicloud/huaweicloud-sdk-go-v3/core/auth/basic"
	kafka "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2"
	"github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/model"
	region "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/region"
)

func main() {
	// The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks.
	// It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
	// In this example, AK and SK are stored in environment variables for authentication.
	// Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment.
	ak := os.Getenv("CLOUD_SDK_AK")
	sk := os.Getenv("CLOUD_SDK_SK")
	projectId := "{project_id}"

	auth := basic.NewCredentialsBuilder().
		WithAk(ak).
		WithSk(sk).
		WithProjectId(projectId).
		Build()

	client := kafka.NewKafkaClient(
		kafka.KafkaClientBuilder().
			WithRegion(region.ValueOf("<YOUR REGION>")).
			WithCredential(auth).
			Build())

	request := &model.ListInstanceTopicsRequest{}
	request.InstanceId = "{instance_id}"
	response, err := client.ListInstanceTopics(request)
	if err == nil {
		fmt.Printf("%+v\n", response)
	} else {
		fmt.Println(err)
	}
}
```
For SDK sample code in more programming languages, see the Sample Code tab of API Explorer, where SDK sample code can be generated automatically.
Status Codes

| Status Code | Description |
| --- | --- |
| 200 | The query is successful. |
Error Codes
For details, see Error Codes.