
Querying an Instance

Function

This API is used to query the details about a specified instance.

Calling Method

For details, see Calling APIs.

URI

GET /v2/{project_id}/instances/{instance_id}

Table 1 Path Parameters

Parameter

Mandatory

Type

Description

project_id

Yes

String

Project ID. For details about how to obtain it, see Obtaining a Project ID.

instance_id

Yes

String

Instance ID.

Request Parameters

None

Response Parameters

Status code: 200

Table 2 Response body parameters

Parameter

Type

Description

name

String

Instance name.

engine

String

Message engine.

engine_version

String

Version.

description

String

Instance description.

specification

String

Instance specifications.

storage_space

Integer

Message storage space in GB.

partition_num

String

Number of partitions in a Kafka instance.

used_storage_space

Integer

Used message storage space in GB.

dns_enable

Boolean

Indicates whether domain name access is enabled for an instance.

  • true: enabled

  • false: disabled

connect_address

String

IP address of an instance.

port

Integer

Port of an instance.

status

String

Instance status. For details, see Instance Status.

instance_id

String

Instance ID.

resource_spec_code

String

Resource specifications.

  • dms.instance.kafka.cluster.c3.mini: Kafka instance with 100 MB/s bandwidth

  • dms.instance.kafka.cluster.c3.small.2: Kafka instance with 300 MB/s bandwidth

  • dms.instance.kafka.cluster.c3.middle.2: Kafka instance with 600 MB/s bandwidth

  • dms.instance.kafka.cluster.c3.high.2: Kafka instance with 1200 MB/s bandwidth

charging_mode

Integer

Billing mode. Options: 1: pay-per-use; 0: yearly/monthly.

vpc_id

String

VPC ID.

vpc_name

String

VPC name.

created_at

String

Time when the instance was created.

The value is a Unix timestamp in milliseconds, that is, the number of milliseconds elapsed since 1970-01-01 00:00:00 UTC.

subnet_name

String

Subnet name.

subnet_cidr

String

Subnet CIDR block.

user_id

String

User ID.

user_name

String

Username.

access_user

String

Username for accessing the instance.

order_id

String

Order ID. This parameter has a value only when the billing mode is yearly/monthly.

maintain_begin

String

Time at which the maintenance time window starts. The format is HH:mm:ss.

maintain_end

String

Time at which the maintenance time window ends. The format is HH:mm:ss.

enable_publicip

Boolean

Whether public access is enabled for the instance.

  • true: enabled

  • false: disabled

management_connect_address

String

Connection address of Kafka Manager of the Kafka instance.

ssl_enable

Boolean

Whether security authentication is enabled.

  • true: enabled

  • false: disabled

broker_ssl_enable

Boolean

Indicates whether encrypted replica transmission among brokers is enabled.

  • true: enabled

  • false: disabled

kafka_security_protocol

String

Security protocol used by Kafka.

If port_protocols is returned in the instance details, kafka_security_protocol is the security protocol used for private access, public access, and cross-VPC access.

Otherwise, kafka_security_protocol applies only to cross-VPC access; for the security protocol used for private and public network access, see port_protocols. (A sketch of mapping these security fields to Kafka client settings follows Table 2.)

  • PLAINTEXT: The SSL certificate is not used for encrypted transmission, and username-password authentication is unavailable. This mode has higher performance but lower security. You are advised not to use this mode for public network access in the production environment.

  • SASL_SSL: Data is encrypted with SSL certificates for high-security transmission.

  • SASL_PLAINTEXT: Data is transmitted in plaintext with username and password authentication. This protocol uses the SCRAM-SHA-512 mechanism and delivers high performance.

sasl_enabled_mechanisms

Array of strings

Authentication mechanism used after SASL is enabled.

  • PLAIN: simple username and password verification.

  • SCRAM-SHA-512: user credential verification, which is more secure than PLAIN.

ssl_two_way_enable

Boolean

Indicates whether to enable two-way authentication.

cert_replaced

Boolean

Whether the certificate can be replaced.

public_management_connect_address

String

Address for accessing Kafka Manager over public networks.

enterprise_project_id

String

Enterprise project ID.

is_logical_volume

Boolean

Whether the instance is a new instance. This parameter is used to distinguish old instances from new instances during instance capacity expansion.

  • true: New instance, which allows dynamic disk capacity expansion without restarting the instance.

  • false: Old instance.

extend_times

Integer

Number of disk expansion times. If the value exceeds 20, disk expansion is no longer allowed.

enable_auto_topic

Boolean

Whether automatic topic creation is enabled.

  • true: enabled

  • false: disabled

type

String

Instance type. The value can be cluster.

product_id

String

Product ID.

security_group_id

String

Security group ID.

security_group_name

String

Security group name.

subnet_id

String

Subnet ID.

available_zones

Array of strings

AZ to which the instance brokers belong. The AZ ID is returned.

available_zone_names

Array of strings

Name of the AZ to which the instance node belongs. The AZ name is returned.

total_storage_space

Integer

Message storage space in GB.

public_connect_address

String

Instance public access address. This parameter is available only when public access is enabled for the instance.

public_connect_domain_name

String

Public network access domain name of the instance. This parameter is available only when public access is enabled for the instance.

storage_resource_id

String

Storage resource ID.

storage_spec_code

String

I/O specifications.

service_type

String

Service type.

storage_type

String

Storage class.

retention_policy

String

Message retention policy.

kafka_public_status

String

Whether public access is enabled for Kafka.

public_bandwidth

Integer

Public network access bandwidth.

enable_log_collection

Boolean

Whether log collection is enabled.

new_auth_cert

Boolean

Indicates whether to enable a new certificate.

cross_vpc_info

String

Cross-VPC access information.

ipv6_enable

Boolean

Whether IPv6 is enabled.

ipv6_connect_addresses

Array of strings

IPv6 connection address.

connector_enable

Boolean

Whether dumping is enabled. Dumping is not supported for the new specification type.

connector_node_num

Integer

Number of connectors.

connector_id

String

Dumping task ID.

rest_enable

Boolean

Whether Kafka REST is enabled.

rest_connect_address

String

Kafka REST connection address.

public_boundwidth

Integer

Public network access bandwidth. This parameter is deprecated and will be deleted.

message_query_inst_enable

Boolean

Whether message query is enabled.

vpc_client_plain

Boolean

Whether intra-VPC plaintext access is enabled.

support_features

String

List of features supported by the Kafka instance.

trace_enable

Boolean

Whether message tracing is enabled.

agent_enable

Boolean

Indicates whether the proxy is enabled.

pod_connect_address

String

Connection address on the tenant side.

disk_encrypted

Boolean

Whether disk encryption is enabled.

disk_encrypted_key

String

Disk encryption key. If disk encryption is not enabled, this parameter is left blank.

kafka_private_connect_address

String

Private connection address of a Kafka instance.

kafka_private_connect_domain_name

String

Private connection domain name of a Kafka instance.

ces_version

String

Cloud Eye version.

public_access_enabled

String

Public access status of the instance.

Value range:

  • true: enabled

  • actived: disabled

  • closed: disabled

  • false: disabled

node_num

Integer

Node quantity.

port_protocols

PortProtocolsEntity object

Connection modes and addresses supported by an instance.

enable_acl

Boolean

Indicates whether access control is enabled.

new_spec_billing_enable

Boolean

Whether billing based on new specifications is enabled.

broker_num

Integer

Broker quantity.

tags

Array of TagEntity objects

Tag list.

dr_enable

Boolean

Indicates whether disaster recovery (DR) is enabled.
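
The connection and security fields above (kafka_private_connect_address, kafka_security_protocol, sasl_enabled_mechanisms, and access_user) determine how clients connect to the instance. The following is a minimal sketch, not part of this API, assuming the confluent-kafka Python client; the password and CA certificate path are illustrative placeholders.

# A sketch of deriving Kafka client settings from the instance details, assuming
# the confluent-kafka Python client. Credentials and the CA file are placeholders.
from confluent_kafka import Producer

instance = {
    # Illustrative values taken from a queried instance.
    "kafka_private_connect_address": "192.168.0.61:9092,192.168.0.100:9092,192.168.0.72:9092",
    "kafka_security_protocol": "SASL_SSL",
    "sasl_enabled_mechanisms": ["SCRAM-SHA-512"],
    "access_user": "dmsuser",
}

conf = {
    # The connect address is already a comma-separated broker list, the format
    # expected by bootstrap.servers.
    "bootstrap.servers": instance["kafka_private_connect_address"],
    "security.protocol": instance["kafka_security_protocol"],
}
if instance["kafka_security_protocol"] in ("SASL_SSL", "SASL_PLAINTEXT"):
    conf["sasl.mechanism"] = instance["sasl_enabled_mechanisms"][0]
    conf["sasl.username"] = instance["access_user"]
    conf["sasl.password"] = "<password set when the instance was created>"
if instance["kafka_security_protocol"] == "SASL_SSL":
    conf["ssl.ca.location"] = "/path/to/instance-ca.pem"   # placeholder

producer = Producer(conf)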

Table 3 PortProtocolsEntity

Parameter

Type

Description

private_plain_enable

Boolean

Whether the instance supports private plaintext access.

  • true: Yes

  • false: No

private_plain_address

String

Connection addresses of Kafka private plaintext access.

private_plain_domain_name

String

Private plaintext connection domain name.

private_sasl_ssl_enable

Boolean

Whether the instance supports private SASL_SSL access.

  • true: Yes

  • false: No

private_sasl_ssl_address

String

Connection addresses of the Kafka private SASL_SSL access mode.

private_sasl_ssl_domain_name

String

Private SASL_SSL connection domain name.

private_sasl_plaintext_enable

Boolean

Whether the instance supports private SASL_PLAINTEXT access.

  • true: Yes

  • false: No

private_sasl_plaintext_address

String

Connection addresses of the Kafka private SASL_PLAINTEXT access mode.

private_sasl_plaintext_domain_name

String

Private SASL_PLAINTEXT connection domain name.

public_plain_enable

Boolean

Whether the instance supports public plaintext access.

  • true: Yes

  • false: No

public_plain_address

String

Connection addresses of Kafka public plaintext access.

public_plain_domain_name

String

Public plaintext connection domain name.

public_sasl_ssl_enable

Boolean

Whether the instance supports public SASL_SSL access.

  • true: Yes

  • false: No

public_sasl_ssl_address

String

Connection addresses of the Kafka public SASL_SSL access mode.

public_sasl_ssl_domain_name

String

Public SASL_SSL connection domain name.

public_sasl_plaintext_enable

Boolean

Whether the instance supports public SASL_PLAINTEXT access.

  • true: Yes

  • false: No

public_sasl_plaintext_address

String

Connection addresses of the Kafka public SASL_PLAINTEXT access mode.

public_sasl_plaintext_domain_name

String

Public SASL_PLAINTEXT connection domain name.

Table 4 TagEntity

Parameter

Type

Description

key

String

Tag key.

  • Cannot be left blank.

  • Must be unique for the same instance.

  • Can contain 1 to 128 characters.

  • Can contain letters, digits, spaces, and special characters _.:=+-@

  • Cannot start with sys

  • Cannot start or end with a space.

value

String

Tag value.

  • Can contain 0 to 255 characters.

  • Can contain letters, digits, spaces, and special characters _.:=+-@

Example Requests

Querying an instance

GET https://{endpoint}/v2/{project_id}/instances/{instance_id}
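
The following is a minimal sketch of issuing this request with Python, assuming token-based authentication (the X-Auth-Token header) as described in Calling APIs. The endpoint, project ID, instance ID, and token are placeholders.

# A sketch of calling this API with the requests library. The endpoint and token
# are placeholders; see Calling APIs for how to authenticate.
import requests

endpoint = "https://{endpoint}"      # DMS endpoint of your region
project_id = "{project_id}"
instance_id = "{instance_id}"
token = "<IAM token>"

resp = requests.get(
    f"{endpoint}/v2/{project_id}/instances/{instance_id}",
    headers={"X-Auth-Token": token},
)
resp.raise_for_status()
instance = resp.json()
print(instance["name"], instance["status"])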

Example Responses

Status code: 200

Specified instance queried.

{
  "name" : "kafka-2085975099",
  "engine" : "kafka",
  "port" : 9092,
  "status" : "RUNNING",
  "type" : "cluster",
  "specification" : "100MB",
  "engine_version" : "1.1.0",
  "connect_address" : "192.168.0.100,192.168.0.61,192.168.0.72",
  "instance_id" : "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
  "resource_spec_code" : "dms.instance.kafka.cluster.c3.mini",
  "charging_mode" : 1,
  "vpc_id" : "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
  "vpc_name" : "dms-test",
  "created_at" : "1585618587087",
  "product_id" : "00300-30308-0--0",
  "security_group_id" : "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
  "security_group_name" : "Sys-default",
  "subnet_id" : "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
  "available_zones" : [ "38b0f7a602344246bcb0da47b5d548e7" ],
  "available_zone_names" : [ "AZ1" ],
  "user_id" : "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
  "user_name" : "paas_dms",
  "maintain_begin" : "02:00:00",
  "maintain_end" : "06:00:00",
  "enable_log_collection" : false,
  "new_auth_cert" : false,
  "storage_space" : 492,
  "total_storage_space" : 600,
  "used_storage_space" : 25,
  "partition_num" : "300",
  "enable_publicip" : false,
  "ssl_enable" : false,
  "broker_ssl_enable" : false,
  "cert_replaced" : false,
  "management_connect_address" : "https://192.168.0.100:9999",
  "cross_vpc_info" : "{\"192.168.0.61\":{\"advertised_ip\":\"192.168.0.61\",\"port\":9011,\"port_id\":\"xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx\"},\"192.168.0.72\":{\"advertised_ip\":\"192.168.0.72\",\"port\":9011,\"port_id\":\"xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx\"},\"192.168.0.100\":{\"advertised_ip\":\"192.168.0.100\",\"port\":9011,\"port_id\":\"xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx\"}}",
  "storage_resource_id" : "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
  "storage_spec_code" : "dms.physical.storage.ultra",
  "service_type" : "advanced",
  "storage_type" : "hec",
  "enterprise_project_id" : "0",
  "is_logical_volume" : true,
  "extend_times" : 0,
  "retention_policy" : "produce_reject",
  "ipv6_enable" : false,
  "ipv6_connect_addresses" : [ ],
  "connector_enable" : false,
  "connector_node_num" : 0,
  "connector_id" : "",
  "rest_enable" : false,
  "rest_connect_address" : "",
  "kafka_public_status" : "closed",
  "public_bandwidth" : 0,
  "message_query_inst_enable" : true,
  "vpc_client_plain" : false,
  "support_features" : "kafka.new.pod.port,feature.physerver.kafka.topic.modify,feature.physerver.kafka.topic.accesspolicy,message_trace_enable,features.pod.token.access,feature.physerver.kafka.pulbic.dynamic,roma_app_enable,features.log.collection,auto_topic_switch,feature.physerver.kafka.user.manager",
  "trace_enable" : false,
  "agent_enable" : false,
  "pod_connect_address" : "100.86.75.15:9080,100.86.142.77:9080,100.86.250.167:9080",
  "disk_encrypted" : false,
  "kafka_private_connect_address" : "192.168.0.61:9092,192.168.0.100:9092,192.168.0.72:9092",
  "enable_auto_topic" : false,
  "new_spec_billing_enable" : false,
  "ces_version" : "linux",
  "port_protocols" : "{\"private_plain_enable\": true,\"private_plain_address\": \"192.xxx.xxx.xxx:9092,192.xxx.xxx.xxx:9092,192.xxx.xxx.xxx:9092\",\"private_sasl_ssl_enable\": true,\"private_sasl_ssl_address\": \"192.xxx.xxx.xxx:9093,192.xxx.xxx.xxx:9093,192.xxx.xxx.xxx:9093\",\"private_sasl_plaintext_enable\": false,\"private_sasl_plaintext_address\": \"\",\"public_plain_enable\": true,\"public_plain_address\": \"100.xxx.xxx.xxx:9094,100.xxx.xxx.xxx:9094,100.xxx.xxx.xxx:9094\",\"public_sasl_ssl_enable\": true,\"public_sasl_ssl_address\": \"100.xxx.xxx.xxx:9095,100.xxx.xxx.xxx:9095,100.xxx.xxx.xxx:9095\",\"public_sasl_plaintext_enable\": false,\"public_sasl_plaintext_address\": \"\"}"
}
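
In the response above, created_at is a millisecond timestamp returned as a string, and cross_vpc_info and port_protocols are JSON documents serialized as strings. The following is a minimal sketch of decoding these fields; the dict below is an excerpt of the example response.

# A sketch of decoding the string-encoded fields of the instance details.
import json
from datetime import datetime, timezone

instance = {
    # Excerpt of the example response above (cross_vpc_info shortened to one broker).
    "created_at": "1585618587087",
    "cross_vpc_info": "{\"192.168.0.61\":{\"advertised_ip\":\"192.168.0.61\",\"port\":9011,\"port_id\":\"xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx\"}}",
}

# created_at is the number of milliseconds since 1970-01-01 00:00:00 UTC.
created = datetime.fromtimestamp(int(instance["created_at"]) / 1000, tz=timezone.utc)
print(created.isoformat())      # 2020-03-31T01:36:27.087000+00:00

# cross_vpc_info (and port_protocols, when present) must be parsed from strings.
for ip, mapping in json.loads(instance["cross_vpc_info"]).items():
    print(ip, "advertised as", mapping["advertised_ip"], "on port", mapping["port"])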

SDK Sample Code

The SDK sample code is as follows.

Java

package com.huaweicloud.sdk.test;

import com.huaweicloud.sdk.core.auth.ICredential;
import com.huaweicloud.sdk.core.auth.BasicCredentials;
import com.huaweicloud.sdk.core.exception.ConnectionException;
import com.huaweicloud.sdk.core.exception.RequestTimeoutException;
import com.huaweicloud.sdk.core.exception.ServiceResponseException;
import com.huaweicloud.sdk.kafka.v2.region.KafkaRegion;
import com.huaweicloud.sdk.kafka.v2.*;
import com.huaweicloud.sdk.kafka.v2.model.*;


public class ShowInstanceSolution {

    public static void main(String[] args) {
        // The AK and SK used for authentication are hard-coded or stored in plaintext, which poses significant security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use.
        // In this example, the AK and SK are read from environment variables. Before running this example, set the CLOUD_SDK_AK and CLOUD_SDK_SK environment variables in the local environment.
        String ak = System.getenv("CLOUD_SDK_AK");
        String sk = System.getenv("CLOUD_SDK_SK");
        String projectId = "{project_id}";

        ICredential auth = new BasicCredentials()
                .withProjectId(projectId)
                .withAk(ak)
                .withSk(sk);

        KafkaClient client = KafkaClient.newBuilder()
                .withCredential(auth)
                .withRegion(KafkaRegion.valueOf("<YOUR REGION>"))
                .build();
        ShowInstanceRequest request = new ShowInstanceRequest();
        request.withInstanceId("{instance_id}");
        try {
            ShowInstanceResponse response = client.showInstance(request);
            System.out.println(response.toString());
        } catch (ConnectionException e) {
            e.printStackTrace();
        } catch (RequestTimeoutException e) {
            e.printStackTrace();
        } catch (ServiceResponseException e) {
            e.printStackTrace();
            System.out.println(e.getHttpStatusCode());
            System.out.println(e.getRequestId());
            System.out.println(e.getErrorCode());
            System.out.println(e.getErrorMsg());
        }
    }
}

Python

# coding: utf-8

import os
from huaweicloudsdkcore.auth.credentials import BasicCredentials
from huaweicloudsdkkafka.v2.region.kafka_region import KafkaRegion
from huaweicloudsdkcore.exceptions import exceptions
from huaweicloudsdkkafka.v2 import *

if __name__ == "__main__":
    # The AK and SK used for authentication are hard-coded or stored in plaintext, which poses significant security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use.
    # In this example, the AK and SK are read from environment variables. Before running this example, set the CLOUD_SDK_AK and CLOUD_SDK_SK environment variables in the local environment.
    ak = os.environ["CLOUD_SDK_AK"]
    sk = os.environ["CLOUD_SDK_SK"]
    projectId = "{project_id}"

    credentials = BasicCredentials(ak, sk, projectId)

    client = KafkaClient.new_builder() \
        .with_credentials(credentials) \
        .with_region(KafkaRegion.value_of("<YOUR REGION>")) \
        .build()

    try:
        request = ShowInstanceRequest()
        request.instance_id = "{instance_id}"
        response = client.show_instance(request)
        print(response)
    except exceptions.ClientRequestException as e:
        print(e.status_code)
        print(e.request_id)
        print(e.error_code)
        print(e.error_msg)

Go

package main

import (
    "fmt"
    "os"

    "github.com/huaweicloud/huaweicloud-sdk-go-v3/core/auth/basic"
    kafka "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2"
    "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/model"
    region "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/region"
)

func main() {
    // The AK and SK used for authentication are hard-coded or stored in plaintext, which poses significant security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use.
    // In this example, the AK and SK are read from environment variables. Before running this example, set the CLOUD_SDK_AK and CLOUD_SDK_SK environment variables in the local environment.
    ak := os.Getenv("CLOUD_SDK_AK")
    sk := os.Getenv("CLOUD_SDK_SK")
    projectId := "{project_id}"

    auth := basic.NewCredentialsBuilder().
        WithAk(ak).
        WithSk(sk).
        WithProjectId(projectId).
        Build()

    client := kafka.NewKafkaClient(
        kafka.KafkaClientBuilder().
            WithRegion(region.ValueOf("<YOUR REGION>")).
            WithCredential(auth).
            Build())

    request := &model.ShowInstanceRequest{}
    request.InstanceId = "{instance_id}"
    response, err := client.ShowInstance(request)
    if err == nil {
        fmt.Printf("%+v\n", response)
    } else {
        fmt.Println(err)
    }
}

For SDK sample code of more programming languages, see the Sample Code tab in API Explorer. SDK sample code can be automatically generated.


Status Codes

Status Code

Description

200

Specified instance queried.

Error Codes

See Error Codes.
