更新时间:2024-12-16 GMT+08:00

创建实例

功能介绍

创建实例。

该接口支持创建按需和包周期两种计费方式的实例。

调用方法

请参见如何调用API。

URI

POST /v2/{engine}/{project_id}/instances

表1 路径参数

参数

是否必选

参数类型

描述

engine

String

消息引擎。

project_id

String

项目ID,获取方式请参见获取项目ID。

请求参数

表2 请求Body参数

参数

是否必选

参数类型

描述

name

String

实例名称。

以英文字母开头,只能由英文字母、数字、中划线、下划线组成,长度为4~64个字符。

description

String

实例的描述信息。

长度不超过1024的字符串。

说明:

\与"在JSON报文中属于特殊字符,如果参数值中需要显示\或者"字符,请在该字符前增加转义字符\,即写成\\或者\"。

engine

String

消息引擎。取值填写为:kafka。

engine_version

String

消息引擎的版本。取值填写为:

  • 1.1.0

  • 2.7

  • 3.x

broker_num

Integer

代理个数。

storage_space

Integer

消息存储空间,单位GB。

  • Kafka实例规格为c6.2u4g.cluster时,存储空间取值范围300GB ~ 300000GB。

  • Kafka实例规格为c6.4u8g.cluster时,存储空间取值范围300GB ~ 600000GB。

  • Kafka实例规格为c6.8u16g.cluster时,存储空间取值范围300GB ~ 1500000GB。

  • Kafka实例规格为c6.12u24g.cluster时,存储空间取值范围300GB ~ 1500000GB。

  • Kafka实例规格为c6.16u32g.cluster时,存储空间取值范围300GB ~ 1500000GB。

access_user

String

当ssl_enable为true时,该参数必选,ssl_enable为false时,该参数无效。

认证用户名,只能以英文字母开头,且由英文字母、数字、中划线、下划线组成,长度为4~64个字符。

password

String

当ssl_enable为true时,该参数必选,ssl_enable为false时,该参数无效。

实例的认证密码。

复杂度要求如下(表2之后附有一个本地预校验的Python示例片段,仅作示意):

  • 输入长度为8到32位的字符串。

  • 必须包含如下四种字符中的三种组合:

    • 小写字母

    • 大写字母

    • 数字

    • 特殊字符包括(`~!@#$%^&*()-_=+|[{}]:'",<.>/?)和空格,并且不能以-开头

vpc_id

String

虚拟私有云ID。

获取方法如下:登录虚拟私有云服务的控制台界面,在虚拟私有云的详情页面查找VPC ID。

security_group_id

String

指定实例所属的安全组。

获取方法如下:登录虚拟私有云服务的控制台界面,在安全组的详情页面查找安全组ID。

subnet_id

String

子网信息。

获取方法如下:登录虚拟私有云服务的控制台界面,单击VPC下的子网,进入子网详情页面,查找网络ID。

available_zones

Array of strings

将节点创建到指定且有资源的可用区,取值为可用区ID。可用区ID请参考查询可用区信息获取。

该参数不能为空数组或者数组的值为空。

创建Kafka实例时,支持将节点部署在1个、3个或3个以上的可用区。指定多个可用区时,以逗号分隔。

product_id

String

产品ID。

产品ID可以从查询产品规格列表获取。

maintain_begin

String

维护时间窗开始时间,格式为HH:mm。

maintain_end

String

维护时间窗结束时间,格式为HH:mm。

enable_publicip

Boolean

是否开启公网访问功能。默认不开启公网。

  • true:开启

  • false:不开启

tenant_ips

Array of strings

创建实例时可以手动指定实例节点的内网IP地址,仅支持指定IPv4地址。

指定内网IP地址数量必须小于等于购买的节点数量。

如果指定的内网IP地址数量小于购买的节点数量,系统会自动为剩余的节点随机分配内网IP地址。

publicip_id

String

实例绑定的弹性IP地址的ID。

以英文逗号隔开多个弹性IP地址的ID。

如果开启了公网访问功能(即enable_publicip为true),该字段为必选。

ssl_enable

Boolean

是否开启SASL加密访问。

  • true:开启SASL加密访问。

  • false:关闭SASL加密访问。

kafka_security_protocol

String

开启SASL后使用的安全协议,如果开启了SASL认证功能(即ssl_enable=true),该字段为必选。

若该字段值为空,默认开启SASL_SSL认证机制。

实例创建后将不支持动态开启和关闭。

  • SASL_SSL: 采用SSL证书进行加密传输,支持账号密码认证,安全性更高。

  • SASL_PLAINTEXT: 明文传输,支持账号密码认证,性能更好,建议使用SCRAM-SHA-512机制。

sasl_enabled_mechanisms

Array of strings

开启SASL后使用的认证机制,如果开启了SASL认证功能(即ssl_enable=true),该字段为必选。

若该字段值为空,默认开启PLAIN认证机制。

选择其一进行SASL认证即可,支持同时开启两种认证机制。

取值如下:

  • PLAIN: 简单的用户名密码校验。

  • SCRAM-SHA-512: 用户凭证校验,安全性比PLAIN机制更高。

retention_policy

String

磁盘的容量到达容量阈值后,对于消息的处理策略。

取值如下:

  • produce_reject:表示拒绝消息写入。

  • time_base:表示自动删除最老消息。

ipv6_enable

Boolean

是否开启IPv6。仅在虚拟私有云支持IPv6时生效。

disk_encrypted_enable

Boolean

是否开启磁盘加密。

disk_encrypted_key

String

磁盘加密key,未开启磁盘加密时为空。

connector_enable

Boolean

是否开启消息转储功能。

默认不开启消息转储。

enable_auto_topic

Boolean

是否打开kafka自动创建topic功能。

  • true:开启

  • false:关闭

当您选择开启,表示生产或消费一个未创建的Topic时,会自动创建一个包含3个分区和3个副本的Topic。

默认为false,即关闭。

storage_spec_code

String

存储IO规格。

取值范围:

  • dms.physical.storage.high.v2:使用高IO的磁盘类型。

  • dms.physical.storage.ultra.v2:使用超高IO的磁盘类型。

如何选择磁盘类型请参考《云硬盘 产品介绍》的“磁盘类型及性能介绍”。

enterprise_project_id

String

企业项目ID。若为企业项目账号,该参数必填。

tags

Array of TagEntity objects

标签列表。

arch_type

String

CPU架构。当前只支持X86架构。

取值范围:

  • X86

vpc_client_plain

Boolean

是否开启VPC内网明文访问。

bss_param

BssParam object

表示包周期计费模式的相关参数。

如果为空,则默认计费模式为按需计费;否则是包周期方式。
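上文password参数的复杂度要求,可以在发起请求前先做一次本地预校验。下面是一个按本节描述整理的Python校验片段,仅作示意:其中的函数名与校验逻辑均为示例假设,只检查长度、开头字符和字符种类组合,最终请以接口实际校验结果为准。

# 示意代码:按上文password复杂度要求做本地预校验(非官方SDK能力,仅供参考)
SPECIAL_CHARS = "`~!@#$%^&*()-_=+\\|[{}]:'\",<.>/? "   # 文档列出的特殊字符及空格

def check_password(password: str) -> bool:
    # 长度为8到32位
    if not 8 <= len(password) <= 32:
        return False
    # 不能以-开头
    if password.startswith("-"):
        return False
    # 小写字母、大写字母、数字、特殊字符(含空格)四类中至少包含三类
    categories = [
        any(c.islower() for c in password),
        any(c.isupper() for c in password),
        any(c.isdigit() for c in password),
        any(c in SPECIAL_CHARS for c in password),
    ]
    return sum(categories) >= 3

print(check_password("Kafka_2024!"))   # True:包含大小写字母、数字和特殊字符
print(check_password("short1A"))       # False:长度不足8位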

表3 TagEntity

参数

是否必选

参数类型

描述

key

String

标签键。

  • 不能为空。

  • 对于同一个实例,Key值唯一。

  • 长度为1~128个字符(中文也可以输入128个字符)。

由任意语种字母、数字、空格和特殊字符组成,特殊字符仅支持_ . : = + - @。

  • 不能以_sys_开头。

  • 首尾字符不能为空格。

value

String

标签值。

  • 长度为0~255个字符(中文也可以输入255个字符)。

由任意语种字母、数字、空格和特殊字符组成,特殊字符仅支持_ . : = + - @。

表4 BssParam

参数

是否必选

参数类型

描述

is_auto_renew

Boolean

是否自动续订。

取值范围:

  • true: 自动续订。

  • false: 不自动续订。

默认不自动续订。

charging_mode

String

计费模式。

功能说明:付费方式。

取值范围:

  • prePaid:预付费,即包年包月;

  • postPaid:后付费,即按需付费;

默认为postPaid。

is_auto_pay

Boolean

下单订购后,是否自动从客户的账户中支付,而不需要客户手动去进行支付。

取值范围:

  • true:是(自动支付)

  • false:否(需要客户手动支付)

默认为手动支付。

period_type

String

订购周期类型。

取值范围:

  • month:月

  • year:年

charging_mode为prePaid时生效且为必选参数。

period_num

Integer

订购周期数。

取值范围:

  • period_type=month(周期类型为月)时,取值为[1,9];

  • period_type=year(周期类型为年)时,取值为[1,3];

charging_mode为prePaid时生效且为必选参数。

响应参数

状态码: 200

表5 响应Body参数

参数

参数类型

描述

instance_id

String

实例ID

请求示例

  • 创建一个按需付费的Kafka实例,版本为2.7,规格为2U4G*3,300GB的存储空间。

    POST https://{endpoint}/v2/{engine}/{project_id}/instances
    
    {
      "name" : "kafka-test",
      "description" : "",
      "engine" : "kafka",
      "engine_version" : "2.7",
      "storage_space" : 300,
      "vpc_id" : "********-9b4a-44c5-a964-************",
      "subnet_id" : "********-8fbf-4438-ba71-************",
      "security_group_id" : "********-e073-4aad-991f-************",
      "available_zones" : [ "********706d4c1fb0eb72f0********" ],
      "product_id" : "c6.2u4g.cluster",
      "ssl_enable" : true,
      "kafka_security_protocol" : "SASL_SSL",
      "sasl_enabled_mechanisms" : [ "SCRAM-SHA-512" ],
      "storage_spec_code" : "dms.physical.storage.ultra.v2",
      "broker_num" : 3,
      "arch_type" : "X86",
      "enterprise_project_id" : "0",
      "access_user" : "********",
      "password" : "********",
      "enable_publicip" : true,
      "tags" : [ {
        "key" : "aaa",
        "value" : "111"
      } ],
      "retention_policy" : "time_base",
      "disk_encrypted_enable" : true,
      "disk_encrypted_key" : "********-b953-4875-a743-************",
      "publicip_id" : "********-88fc-4a8c-86d0-************,********-16af-455d-8d54-************,********-3d69-4367-95ab-************",
      "vpc_client_plain" : true,
      "enable_auto_topic" : true,
      "tenant_ips" : [ "127.xx.xx.x", "127.xx.xx.x", "127.xx.xx.x" ]
    }
  • 创建一个包年包月的Kafka实例,版本为2.7,规格为2U4G*3,300GB的存储空间。

    POST https://{endpoint}/v2/{engine}/{project_id}/instances
    
    {
      "name" : "kafka-test1",
      "description" : "",
      "engine" : "kafka",
      "engine_version" : "2.7",
      "storage_space" : 300,
      "vpc_id" : "********-9b4a-44c5-a964-************",
      "subnet_id" : "********-8fbf-4438-ba71-************",
      "security_group_id" : "********-e073-4aad-991f-************",
      "available_zones" : [ "********706d4c1fb0eb72f0********" ],
      "product_id" : "c6.2u4g.cluster",
      "ssl_enable" : true,
      "kafka_security_protocol" : "SASL_SSL",
      "sasl_enabled_mechanisms" : [ "SCRAM-SHA-512" ],
      "storage_spec_code" : "dms.physical.storage.ultra.v2",
      "broker_num" : 3,
      "arch_type" : "X86",
      "enterprise_project_id" : "0",
      "access_user" : "********",
      "password" : "********",
      "enable_publicip" : true,
      "tags" : [ {
        "key" : "aaa",
        "value" : "111"
      } ],
      "retention_policy" : "time_base",
      "publicip_id" : "********-88fc-4a8c-86d0-************,********-16af-455d-8d54-************,********-3d69-4367-95ab-************",
      "vpc_client_plain" : true,
      "enable_auto_topic" : true,
      "bss_param" : {
        "charging_mode" : "prePaid",
        "period_type" : "month",
        "period_num" : 1,
        "is_auto_pay" : true
      },
      "tenant_ips" : [ "127.xx.xx.x", "127.xx.xx.x", "127.xx.xx.x" ]
    }

响应示例

状态码: 200

创建实例成功。

{
  "instance_id" : "8959ab1c-7n1a-yyb1-a05t-93dfc361b32d"
}
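除使用SDK外,也可以直接以REST方式调用该接口。下面给出一个使用Python requests库的最小示意片段,假设已按"如何调用API"中的说明获取到用户Token并通过X-Auth-Token请求头传入;其中{endpoint}、{project_id}及各资源ID均为需要替换的占位符,请求体仅列出部分字段作示意,字段含义请参见上文表2:

# 示意代码:以REST方式调用创建实例接口(占位符需替换为实际值)
import json
import requests

endpoint = "https://{endpoint}"      # 区域终端节点
project_id = "{project_id}"          # 项目ID
token = "<YOUR_IAM_TOKEN>"           # 按"如何调用API"获取的用户Token

url = f"{endpoint}/v2/kafka/{project_id}/instances"
headers = {"Content-Type": "application/json", "X-Auth-Token": token}

body = {
    "name": "kafka-test",
    "engine": "kafka",
    "engine_version": "2.7",
    "broker_num": 3,
    "storage_space": 300,
    "product_id": "c6.2u4g.cluster",
    "storage_spec_code": "dms.physical.storage.ultra.v2",
    "vpc_id": "<VPC_ID>",
    "subnet_id": "<SUBNET_ID>",
    "security_group_id": "<SECURITY_GROUP_ID>",
    "available_zones": ["<AZ_ID>"]
}

response = requests.post(url, headers=headers, data=json.dumps(body))
print(response.status_code)          # 成功时为200
print(response.json())               # 成功时返回 {"instance_id": "..."}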

SDK代码示例

SDK代码示例如下。

  • 创建一个按需付费的Kafka实例,版本为2.7,规格为2U4G*3,300GB的存储空间。

    package com.huaweicloud.sdk.test;
    
    import com.huaweicloud.sdk.core.auth.ICredential;
    import com.huaweicloud.sdk.core.auth.BasicCredentials;
    import com.huaweicloud.sdk.core.exception.ConnectionException;
    import com.huaweicloud.sdk.core.exception.RequestTimeoutException;
    import com.huaweicloud.sdk.core.exception.ServiceResponseException;
    import com.huaweicloud.sdk.kafka.v2.region.KafkaRegion;
    import com.huaweicloud.sdk.kafka.v2.*;
    import com.huaweicloud.sdk.kafka.v2.model.*;
    
    import java.util.List;
    import java.util.ArrayList;
    
    public class CreateInstanceByEngineSolution {
    
        public static void main(String[] args) {
            // The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
            // In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
            String ak = System.getenv("CLOUD_SDK_AK");
            String sk = System.getenv("CLOUD_SDK_SK");
            String projectId = "{project_id}";
    
            ICredential auth = new BasicCredentials()
                    .withProjectId(projectId)
                    .withAk(ak)
                    .withSk(sk);
    
            KafkaClient client = KafkaClient.newBuilder()
                    .withCredential(auth)
                    .withRegion(KafkaRegion.valueOf("<YOUR REGION>"))
                    .build();
            CreateInstanceByEngineRequest request = new CreateInstanceByEngineRequest();
            request.withEngine(CreateInstanceByEngineRequest.EngineEnum.fromValue("{engine}"));
            CreateInstanceByEngineReq body = new CreateInstanceByEngineReq();
            List<TagEntity> listbodyTags = new ArrayList<>();
            listbodyTags.add(
                new TagEntity()
                    .withKey("aaa")
                    .withValue("111")
            );
            List<CreateInstanceByEngineReq.SaslEnabledMechanismsEnum> listbodySaslEnabledMechanisms = new ArrayList<>();
            listbodySaslEnabledMechanisms.add(CreateInstanceByEngineReq.SaslEnabledMechanismsEnum.fromValue("SCRAM-SHA-512"));
            List<String> listbodyAvailableZones = new ArrayList<>();
            listbodyAvailableZones.add("********706d4c1fb0eb72f0********");
            body.withVpcClientPlain(true);
            body.withArchType("X86");
            body.withTags(listbodyTags);
            body.withEnterpriseProjectId("0");
            body.withStorageSpecCode(CreateInstanceByEngineReq.StorageSpecCodeEnum.fromValue("dms.physical.storage.ultra.v2"));
            body.withEnableAutoTopic(true);
            body.withDiskEncryptedKey("********-b953-4875-a743-************");
            body.withDiskEncryptedEnable(true);
            body.withRetentionPolicy(CreateInstanceByEngineReq.RetentionPolicyEnum.fromValue("time_base"));
            body.withSaslEnabledMechanisms(listbodySaslEnabledMechanisms);
            body.withKafkaSecurityProtocol("SASL_SSL");
            body.withSslEnable(true);
            body.withPublicipId("********-88fc-4a8c-86d0-************,********-16af-455d-8d54-************,********-3d69-4367-95ab-************");
            body.withEnablePublicip(true);
            body.withProductId("c6.2u4g.cluster");
            body.withAvailableZones(listbodyAvailableZones);
            body.withSubnetId("********-8fbf-4438-ba71-************");
            body.withSecurityGroupId("********-e073-4aad-991f-************");
            body.withVpcId("********-9b4a-44c5-a964-************");
            body.withPassword("********");
            body.withAccessUser("********");
            body.withStorageSpace(300);
            body.withBrokerNum(3);
            body.withEngineVersion("2.7");
            body.withEngine(CreateInstanceByEngineReq.EngineEnum.fromValue("kafka"));
            body.withDescription("");
            body.withName("kafka-test");
            request.withBody(body);
            try {
                CreateInstanceByEngineResponse response = client.createInstanceByEngine(request);
                System.out.println(response.toString());
            } catch (ConnectionException e) {
                e.printStackTrace();
            } catch (RequestTimeoutException e) {
                e.printStackTrace();
            } catch (ServiceResponseException e) {
                e.printStackTrace();
                System.out.println(e.getHttpStatusCode());
                System.out.println(e.getRequestId());
                System.out.println(e.getErrorCode());
                System.out.println(e.getErrorMsg());
            }
        }
    }
    
  • 创建一个包年包月的Kafka实例,版本为2.7,规格为2U4G*3,300GB的存储空间。

    package com.huaweicloud.sdk.test;
    
    import com.huaweicloud.sdk.core.auth.ICredential;
    import com.huaweicloud.sdk.core.auth.BasicCredentials;
    import com.huaweicloud.sdk.core.exception.ConnectionException;
    import com.huaweicloud.sdk.core.exception.RequestTimeoutException;
    import com.huaweicloud.sdk.core.exception.ServiceResponseException;
    import com.huaweicloud.sdk.kafka.v2.region.KafkaRegion;
    import com.huaweicloud.sdk.kafka.v2.*;
    import com.huaweicloud.sdk.kafka.v2.model.*;
    
    import java.util.List;
    import java.util.ArrayList;
    
    public class CreateInstanceByEngineSolution {
    
        public static void main(String[] args) {
            // The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
            // In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
            String ak = System.getenv("CLOUD_SDK_AK");
            String sk = System.getenv("CLOUD_SDK_SK");
            String projectId = "{project_id}";
    
            ICredential auth = new BasicCredentials()
                    .withProjectId(projectId)
                    .withAk(ak)
                    .withSk(sk);
    
            KafkaClient client = KafkaClient.newBuilder()
                    .withCredential(auth)
                    .withRegion(KafkaRegion.valueOf("<YOUR REGION>"))
                    .build();
            CreateInstanceByEngineRequest request = new CreateInstanceByEngineRequest();
            request.withEngine(CreateInstanceByEngineRequest.EngineEnum.fromValue("{engine}"));
            CreateInstanceByEngineReq body = new CreateInstanceByEngineReq();
            BssParam bssParambody = new BssParam();
            bssParambody.withChargingMode(BssParam.ChargingModeEnum.fromValue("prePaid"))
                .withIsAutoPay(true)
                .withPeriodType(BssParam.PeriodTypeEnum.fromValue("month"))
                .withPeriodNum(1);
            List<TagEntity> listbodyTags = new ArrayList<>();
            listbodyTags.add(
                new TagEntity()
                    .withKey("aaa")
                    .withValue("111")
            );
            List<CreateInstanceByEngineReq.SaslEnabledMechanismsEnum> listbodySaslEnabledMechanisms = new ArrayList<>();
            listbodySaslEnabledMechanisms.add(CreateInstanceByEngineReq.SaslEnabledMechanismsEnum.fromValue("SCRAM-SHA-512"));
            List<String> listbodyAvailableZones = new ArrayList<>();
            listbodyAvailableZones.add("********706d4c1fb0eb72f0********");
            body.withBssParam(bssParambody);
            body.withVpcClientPlain(true);
            body.withArchType("X86");
            body.withTags(listbodyTags);
            body.withEnterpriseProjectId("0");
            body.withStorageSpecCode(CreateInstanceByEngineReq.StorageSpecCodeEnum.fromValue("dms.physical.storage.ultra.v2"));
            body.withEnableAutoTopic(true);
            body.withRetentionPolicy(CreateInstanceByEngineReq.RetentionPolicyEnum.fromValue("time_base"));
            body.withSaslEnabledMechanisms(listbodySaslEnabledMechanisms);
            body.withKafkaSecurityProtocol("SASL_SSL");
            body.withSslEnable(true);
            body.withPublicipId("********-88fc-4a8c-86d0-************,********-16af-455d-8d54-************,********-3d69-4367-95ab-************");
            body.withEnablePublicip(true);
            body.withProductId("c6.2u4g.cluster");
            body.withAvailableZones(listbodyAvailableZones);
            body.withSubnetId("********-8fbf-4438-ba71-************");
            body.withSecurityGroupId("********-e073-4aad-991f-************");
            body.withVpcId("********-9b4a-44c5-a964-************");
            body.withPassword("********");
            body.withAccessUser("********");
            body.withStorageSpace(300);
            body.withBrokerNum(3);
            body.withEngineVersion("2.7");
            body.withEngine(CreateInstanceByEngineReq.EngineEnum.fromValue("kafka"));
            body.withDescription("");
            body.withName("kafka-test1");
            request.withBody(body);
            try {
                CreateInstanceByEngineResponse response = client.createInstanceByEngine(request);
                System.out.println(response.toString());
            } catch (ConnectionException e) {
                e.printStackTrace();
            } catch (RequestTimeoutException e) {
                e.printStackTrace();
            } catch (ServiceResponseException e) {
                e.printStackTrace();
                System.out.println(e.getHttpStatusCode());
                System.out.println(e.getRequestId());
                System.out.println(e.getErrorCode());
                System.out.println(e.getErrorMsg());
            }
        }
    }
    
  • 创建一个按需付费的Kafka实例,版本为2.7,规格为2U4G*3,300GB的存储空间。

    # coding: utf-8
    
    import os
    from huaweicloudsdkcore.auth.credentials import BasicCredentials
    from huaweicloudsdkkafka.v2.region.kafka_region import KafkaRegion
    from huaweicloudsdkcore.exceptions import exceptions
    from huaweicloudsdkkafka.v2 import *
    
    if __name__ == "__main__":
        # The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
        # In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
        ak = os.environ["CLOUD_SDK_AK"]
        sk = os.environ["CLOUD_SDK_SK"]
        projectId = "{project_id}"
    
        credentials = BasicCredentials(ak, sk, projectId)
    
        client = KafkaClient.new_builder() \
            .with_credentials(credentials) \
            .with_region(KafkaRegion.value_of("<YOUR REGION>")) \
            .build()
    
        try:
            request = CreateInstanceByEngineRequest()
            request.engine = "{engine}"
            listTagsbody = [
                TagEntity(
                    key="aaa",
                    value="111"
                )
            ]
            listSaslEnabledMechanismsbody = [
                "SCRAM-SHA-512"
            ]
            listAvailableZonesbody = [
                "********706d4c1fb0eb72f0********"
            ]
            request.body = CreateInstanceByEngineReq(
                vpc_client_plain=True,
                arch_type="X86",
                tags=listTagsbody,
                enterprise_project_id="0",
                storage_spec_code="dms.physical.storage.ultra.v2",
                enable_auto_topic=True,
                disk_encrypted_key="********-b953-4875-a743-************",
                disk_encrypted_enable=True,
                retention_policy="time_base",
                sasl_enabled_mechanisms=listSaslEnabledMechanismsbody,
                kafka_security_protocol="SASL_SSL",
                ssl_enable=True,
                publicip_id="********-88fc-4a8c-86d0-************,********-16af-455d-8d54-************,********-3d69-4367-95ab-************",
                enable_publicip=True,
                product_id="c6.2u4g.cluster",
                available_zones=listAvailableZonesbody,
                subnet_id="********-8fbf-4438-ba71-************",
                security_group_id="********-e073-4aad-991f-************",
                vpc_id="********-9b4a-44c5-a964-************",
                password="********",
                access_user="********",
                storage_space=300,
                broker_num=3,
                engine_version="2.7",
                engine="kafka",
                description="",
                name="kafka-test"
            )
            response = client.create_instance_by_engine(request)
            print(response)
        except exceptions.ClientRequestException as e:
            print(e.status_code)
            print(e.request_id)
            print(e.error_code)
            print(e.error_msg)
    
  • 创建一个包年包月的Kafka实例,版本为2.7,规格为2U4G*3,300GB的存储空间。

    # coding: utf-8
    
    import os
    from huaweicloudsdkcore.auth.credentials import BasicCredentials
    from huaweicloudsdkkafka.v2.region.kafka_region import KafkaRegion
    from huaweicloudsdkcore.exceptions import exceptions
    from huaweicloudsdkkafka.v2 import *
    
    if __name__ == "__main__":
        # The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
        # In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
        ak = os.environ["CLOUD_SDK_AK"]
        sk = os.environ["CLOUD_SDK_SK"]
        projectId = "{project_id}"
    
        credentials = BasicCredentials(ak, sk, projectId)
    
        client = KafkaClient.new_builder() \
            .with_credentials(credentials) \
            .with_region(KafkaRegion.value_of("<YOUR REGION>")) \
            .build()
    
        try:
            request = CreateInstanceByEngineRequest()
            request.engine = "{engine}"
            bssParambody = BssParam(
                charging_mode="prePaid",
                is_auto_pay=True,
                period_type="month",
                period_num=1
            )
            listTagsbody = [
                TagEntity(
                    key="aaa",
                    value="111"
                )
            ]
            listSaslEnabledMechanismsbody = [
                "SCRAM-SHA-512"
            ]
            listAvailableZonesbody = [
                "********706d4c1fb0eb72f0********"
            ]
            request.body = CreateInstanceByEngineReq(
                bss_param=bssParambody,
                vpc_client_plain=True,
                arch_type="X86",
                tags=listTagsbody,
                enterprise_project_id="0",
                storage_spec_code="dms.physical.storage.ultra.v2",
                enable_auto_topic=True,
                retention_policy="time_base",
                sasl_enabled_mechanisms=listSaslEnabledMechanismsbody,
                kafka_security_protocol="SASL_SSL",
                ssl_enable=True,
                publicip_id="********-88fc-4a8c-86d0-************,********-16af-455d-8d54-************,********-3d69-4367-95ab-************",
                enable_publicip=True,
                product_id="c6.2u4g.cluster",
                available_zones=listAvailableZonesbody,
                subnet_id="********-8fbf-4438-ba71-************",
                security_group_id="********-e073-4aad-991f-************",
                vpc_id="********-9b4a-44c5-a964-************",
                password="********",
                access_user="********",
                storage_space=300,
                broker_num=3,
                engine_version="2.7",
                engine="kafka",
                description="",
                name="kafka-test1"
            )
            response = client.create_instance_by_engine(request)
            print(response)
        except exceptions.ClientRequestException as e:
            print(e.status_code)
            print(e.request_id)
            print(e.error_code)
            print(e.error_msg)
    
  • 创建一个按需付费的Kafka实例,版本为2.7,规格为2U4G*3,300GB的存储空间。

    package main
    
    import (
        "fmt"
        "os"

        "github.com/huaweicloud/huaweicloud-sdk-go-v3/core/auth/basic"
        kafka "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2"
        "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/model"
        region "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/region"
    )
    
    func main() {
        // The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
        // In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
        ak := os.Getenv("CLOUD_SDK_AK")
        sk := os.Getenv("CLOUD_SDK_SK")
        projectId := "{project_id}"
    
        auth := basic.NewCredentialsBuilder().
            WithAk(ak).
            WithSk(sk).
            WithProjectId(projectId).
            Build()
    
        client := kafka.NewKafkaClient(
            kafka.KafkaClientBuilder().
                WithRegion(region.ValueOf("<YOUR REGION>")).
                WithCredential(auth).
                Build())
    
        request := &model.CreateInstanceByEngineRequest{}
    	request.Engine = model.GetCreateInstanceByEngineRequestEngineEnum().ENGINE
    	keyTags:= "aaa"
    	valueTags:= "111"
    	var listTagsbody = []model.TagEntity{
            {
                Key: &keyTags,
                Value: &valueTags,
            },
        }
    	var listSaslEnabledMechanismsbody = []model.CreateInstanceByEngineReqSaslEnabledMechanisms{
            model.GetCreateInstanceByEngineReqSaslEnabledMechanismsEnum().SCRAM_SHA_512,
        }
    	var listAvailableZonesbody = []string{
            "********706d4c1fb0eb72f0********",
        }
    	vpcClientPlainCreateInstanceByEngineReq:= true
    	archTypeCreateInstanceByEngineReq:= "X86"
    	enterpriseProjectIdCreateInstanceByEngineReq:= "0"
    	enableAutoTopicCreateInstanceByEngineReq:= true
    	diskEncryptedKeyCreateInstanceByEngineReq:= "********-b953-4875-a743-************"
    	diskEncryptedEnableCreateInstanceByEngineReq:= true
    	retentionPolicyCreateInstanceByEngineReq:= model.GetCreateInstanceByEngineReqRetentionPolicyEnum().TIME_BASE
    	kafkaSecurityProtocolCreateInstanceByEngineReq:= "SASL_SSL"
    	sslEnableCreateInstanceByEngineReq:= true
    	publicipIdCreateInstanceByEngineReq:= "********-88fc-4a8c-86d0-************,********-16af-455d-8d54-************,********-3d69-4367-95ab-************"
    	enablePublicipCreateInstanceByEngineReq:= true
    	passwordCreateInstanceByEngineReq:= "********"
    	accessUserCreateInstanceByEngineReq:= "********"
    	descriptionCreateInstanceByEngineReq:= ""
    	request.Body = &model.CreateInstanceByEngineReq{
    		VpcClientPlain: &vpcClientPlainCreateInstanceByEngineReq,
    		ArchType: &archTypeCreateInstanceByEngineReq,
    		Tags: &listTagsbody,
    		EnterpriseProjectId: &enterpriseProjectIdCreateInstanceByEngineReq,
    		StorageSpecCode: model.GetCreateInstanceByEngineReqStorageSpecCodeEnum().DMS_PHYSICAL_STORAGE_ULTRA,
    		EnableAutoTopic: &enableAutoTopicCreateInstanceByEngineReq,
    		DiskEncryptedKey: &diskEncryptedKeyCreateInstanceByEngineReq,
    		DiskEncryptedEnable: &diskEncryptedEnableCreateInstanceByEngineReq,
    		RetentionPolicy: &retentionPolicyCreateInstanceByEngineReq,
    		SaslEnabledMechanisms: &listSaslEnabledMechanismsbody,
    		KafkaSecurityProtocol: &kafkaSecurityProtocolCreateInstanceByEngineReq,
    		SslEnable: &sslEnableCreateInstanceByEngineReq,
    		PublicipId: &publicipIdCreateInstanceByEngineReq,
    		EnablePublicip: &enablePublicipCreateInstanceByEngineReq,
    		ProductId: "c6.2u4g.cluster",
    		AvailableZones: listAvailableZonesbody,
    		SubnetId: "********-8fbf-4438-ba71-************",
    		SecurityGroupId: "********-e073-4aad-991f-************",
    		VpcId: "********-9b4a-44c5-a964-************",
    		Password: &passwordCreateInstanceByEngineReq,
    		AccessUser: &accessUserCreateInstanceByEngineReq,
    		StorageSpace: int32(300),
    		BrokerNum: int32(3),
    		EngineVersion: "2.7",
    		Engine: model.GetCreateInstanceByEngineReqEngineEnum().KAFKA,
    		Description: &descriptionCreateInstanceByEngineReq,
    		Name: "kafka-test",
    	}
    	response, err := client.CreateInstanceByEngine(request)
    	if err == nil {
            fmt.Printf("%+v\n", response)
        } else {
            fmt.Println(err)
        }
    }
    
  • 创建一个包年包月的Kafka实例,版本为2.7,规格为2U4G*3,300GB的存储空间。

    package main
    
    import (
        "fmt"
        "os"

        "github.com/huaweicloud/huaweicloud-sdk-go-v3/core/auth/basic"
        kafka "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2"
        "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/model"
        region "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/region"
    )
    
    func main() {
        // The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
        // In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
        ak := os.Getenv("CLOUD_SDK_AK")
        sk := os.Getenv("CLOUD_SDK_SK")
        projectId := "{project_id}"
    
        auth := basic.NewCredentialsBuilder().
            WithAk(ak).
            WithSk(sk).
            WithProjectId(projectId).
            Build()
    
        client := kafka.NewKafkaClient(
            kafka.KafkaClientBuilder().
                WithRegion(region.ValueOf("<YOUR REGION>")).
                WithCredential(auth).
                Build())
    
        request := &model.CreateInstanceByEngineRequest{}
    	request.Engine = model.GetCreateInstanceByEngineRequestEngineEnum().ENGINE
    	chargingModeBssParam:= model.GetBssParamChargingModeEnum().PRE_PAID
    	isAutoPayBssParam:= true
    	periodTypeBssParam:= model.GetBssParamPeriodTypeEnum().MONTH
    	periodNumBssParam:= int32(1)
    	bssParambody := &model.BssParam{
    		ChargingMode: &chargingModeBssParam,
    		IsAutoPay: &isAutoPayBssParam,
    		PeriodType: &periodTypeBssParam,
    		PeriodNum: &periodNumBssParam,
    	}
    	keyTags:= "aaa"
    	valueTags:= "111"
    	var listTagsbody = []model.TagEntity{
            {
                Key: &keyTags,
                Value: &valueTags,
            },
        }
    	var listSaslEnabledMechanismsbody = []model.CreateInstanceByEngineReqSaslEnabledMechanisms{
            model.GetCreateInstanceByEngineReqSaslEnabledMechanismsEnum().SCRAM_SHA_512,
        }
    	var listAvailableZonesbody = []string{
            "********706d4c1fb0eb72f0********",
        }
    	vpcClientPlainCreateInstanceByEngineReq:= true
    	archTypeCreateInstanceByEngineReq:= "X86"
    	enterpriseProjectIdCreateInstanceByEngineReq:= "0"
    	enableAutoTopicCreateInstanceByEngineReq:= true
    	retentionPolicyCreateInstanceByEngineReq:= model.GetCreateInstanceByEngineReqRetentionPolicyEnum().TIME_BASE
    	kafkaSecurityProtocolCreateInstanceByEngineReq:= "SASL_SSL"
    	sslEnableCreateInstanceByEngineReq:= true
    	publicipIdCreateInstanceByEngineReq:= "********-88fc-4a8c-86d0-************,********-16af-455d-8d54-************,********-3d69-4367-95ab-************"
    	enablePublicipCreateInstanceByEngineReq:= true
    	passwordCreateInstanceByEngineReq:= "********"
    	accessUserCreateInstanceByEngineReq:= "********"
    	descriptionCreateInstanceByEngineReq:= ""
    	request.Body = &model.CreateInstanceByEngineReq{
    		BssParam: bssParambody,
    		VpcClientPlain: &vpcClientPlainCreateInstanceByEngineReq,
    		ArchType: &archTypeCreateInstanceByEngineReq,
    		Tags: &listTagsbody,
    		EnterpriseProjectId: &enterpriseProjectIdCreateInstanceByEngineReq,
    		StorageSpecCode: model.GetCreateInstanceByEngineReqStorageSpecCodeEnum().DMS_PHYSICAL_STORAGE_ULTRA,
    		EnableAutoTopic: &enableAutoTopicCreateInstanceByEngineReq,
    		RetentionPolicy: &retentionPolicyCreateInstanceByEngineReq,
    		SaslEnabledMechanisms: &listSaslEnabledMechanismsbody,
    		KafkaSecurityProtocol: &kafkaSecurityProtocolCreateInstanceByEngineReq,
    		SslEnable: &sslEnableCreateInstanceByEngineReq,
    		PublicipId: &publicipIdCreateInstanceByEngineReq,
    		EnablePublicip: &enablePublicipCreateInstanceByEngineReq,
    		ProductId: "c6.2u4g.cluster",
    		AvailableZones: listAvailableZonesbody,
    		SubnetId: "********-8fbf-4438-ba71-************",
    		SecurityGroupId: "********-e073-4aad-991f-************",
    		VpcId: "********-9b4a-44c5-a964-************",
    		Password: &passwordCreateInstanceByEngineReq,
    		AccessUser: &accessUserCreateInstanceByEngineReq,
    		StorageSpace: int32(300),
    		BrokerNum: int32(3),
    		EngineVersion: "2.7",
    		Engine: model.GetCreateInstanceByEngineReqEngineEnum().KAFKA,
    		Description: &descriptionCreateInstanceByEngineReq,
    		Name: "kafka-test1",
    	}
    	response, err := client.CreateInstanceByEngine(request)
    	if err == nil {
            fmt.Printf("%+v\n", response)
        } else {
            fmt.Println(err)
        }
    }
    

更多编程语言的SDK代码示例,请参见API Explorer的代码示例页签,可自动生成对应的SDK代码示例。

状态码

状态码

描述

200

创建实例成功。

错误码

请参见错误码。

相关文档