
Creating a Kafka Instance

Function

This API is used to create a Kafka instance.

Both pay-per-use and yearly/monthly billing modes are supported.

Calling Method

For details, see Calling APIs. A minimal request sketch is shown after Table 1.

URI

POST /v2/{project_id}/kafka/instances

Table 1 Path parameters

project_id (String): Project ID. For details about how to obtain it, see Obtaining a Project ID.
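
For reference, the following is a minimal curl sketch of the call, using the URI above. Here {endpoint} and {project_id} are placeholders, ${TOKEN} is a hypothetical shell variable holding an IAM token obtained as described in Calling APIs, and create-kafka-instance.json is a hypothetical file containing a request body such as those shown in Example Requests.

    curl -X POST "https://{endpoint}/v2/{project_id}/kafka/instances" \
      -H "Content-Type: application/json" \
      -H "X-Auth-Token: ${TOKEN}" \
      -d @create-kafka-instance.json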

Request Parameters

Table 2 Request body parameters

name (String): Instance name. It must start with a letter, consist of 4 to 64 characters, and contain only letters, digits, hyphens (-), and underscores (_).

description (String): Description of the instance, a string of at most 1,024 characters.

Note: The backslash (\) and quotation mark (") are special characters in JSON messages. If \ or " needs to appear in a parameter value, add the escape character \ before it, that is, \\ or \".

engine (String): Message engine. Set this parameter to kafka.

engine_version (String): Version of the message engine. Options:
  • 1.1.0
  • 2.7
  • 3.x

broker_num (Integer): Number of brokers.

storage_space (Integer): Message storage space, in GB.
  • For the c6.2u4g.cluster flavor: 300 GB to 300,000 GB.
  • For the c6.4u8g.cluster flavor: 300 GB to 600,000 GB.
  • For the c6.8u16g.cluster flavor: 300 GB to 1,500,000 GB.
  • For the c6.12u24g.cluster flavor: 300 GB to 1,500,000 GB.
  • For the c6.16u32g.cluster flavor: 300 GB to 1,500,000 GB.

access_user (String): Mandatory when ssl_enable is true; invalid when ssl_enable is false. Authentication username. It must start with a letter, consist of 4 to 64 characters, and contain only letters, digits, hyphens (-), and underscores (_).

password (String): Mandatory when ssl_enable is true; invalid when ssl_enable is false. Authentication password of the instance. Complexity requirements (a small validation sketch follows this list):
  • 8 to 32 characters.
  • Must contain at least three of the following character types:
    • Lowercase letters
    • Uppercase letters
    • Digits
    • Special characters (`~!@#$%^&*()-_=+|[{}]:'",<.>/?) and spaces; the password cannot start with a hyphen (-).
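
As a convenience, the following is a minimal Python sketch of a client-side check against the password rules above before sending the request. The helper name is hypothetical; the service performs its own validation regardless.

    def is_valid_kafka_password(password: str) -> bool:
        # Hypothetical client-side check of the documented complexity rules.
        specials = set("`~!@#$%^&*()-_=+|[{}]:'\",<.>/? ")   # documented special characters and space
        if not 8 <= len(password) <= 32:                     # 8 to 32 characters
            return False
        if password.startswith("-"):                         # must not start with a hyphen
            return False
        classes = (
            any(c.islower() for c in password),              # lowercase letters
            any(c.isupper() for c in password),              # uppercase letters
            any(c.isdigit() for c in password),              # digits
            any(c in specials for c in password),            # special characters
        )
        return sum(classes) >= 3                             # at least three of the four types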

vpc_id (String): VPC ID. To obtain it, log in to the VPC console and view the VPC ID on the VPC details page.

security_group_id (String): ID of the security group to which the instance belongs. To obtain it, log in to the VPC console and view the security group ID on the security group details page.

subnet_id (String): Subnet information. To obtain it, log in to the VPC console, click the target subnet under the VPC to go to the subnet details page, and view the network ID.

available_zones (Array of strings): IDs of the AZs where brokers reside and which have available resources. For details about how to obtain an AZ ID, see Querying AZ Information. This parameter cannot be an empty array, and its elements cannot be empty. A Kafka instance can be deployed in 1 AZ, or in 3 or more AZs. When specifying AZs for brokers, separate the AZ IDs with commas (,).

product_id (String): Product ID, which can be obtained from Querying Product Specifications.

maintain_begin (String): Start time of the maintenance window, in HH:mm format.

maintain_end (String): End time of the maintenance window, in HH:mm format.

enable_publicip (Boolean): Whether to enable public access. Public access is disabled by default.
  • true: enable
  • false: disable

tenant_ips (Array of strings): Private IP addresses that can be manually specified for the instance brokers when creating the instance. Only IPv4 addresses are supported. The number of specified IP addresses must be less than or equal to the number of brokers to be created. If it is smaller, the remaining brokers are randomly assigned private IP addresses.

publicip_id (String): ID of the EIP bound to the instance. Use commas (,) to separate the IDs of multiple EIPs. This parameter is mandatory when public access is enabled (that is, enable_publicip is true).

ssl_enable (Boolean): Whether to enable SASL-encrypted access.
  • true: enable SASL-encrypted access.
  • false: disable SASL-encrypted access.

kafka_security_protocol (String): Security protocol used after SASL is enabled.
  • SASL_SSL: Data is encrypted with SSL certificates during transmission and username/password authentication is supported, providing higher security.
  • SASL_PLAINTEXT: Data is transmitted in plaintext and username/password authentication is supported, providing better performance.
If this parameter is left empty, SASL_SSL is enabled by default. This parameter cannot be changed after the instance is created.
If port_protocol is specified when creating the instance, the intranet and public access security protocols of Kafka use the values in port_protocol, and this parameter is invalid.

sasl_enabled_mechanisms (Array of strings): Authentication mechanisms used after SASL is enabled. This parameter is mandatory if SASL authentication is enabled (that is, ssl_enable=true). If it is left empty, PLAIN is enabled by default. Either mechanism alone can be used for SASL authentication, and both can be enabled at the same time. Options:
  • PLAIN: simple username/password verification.
  • SCRAM-SHA-512: user credential verification, which is more secure than PLAIN.

port_protocol (PortProtocol object): Access mode of the Kafka instance. PLAINTEXT indicates plaintext access; SASL_SSL or SASL_PLAINTEXT indicates ciphertext access. Intranet access cannot be disabled, and at least one of plaintext access and ciphertext access must be enabled. The security protocol for cross-VPC access is the same as that for intranet access; if both ciphertext and plaintext access are enabled on the intranet, cross-VPC access preferentially uses the ciphertext security protocol. (An illustrative fragment follows Table 3.)

retention_policy (String): Action to take on messages when the disk usage reaches the threshold. Options:
  • produce_reject: new messages are rejected.
  • time_base: the earliest messages are automatically deleted.

ipv6_enable (Boolean): Whether to enable IPv6. This parameter takes effect only when the VPC supports IPv6.

disk_encrypted_enable (Boolean): Whether to enable disk encryption.

disk_encrypted_key (String): Disk encryption key. Leave it empty if disk encryption is not enabled.

connector_enable (Boolean): Whether to enable message dumping. Message dumping is disabled by default.

enable_auto_topic (Boolean): Whether to enable automatic topic creation.
  • true: enable
  • false: disable
If enabled, a topic with 3 partitions and 3 replicas is automatically created when a message is produced to or consumed from a topic that does not exist. The default value is false.

storage_spec_code (String): Storage I/O specification. Options:
  • dms.physical.storage.high.v2: high I/O disk type.
  • dms.physical.storage.ultra.v2: ultra-high I/O disk type.
  • dms.physical.storage.general: general-purpose SSD disk type.
  • dms.physical.storage.extreme: extreme SSD disk type.
For details about how to select a disk type, see "Disk Types and Performance" in the Elastic Volume Service Product Introduction.

enterprise_project_id (String): Enterprise project ID. This parameter is mandatory for enterprise project accounts.

tags (Array of TagEntity objects): Tag list.

arch_type (String): CPU architecture. Currently, only the x86 architecture is supported. Options:
  • X86

vpc_client_plain (Boolean): Plaintext access within the VPC.

bss_param (BssParam object): Parameters related to the yearly/monthly billing mode. If this parameter is left empty, the billing mode is pay-per-use by default; otherwise, it is yearly/monthly.

Table 3 PortProtocol

private_plain_enable (Boolean): Whether to enable intranet plaintext access. Options:
  • true: enable intranet plaintext access. Connection address: ip:9092; access protocol: PLAINTEXT.
  • false: disable intranet plaintext access.
The default value is false.

private_sasl_ssl_enable (Boolean): Whether to enable intranet ciphertext access using the SASL_SSL security protocol. Options:
  • true: enable intranet ciphertext access using SASL_SSL.
  • false: disable intranet access using SASL_SSL.
private_sasl_ssl_enable and private_sasl_plaintext_enable cannot both be true. The default value is false.

private_sasl_plaintext_enable (Boolean): Whether to enable intranet ciphertext access using the SASL_PLAINTEXT security protocol. Options:
  • true: enable intranet ciphertext access using SASL_PLAINTEXT. Connection address: ip:9093; access protocol: SASL_PLAINTEXT.
  • false: disable intranet ciphertext access using SASL_PLAINTEXT.
private_sasl_plaintext_enable and private_sasl_ssl_enable cannot both be true. The default value is false.

public_plain_enable (Boolean): Whether to enable public plaintext access. Options:
  • true: enable public plaintext access. Connection address: ip:9094; access protocol: PLAINTEXT.
  • false: disable public plaintext access.
Public access must be enabled before public plaintext access can be enabled. The default value is false.

public_sasl_ssl_enable (Boolean): Whether to enable public ciphertext access using the SASL_SSL security protocol. Options:
  • true: enable public ciphertext access using SASL_SSL. Connection address: ip:9095; access protocol: SASL_SSL.
  • false: disable public ciphertext access using SASL_SSL.
public_sasl_ssl_enable and public_sasl_plaintext_enable cannot both be true. When this parameter is true, public access must be enabled for the instance. The default value is false.

public_sasl_plaintext_enable (Boolean): Whether to enable public ciphertext access using the SASL_PLAINTEXT security protocol. Options:
  • true: enable public ciphertext access using SASL_PLAINTEXT. Connection address: ip:9095; access protocol: SASL_PLAINTEXT.
  • false: disable public ciphertext access using SASL_PLAINTEXT.
public_sasl_plaintext_enable and public_sasl_ssl_enable cannot both be true. When this parameter is true, public access must be enabled for the instance. The default value is false.
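
For illustration only, the following hypothetical fragment shows how a port_protocol object might be set in the request body to enable only intranet ciphertext access over SASL_SSL. The rest of the request body (including ssl_enable, access_user, password, and sasl_enabled_mechanisms when SASL is used) is the same as in Example Requests, and any combination you choose must satisfy the constraints in Table 2 and Table 3.

    "port_protocol" : {
      "private_plain_enable" : false,
      "private_sasl_ssl_enable" : true,
      "private_sasl_plaintext_enable" : false,
      "public_plain_enable" : false,
      "public_sasl_ssl_enable" : false,
      "public_sasl_plaintext_enable" : false
    }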

Table 4 TagEntity

key (String): Tag key.
  • It cannot be empty.
  • It must be unique for the same instance.
  • It consists of 1 to 128 characters (Chinese characters also count, up to 128).
  • It can contain letters of any language, digits, spaces, and the special characters _ . : = + - @
  • It cannot start with _sys_.
  • It cannot start or end with a space.

value (String): Tag value.
  • It consists of 0 to 255 characters (Chinese characters also count, up to 255).
  • It can contain letters of any language, digits, spaces, and the special characters _ . : = + - @

Table 5 BssParam

is_auto_renew (Boolean): Whether auto-renewal is enabled. Options:
  • true: auto-renewal is enabled.
  • false: auto-renewal is disabled.
Auto-renewal is disabled by default.

charging_mode (String): Billing mode. Options:
  • prePaid: yearly/monthly billing.
  • postPaid: pay-per-use billing.
The default value is postPaid.

is_auto_pay (Boolean): Whether the order is automatically paid from the customer's account after it is placed, without requiring manual payment. Options:
  • true: automatic payment.
  • false: manual payment.
Manual payment is used by default.

period_type (String): Subscription period type. Options:
  • month
  • year
This parameter takes effect and is mandatory when chargingMode is prePaid.

period_num (Integer): Number of subscription periods. Options:
  • When periodType is month, the value ranges from 1 to 9.
  • When periodType is year, the value ranges from 1 to 3.
This parameter takes effect and is mandatory when chargingMode is prePaid.

Response Parameters

Status code: 200

Table 6 Response body parameters

instance_id (String): Instance ID.

Example Requests

  • Creating a pay-per-use Kafka instance with version 2.7, flavor 2U4G*3, and 300 GB of storage space.

    POST https://{endpoint}/v2/{engine}/{project_id}/instances
    
    {
      "name" : "kafka-test",
      "description" : "",
      "engine" : "kafka",
      "engine_version" : "2.7",
      "storage_space" : 300,
      "vpc_id" : "********-9b4a-44c5-a964-************",
      "subnet_id" : "********-8fbf-4438-ba71-************",
      "security_group_id" : "********-e073-4aad-991f-************",
      "available_zones" : [ "********706d4c1fb0eb72f0********" ],
      "product_id" : "c6.2u4g.cluster",
      "ssl_enable" : true,
      "kafka_security_protocol" : "SASL_SSL",
      "sasl_enabled_mechanisms" : [ "SCRAM-SHA-512" ],
      "storage_spec_code" : "dms.physical.storage.ultra.v2",
      "broker_num" : 3,
      "arch_type" : "X86",
      "enterprise_project_id" : "0",
      "access_user" : "********",
      "password" : "********",
      "enable_publicip" : true,
      "tags" : [ {
        "key" : "aaa",
        "value" : "111"
      } ],
      "retention_policy" : "time_base",
      "disk_encrypted_enable" : true,
      "disk_encrypted_key" : "********-b953-4875-a743-************",
      "publicip_id" : "********-88fc-4a8c-86d0-************,********-16af-455d-8d54-************,********-3d69-4367-95ab-************",
      "vpc_client_plain" : true,
      "enable_auto_topic" : true,
      "tenant_ips" : [ "127.xx.xx.x", "127.xx.xx.x", "127.xx.xx.x" ]
    }
  • Creating a yearly/monthly Kafka instance with version 2.7, flavor 2U4G*3, and 300 GB of storage space.

    POST https://{endpoint}/v2/{engine}/{project_id}/instances
    
    {
      "name" : "kafka-test1",
      "description" : "",
      "engine" : "kafka",
      "engine_version" : "2.7",
      "storage_space" : 300,
      "vpc_id" : "********-9b4a-44c5-a964-************",
      "subnet_id" : "********-8fbf-4438-ba71-************",
      "security_group_id" : "********-e073-4aad-991f-************",
      "available_zones" : [ "********706d4c1fb0eb72f0********" ],
      "product_id" : "c6.2u4g.cluster",
      "ssl_enable" : true,
      "kafka_security_protocol" : "SASL_SSL",
      "sasl_enabled_mechanisms" : [ "SCRAM-SHA-512" ],
      "storage_spec_code" : "dms.physical.storage.ultra.v2",
      "broker_num" : 3,
      "arch_type" : "X86",
      "enterprise_project_id" : "0",
      "access_user" : "********",
      "password" : "********",
      "enable_publicip" : true,
      "tags" : [ {
        "key" : "aaa",
        "value" : "111"
      } ],
      "retention_policy" : "time_base",
      "publicip_id" : "********-88fc-4a8c-86d0-************,********-16af-455d-8d54-************,********-3d69-4367-95ab-************",
      "vpc_client_plain" : true,
      "enable_auto_topic" : true,
      "bss_param" : {
        "charging_mode" : "prePaid",
        "period_type" : "month",
        "period_num" : 1,
        "is_auto_pay" : true
      },
      "tenant_ips" : [ "127.xx.xx.x", "127.xx.xx.x", "127.xx.xx.x" ]
    }

Example Responses

Status code: 200

The instance is created successfully.

{
  "instance_id" : "8959ab1c-7n1a-yyb1-a05t-93dfc361b32d"
}
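
The returned instance_id can be used to track provisioning until the instance is ready. The following is a minimal Python sketch that assumes the ShowInstance query API of the same huaweicloudsdkkafka SDK is available; wait_until_created is a hypothetical helper, client is constructed as in the SDK samples below, and the status strings are indicative only.

    import time
    from huaweicloudsdkkafka.v2 import ShowInstanceRequest

    def wait_until_created(client, instance_id, interval=30, max_attempts=40):
        # Poll the instance detail until it leaves the creating state (hypothetical helper).
        for _ in range(max_attempts):
            detail = client.show_instance(ShowInstanceRequest(instance_id=instance_id))
            if detail.status != "CREATING":      # e.g. RUNNING once provisioning finishes
                return detail.status
            time.sleep(interval)
        raise TimeoutError("instance is still being created")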

SDK Sample Code

The following are SDK sample code snippets.

  • Creating a pay-per-use Kafka instance with version 2.7, flavor 2U4G*3, and 300 GB of storage space. (Java)

    package com.huaweicloud.sdk.test;
    
    import com.huaweicloud.sdk.core.auth.ICredential;
    import com.huaweicloud.sdk.core.auth.BasicCredentials;
    import com.huaweicloud.sdk.core.exception.ConnectionException;
    import com.huaweicloud.sdk.core.exception.RequestTimeoutException;
    import com.huaweicloud.sdk.core.exception.ServiceResponseException;
    import com.huaweicloud.sdk.kafka.v2.region.KafkaRegion;
    import com.huaweicloud.sdk.kafka.v2.*;
    import com.huaweicloud.sdk.kafka.v2.model.*;
    
    import java.util.List;
    import java.util.ArrayList;
    
    public class CreatePostPaidKafkaInstanceSolution {
    
        public static void main(String[] args) {
            // The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
            // In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
            String ak = System.getenv("CLOUD_SDK_AK");
            String sk = System.getenv("CLOUD_SDK_SK");
            String projectId = "{project_id}";
    
            ICredential auth = new BasicCredentials()
                    .withProjectId(projectId)
                    .withAk(ak)
                    .withSk(sk);
    
            KafkaClient client = KafkaClient.newBuilder()
                    .withCredential(auth)
                    .withRegion(KafkaRegion.valueOf("<YOUR REGION>"))
                    .build();
            CreatePostPaidKafkaInstanceRequest request = new CreatePostPaidKafkaInstanceRequest();
            CreateInstanceByEngineReq body = new CreateInstanceByEngineReq();
            List<TagEntity> listbodyTags = new ArrayList<>();
            listbodyTags.add(
                new TagEntity()
                    .withKey("aaa")
                    .withValue("111")
            );
            List<CreateInstanceByEngineReq.SaslEnabledMechanismsEnum> listbodySaslEnabledMechanisms = new ArrayList<>();
            listbodySaslEnabledMechanisms.add(CreateInstanceByEngineReq.SaslEnabledMechanismsEnum.fromValue("SCRAM-SHA-512"));
            List<String> listbodyTenantIps = new ArrayList<>();
            listbodyTenantIps.add("127.xx.xx.x");
            listbodyTenantIps.add("127.xx.xx.x");
            listbodyTenantIps.add("127.xx.xx.x");
            List<String> listbodyAvailableZones = new ArrayList<>();
            listbodyAvailableZones.add("********706d4c1fb0eb72f0********");
            body.withVpcClientPlain(true);
            body.withArchType("X86");
            body.withTags(listbodyTags);
            body.withEnterpriseProjectId("0");
            body.withStorageSpecCode(CreateInstanceByEngineReq.StorageSpecCodeEnum.fromValue("dms.physical.storage.ultra.v2"));
            body.withEnableAutoTopic(true);
            body.withDiskEncryptedKey("********-b953-4875-a743-************");
            body.withDiskEncryptedEnable(true);
            body.withRetentionPolicy(CreateInstanceByEngineReq.RetentionPolicyEnum.fromValue("time_base"));
            body.withSaslEnabledMechanisms(listbodySaslEnabledMechanisms);
            body.withKafkaSecurityProtocol("SASL_SSL");
            body.withSslEnable(true);
            body.withPublicipId("********-88fc-4a8c-86d0-************,********-16af-455d-8d54-************,********-3d69-4367-95ab-************");
            body.withTenantIps(listbodyTenantIps);
            body.withEnablePublicip(true);
            body.withProductId("c6.2u4g.cluster");
            body.withAvailableZones(listbodyAvailableZones);
            body.withSubnetId("********-8fbf-4438-ba71-************");
            body.withSecurityGroupId("********-e073-4aad-991f-************");
            body.withVpcId("********-9b4a-44c5-a964-************");
            body.withPassword("********");
            body.withAccessUser("********");
            body.withStorageSpace(300);
            body.withBrokerNum(3);
            body.withEngineVersion("2.7");
            body.withEngine(CreateInstanceByEngineReq.EngineEnum.fromValue("kafka"));
            body.withDescription("");
            body.withName("kafka-test");
            request.withBody(body);
            try {
                CreatePostPaidKafkaInstanceResponse response = client.createPostPaidKafkaInstance(request);
                System.out.println(response.toString());
            } catch (ConnectionException e) {
                e.printStackTrace();
            } catch (RequestTimeoutException e) {
                e.printStackTrace();
            } catch (ServiceResponseException e) {
                e.printStackTrace();
                System.out.println(e.getHttpStatusCode());
                System.out.println(e.getRequestId());
                System.out.println(e.getErrorCode());
                System.out.println(e.getErrorMsg());
            }
        }
    }
    
  • Creating a yearly/monthly Kafka instance with version 2.7, flavor 2U4G*3, and 300 GB of storage space. (Java)

    package com.huaweicloud.sdk.test;
    
    import com.huaweicloud.sdk.core.auth.ICredential;
    import com.huaweicloud.sdk.core.auth.BasicCredentials;
    import com.huaweicloud.sdk.core.exception.ConnectionException;
    import com.huaweicloud.sdk.core.exception.RequestTimeoutException;
    import com.huaweicloud.sdk.core.exception.ServiceResponseException;
    import com.huaweicloud.sdk.kafka.v2.region.KafkaRegion;
    import com.huaweicloud.sdk.kafka.v2.*;
    import com.huaweicloud.sdk.kafka.v2.model.*;
    
    import java.util.List;
    import java.util.ArrayList;
    
    public class CreatePostPaidKafkaInstanceSolution {
    
        public static void main(String[] args) {
            // The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
            // In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
            String ak = System.getenv("CLOUD_SDK_AK");
            String sk = System.getenv("CLOUD_SDK_SK");
            String projectId = "{project_id}";
    
            ICredential auth = new BasicCredentials()
                    .withProjectId(projectId)
                    .withAk(ak)
                    .withSk(sk);
    
            KafkaClient client = KafkaClient.newBuilder()
                    .withCredential(auth)
                    .withRegion(KafkaRegion.valueOf("<YOUR REGION>"))
                    .build();
            CreatePostPaidKafkaInstanceRequest request = new CreatePostPaidKafkaInstanceRequest();
            CreateInstanceByEngineReq body = new CreateInstanceByEngineReq();
            BssParam bssParambody = new BssParam();
            bssParambody.withChargingMode(BssParam.ChargingModeEnum.fromValue("prePaid"))
                .withIsAutoPay(true)
                .withPeriodType(BssParam.PeriodTypeEnum.fromValue("month"))
                .withPeriodNum(1);
            List<TagEntity> listbodyTags = new ArrayList<>();
            listbodyTags.add(
                new TagEntity()
                    .withKey("aaa")
                    .withValue("111")
            );
            List<CreateInstanceByEngineReq.SaslEnabledMechanismsEnum> listbodySaslEnabledMechanisms = new ArrayList<>();
            listbodySaslEnabledMechanisms.add(CreateInstanceByEngineReq.SaslEnabledMechanismsEnum.fromValue("SCRAM-SHA-512"));
            List<String> listbodyTenantIps = new ArrayList<>();
            listbodyTenantIps.add("127.xx.xx.x");
            listbodyTenantIps.add("127.xx.xx.x");
            listbodyTenantIps.add("127.xx.xx.x");
            List<String> listbodyAvailableZones = new ArrayList<>();
            listbodyAvailableZones.add("********706d4c1fb0eb72f0********");
            body.withBssParam(bssParambody);
            body.withVpcClientPlain(true);
            body.withArchType("X86");
            body.withTags(listbodyTags);
            body.withEnterpriseProjectId("0");
            body.withStorageSpecCode(CreateInstanceByEngineReq.StorageSpecCodeEnum.fromValue("dms.physical.storage.ultra.v2"));
            body.withEnableAutoTopic(true);
            body.withRetentionPolicy(CreateInstanceByEngineReq.RetentionPolicyEnum.fromValue("time_base"));
            body.withSaslEnabledMechanisms(listbodySaslEnabledMechanisms);
            body.withKafkaSecurityProtocol("SASL_SSL");
            body.withSslEnable(true);
            body.withPublicipId("********-88fc-4a8c-86d0-************,********-16af-455d-8d54-************,********-3d69-4367-95ab-************");
            body.withTenantIps(listbodyTenantIps);
            body.withEnablePublicip(true);
            body.withProductId("c6.2u4g.cluster");
            body.withAvailableZones(listbodyAvailableZones);
            body.withSubnetId("********-8fbf-4438-ba71-************");
            body.withSecurityGroupId("********-e073-4aad-991f-************");
            body.withVpcId("********-9b4a-44c5-a964-************");
            body.withPassword("********");
            body.withAccessUser("********");
            body.withStorageSpace(300);
            body.withBrokerNum(3);
            body.withEngineVersion("2.7");
            body.withEngine(CreateInstanceByEngineReq.EngineEnum.fromValue("kafka"));
            body.withDescription("");
            body.withName("kafka-test1");
            request.withBody(body);
            try {
                CreatePostPaidKafkaInstanceResponse response = client.createPostPaidKafkaInstance(request);
                System.out.println(response.toString());
            } catch (ConnectionException e) {
                e.printStackTrace();
            } catch (RequestTimeoutException e) {
                e.printStackTrace();
            } catch (ServiceResponseException e) {
                e.printStackTrace();
                System.out.println(e.getHttpStatusCode());
                System.out.println(e.getRequestId());
                System.out.println(e.getErrorCode());
                System.out.println(e.getErrorMsg());
            }
        }
    }
    
  • Creating a pay-per-use Kafka instance with version 2.7, flavor 2U4G*3, and 300 GB of storage space. (Python)

    # coding: utf-8
    
    import os
    from huaweicloudsdkcore.auth.credentials import BasicCredentials
    from huaweicloudsdkkafka.v2.region.kafka_region import KafkaRegion
    from huaweicloudsdkcore.exceptions import exceptions
    from huaweicloudsdkkafka.v2 import *
    
    if __name__ == "__main__":
        # The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
        # In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
        ak = os.environ["CLOUD_SDK_AK"]
        sk = os.environ["CLOUD_SDK_SK"]
        projectId = "{project_id}"
    
        credentials = BasicCredentials(ak, sk, projectId)
    
        client = KafkaClient.new_builder() \
            .with_credentials(credentials) \
            .with_region(KafkaRegion.value_of("<YOUR REGION>")) \
            .build()
    
        try:
            request = CreatePostPaidKafkaInstanceRequest()
            listTagsbody = [
                TagEntity(
                    key="aaa",
                    value="111"
                )
            ]
            listSaslEnabledMechanismsbody = [
                "SCRAM-SHA-512"
            ]
            listTenantIpsbody = [
                "127.xx.xx.x",
                "127.xx.xx.x",
                "127.xx.xx.x"
            ]
            listAvailableZonesbody = [
                "********706d4c1fb0eb72f0********"
            ]
            request.body = CreateInstanceByEngineReq(
                vpc_client_plain=True,
                arch_type="X86",
                tags=listTagsbody,
                enterprise_project_id="0",
                storage_spec_code="dms.physical.storage.ultra.v2",
                enable_auto_topic=True,
                disk_encrypted_key="********-b953-4875-a743-************",
                disk_encrypted_enable=True,
                retention_policy="time_base",
                sasl_enabled_mechanisms=listSaslEnabledMechanismsbody,
                kafka_security_protocol="SASL_SSL",
                ssl_enable=True,
                publicip_id="********-88fc-4a8c-86d0-************,********-16af-455d-8d54-************,********-3d69-4367-95ab-************",
                tenant_ips=listTenantIpsbody,
                enable_publicip=True,
                product_id="c6.2u4g.cluster",
                available_zones=listAvailableZonesbody,
                subnet_id="********-8fbf-4438-ba71-************",
                security_group_id="********-e073-4aad-991f-************",
                vpc_id="********-9b4a-44c5-a964-************",
                password="********",
                access_user="********",
                storage_space=300,
                broker_num=3,
                engine_version="2.7",
                engine="kafka",
                description="",
                name="kafka-test"
            )
            response = client.create_post_paid_kafka_instance(request)
            print(response)
        except exceptions.ClientRequestException as e:
            print(e.status_code)
            print(e.request_id)
            print(e.error_code)
            print(e.error_msg)
    
  • Creating a yearly/monthly Kafka instance with version 2.7, flavor 2U4G*3, and 300 GB of storage space. (Python)

    # coding: utf-8
    
    import os
    from huaweicloudsdkcore.auth.credentials import BasicCredentials
    from huaweicloudsdkkafka.v2.region.kafka_region import KafkaRegion
    from huaweicloudsdkcore.exceptions import exceptions
    from huaweicloudsdkkafka.v2 import *
    
    if __name__ == "__main__":
        # The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
        # In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
        ak = os.environ["CLOUD_SDK_AK"]
        sk = os.environ["CLOUD_SDK_SK"]
        projectId = "{project_id}"
    
        credentials = BasicCredentials(ak, sk, projectId)
    
        client = KafkaClient.new_builder() \
            .with_credentials(credentials) \
            .with_region(KafkaRegion.value_of("<YOUR REGION>")) \
            .build()
    
        try:
            request = CreatePostPaidKafkaInstanceRequest()
            bssParambody = BssParam(
                charging_mode="prePaid",
                is_auto_pay=True,
                period_type="month",
                period_num=1
            )
            listTagsbody = [
                TagEntity(
                    key="aaa",
                    value="111"
                )
            ]
            listSaslEnabledMechanismsbody = [
                "SCRAM-SHA-512"
            ]
            listTenantIpsbody = [
                "127.xx.xx.x",
                "127.xx.xx.x",
                "127.xx.xx.x"
            ]
            listAvailableZonesbody = [
                "********706d4c1fb0eb72f0********"
            ]
            request.body = CreateInstanceByEngineReq(
                bss_param=bssParambody,
                vpc_client_plain=True,
                arch_type="X86",
                tags=listTagsbody,
                enterprise_project_id="0",
                storage_spec_code="dms.physical.storage.ultra.v2",
                enable_auto_topic=True,
                retention_policy="time_base",
                sasl_enabled_mechanisms=listSaslEnabledMechanismsbody,
                kafka_security_protocol="SASL_SSL",
                ssl_enable=True,
                publicip_id="********-88fc-4a8c-86d0-************,********-16af-455d-8d54-************,********-3d69-4367-95ab-************",
                tenant_ips=listTenantIpsbody,
                enable_publicip=True,
                product_id="c6.2u4g.cluster",
                available_zones=listAvailableZonesbody,
                subnet_id="********-8fbf-4438-ba71-************",
                security_group_id="********-e073-4aad-991f-************",
                vpc_id="********-9b4a-44c5-a964-************",
                password="********",
                access_user="********",
                storage_space=300,
                broker_num=3,
                engine_version="2.7",
                engine="kafka",
                description="",
                name="kafka-test1"
            )
            response = client.create_post_paid_kafka_instance(request)
            print(response)
        except exceptions.ClientRequestException as e:
            print(e.status_code)
            print(e.request_id)
            print(e.error_code)
            print(e.error_msg)
    
  • Creating a pay-per-use Kafka instance with version 2.7, flavor 2U4G*3, and 300 GB of storage space. (Go)

    package main
    
    import (
    	"fmt"
    	"os"
    	"github.com/huaweicloud/huaweicloud-sdk-go-v3/core/auth/basic"
    	kafka "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2"
    	"github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/model"
    	region "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/region"
    )
    
    func main() {
        // The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
        // In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
        ak := os.Getenv("CLOUD_SDK_AK")
        sk := os.Getenv("CLOUD_SDK_SK")
        projectId := "{project_id}"
    
        auth := basic.NewCredentialsBuilder().
            WithAk(ak).
            WithSk(sk).
            WithProjectId(projectId).
            Build()
    
        client := kafka.NewKafkaClient(
            kafka.KafkaClientBuilder().
                WithRegion(region.ValueOf("<YOUR REGION>")).
                WithCredential(auth).
                Build())
    
        request := &model.CreatePostPaidKafkaInstanceRequest{}
    	keyTags:= "aaa"
    	valueTags:= "111"
    	var listTagsbody = []model.TagEntity{
            {
                Key: &keyTags,
                Value: &valueTags,
            },
        }
    	var listSaslEnabledMechanismsbody = []model.CreateInstanceByEngineReqSaslEnabledMechanisms{
            model.GetCreateInstanceByEngineReqSaslEnabledMechanismsEnum().SCRAM_SHA_512,
        }
    	var listTenantIpsbody = []string{
            "127.xx.xx.x",
    	    "127.xx.xx.x",
    	    "127.xx.xx.x",
        }
    	var listAvailableZonesbody = []string{
            "********706d4c1fb0eb72f0********",
        }
    	vpcClientPlainCreateInstanceByEngineReq:= true
    	archTypeCreateInstanceByEngineReq:= "X86"
    	enterpriseProjectIdCreateInstanceByEngineReq:= "0"
    	enableAutoTopicCreateInstanceByEngineReq:= true
    	diskEncryptedKeyCreateInstanceByEngineReq:= "********-b953-4875-a743-************"
    	diskEncryptedEnableCreateInstanceByEngineReq:= true
    	retentionPolicyCreateInstanceByEngineReq:= model.GetCreateInstanceByEngineReqRetentionPolicyEnum().TIME_BASE
    	kafkaSecurityProtocolCreateInstanceByEngineReq:= "SASL_SSL"
    	sslEnableCreateInstanceByEngineReq:= true
    	publicipIdCreateInstanceByEngineReq:= "********-88fc-4a8c-86d0-************,********-16af-455d-8d54-************,********-3d69-4367-95ab-************"
    	enablePublicipCreateInstanceByEngineReq:= true
    	passwordCreateInstanceByEngineReq:= "********"
    	accessUserCreateInstanceByEngineReq:= "********"
    	descriptionCreateInstanceByEngineReq:= ""
    	request.Body = &model.CreateInstanceByEngineReq{
    		VpcClientPlain: &vpcClientPlainCreateInstanceByEngineReq,
    		ArchType: &archTypeCreateInstanceByEngineReq,
    		Tags: &listTagsbody,
    		EnterpriseProjectId: &enterpriseProjectIdCreateInstanceByEngineReq,
    		StorageSpecCode: model.GetCreateInstanceByEngineReqStorageSpecCodeEnum().DMS_PHYSICAL_STORAGE_ULTRA,
    		EnableAutoTopic: &enableAutoTopicCreateInstanceByEngineReq,
    		DiskEncryptedKey: &diskEncryptedKeyCreateInstanceByEngineReq,
    		DiskEncryptedEnable: &diskEncryptedEnableCreateInstanceByEngineReq,
    		RetentionPolicy: &retentionPolicyCreateInstanceByEngineReq,
    		SaslEnabledMechanisms: &listSaslEnabledMechanismsbody,
    		KafkaSecurityProtocol: &kafkaSecurityProtocolCreateInstanceByEngineReq,
    		SslEnable: &sslEnableCreateInstanceByEngineReq,
    		PublicipId: &publicipIdCreateInstanceByEngineReq,
    		TenantIps: &listTenantIpsbody,
    		EnablePublicip: &enablePublicipCreateInstanceByEngineReq,
    		ProductId: "c6.2u4g.cluster",
    		AvailableZones: listAvailableZonesbody,
    		SubnetId: "********-8fbf-4438-ba71-************",
    		SecurityGroupId: "********-e073-4aad-991f-************",
    		VpcId: "********-9b4a-44c5-a964-************",
    		Password: &passwordCreateInstanceByEngineReq,
    		AccessUser: &accessUserCreateInstanceByEngineReq,
    		StorageSpace: int32(300),
    		BrokerNum: int32(3),
    		EngineVersion: "2.7",
    		Engine: model.GetCreateInstanceByEngineReqEngineEnum().KAFKA,
    		Description: &descriptionCreateInstanceByEngineReq,
    		Name: "kafka-test",
    	}
    	response, err := client.CreatePostPaidKafkaInstance(request)
    	if err == nil {
            fmt.Printf("%+v\n", response)
        } else {
            fmt.Println(err)
        }
    }
    
  • Creating a yearly/monthly Kafka instance with version 2.7, flavor 2U4G*3, and 300 GB of storage space. (Go)

    package main
    
    import (
    	"fmt"
    	"os"
    	"github.com/huaweicloud/huaweicloud-sdk-go-v3/core/auth/basic"
    	kafka "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2"
    	"github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/model"
    	region "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/region"
    )
    
    func main() {
        // The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
        // In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
        ak := os.Getenv("CLOUD_SDK_AK")
        sk := os.Getenv("CLOUD_SDK_SK")
        projectId := "{project_id}"
    
        auth := basic.NewCredentialsBuilder().
            WithAk(ak).
            WithSk(sk).
            WithProjectId(projectId).
            Build()
    
        client := kafka.NewKafkaClient(
            kafka.KafkaClientBuilder().
                WithRegion(region.ValueOf("<YOUR REGION>")).
                WithCredential(auth).
                Build())
    
        request := &model.CreatePostPaidKafkaInstanceRequest{}
    	chargingModeBssParam:= model.GetBssParamChargingModeEnum().PRE_PAID
    	isAutoPayBssParam:= true
    	periodTypeBssParam:= model.GetBssParamPeriodTypeEnum().MONTH
    	periodNumBssParam:= int32(1)
    	bssParambody := &model.BssParam{
    		ChargingMode: &chargingModeBssParam,
    		IsAutoPay: &isAutoPayBssParam,
    		PeriodType: &periodTypeBssParam,
    		PeriodNum: &periodNumBssParam,
    	}
    	keyTags:= "aaa"
    	valueTags:= "111"
    	var listTagsbody = []model.TagEntity{
            {
                Key: &keyTags,
                Value: &valueTags,
            },
        }
    	var listSaslEnabledMechanismsbody = []model.CreateInstanceByEngineReqSaslEnabledMechanisms{
            model.GetCreateInstanceByEngineReqSaslEnabledMechanismsEnum().SCRAM_SHA_512,
        }
    	var listTenantIpsbody = []string{
            "127.xx.xx.x",
    	    "127.xx.xx.x",
    	    "127.xx.xx.x",
        }
    	var listAvailableZonesbody = []string{
            "********706d4c1fb0eb72f0********",
        }
    	vpcClientPlainCreateInstanceByEngineReq:= true
    	archTypeCreateInstanceByEngineReq:= "X86"
    	enterpriseProjectIdCreateInstanceByEngineReq:= "0"
    	enableAutoTopicCreateInstanceByEngineReq:= true
    	retentionPolicyCreateInstanceByEngineReq:= model.GetCreateInstanceByEngineReqRetentionPolicyEnum().TIME_BASE
    	kafkaSecurityProtocolCreateInstanceByEngineReq:= "SASL_SSL"
    	sslEnableCreateInstanceByEngineReq:= true
    	publicipIdCreateInstanceByEngineReq:= "********-88fc-4a8c-86d0-************,********-16af-455d-8d54-************,********-3d69-4367-95ab-************"
    	enablePublicipCreateInstanceByEngineReq:= true
    	passwordCreateInstanceByEngineReq:= "********"
    	accessUserCreateInstanceByEngineReq:= "********"
    	descriptionCreateInstanceByEngineReq:= ""
    	request.Body = &model.CreateInstanceByEngineReq{
    		BssParam: bssParambody,
    		VpcClientPlain: &vpcClientPlainCreateInstanceByEngineReq,
    		ArchType: &archTypeCreateInstanceByEngineReq,
    		Tags: &listTagsbody,
    		EnterpriseProjectId: &enterpriseProjectIdCreateInstanceByEngineReq,
    		StorageSpecCode: model.GetCreateInstanceByEngineReqStorageSpecCodeEnum().DMS_PHYSICAL_STORAGE_ULTRA,
    		EnableAutoTopic: &enableAutoTopicCreateInstanceByEngineReq,
    		RetentionPolicy: &retentionPolicyCreateInstanceByEngineReq,
    		SaslEnabledMechanisms: &listSaslEnabledMechanismsbody,
    		KafkaSecurityProtocol: &kafkaSecurityProtocolCreateInstanceByEngineReq,
    		SslEnable: &sslEnableCreateInstanceByEngineReq,
    		PublicipId: &publicipIdCreateInstanceByEngineReq,
    		TenantIps: &listTenantIpsbody,
    		EnablePublicip: &enablePublicipCreateInstanceByEngineReq,
    		ProductId: "c6.2u4g.cluster",
    		AvailableZones: listAvailableZonesbody,
    		SubnetId: "********-8fbf-4438-ba71-************",
    		SecurityGroupId: "********-e073-4aad-991f-************",
    		VpcId: "********-9b4a-44c5-a964-************",
    		Password: &passwordCreateInstanceByEngineReq,
    		AccessUser: &accessUserCreateInstanceByEngineReq,
    		StorageSpace: int32(300),
    		BrokerNum: int32(3),
    		EngineVersion: "2.7",
    		Engine: model.GetCreateInstanceByEngineReqEngineEnum().KAFKA,
    		Description: &descriptionCreateInstanceByEngineReq,
    		Name: "kafka-test1",
    	}
    	response, err := client.CreatePostPaidKafkaInstance(request)
    	if err == nil {
            fmt.Printf("%+v\n", response)
        } else {
            fmt.Println(err)
        }
    }
    

For SDK sample code in more programming languages, see the Sample Code tab of API Explorer, which can automatically generate the corresponding SDK sample code.

  • 创建一个按需付费的Kafka实例,版本为2.7,规格为2U4G*3,300GB的存储空间。

     1
     2
     3
     4
     5
     6
     7
     8
     9
    10
    11
    12
    13
    14
    15
    16
    17
    18
    19
    20
    21
    22
    23
    24
    25
    26
    27
    28
    29
    30
    31
    32
    33
    34
    35
    36
    37
    38
    39
    40
    41
    42
    43
    44
    45
    46
    47
    48
    49
    50
    51
    52
    53
    54
    55
    56
    57
    58
    59
    60
    61
    62
    63
    64
    65
    66
    67
    68
    69
    70
    71
    72
    73
    74
    75
    76
    77
    78
    79
    80
    81
    82
    83
    84
    85
    86
    87
    88
    89
    90
    91
    92
    93
    package com.huaweicloud.sdk.test;
    
    import com.huaweicloud.sdk.core.auth.ICredential;
    import com.huaweicloud.sdk.core.auth.BasicCredentials;
    import com.huaweicloud.sdk.core.exception.ConnectionException;
    import com.huaweicloud.sdk.core.exception.RequestTimeoutException;
    import com.huaweicloud.sdk.core.exception.ServiceResponseException;
    import com.huaweicloud.sdk.kafka.v2.region.KafkaRegion;
    import com.huaweicloud.sdk.kafka.v2.*;
    import com.huaweicloud.sdk.kafka.v2.model.*;
    
    import java.util.List;
    import java.util.ArrayList;
    
    public class CreatePostPaidKafkaInstanceSolution {
    
        public static void main(String[] args) {
            // The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
            // In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
            String ak = System.getenv("CLOUD_SDK_AK");
            String sk = System.getenv("CLOUD_SDK_SK");
            String projectId = "{project_id}";
    
            ICredential auth = new BasicCredentials()
                    .withProjectId(projectId)
                    .withAk(ak)
                    .withSk(sk);
    
            KafkaClient client = KafkaClient.newBuilder()
                    .withCredential(auth)
                    .withRegion(KafkaRegion.valueOf("<YOUR REGION>"))
                    .build();
            CreatePostPaidKafkaInstanceRequest request = new CreatePostPaidKafkaInstanceRequest();
            CreateInstanceByEngineReq body = new CreateInstanceByEngineReq();
            List<TagEntity> listbodyTags = new ArrayList<>();
            listbodyTags.add(
                new TagEntity()
                    .withKey("aaa")
                    .withValue("111")
            );
            List<CreateInstanceByEngineReq.SaslEnabledMechanismsEnum> listbodySaslEnabledMechanisms = new ArrayList<>();
            listbodySaslEnabledMechanisms.add(CreateInstanceByEngineReq.SaslEnabledMechanismsEnum.fromValue("SCRAM-SHA-512"));
            List<String> listbodyTenantIps = new ArrayList<>();
            listbodyTenantIps.add("127.xx.xx.x");
            listbodyTenantIps.add("127.xx.xx.x");
            listbodyTenantIps.add("127.xx.xx.x");
            List<String> listbodyAvailableZones = new ArrayList<>();
            listbodyAvailableZones.add("********706d4c1fb0eb72f0********");
            body.withVpcClientPlain(true);
            body.withArchType("X86");
            body.withTags(listbodyTags);
            body.withEnterpriseProjectId("0");
            body.withStorageSpecCode(CreateInstanceByEngineReq.StorageSpecCodeEnum.fromValue("dms.physical.storage.ultra.v2"));
            body.withEnableAutoTopic(true);
            body.withDiskEncryptedKey("********-b953-4875-a743-************");
            body.withDiskEncryptedEnable(true);
            body.withRetentionPolicy(CreateInstanceByEngineReq.RetentionPolicyEnum.fromValue("time_base"));
            body.withSaslEnabledMechanisms(listbodySaslEnabledMechanisms);
            body.withKafkaSecurityProtocol("SASL_SSL");
            body.withSslEnable(true);
            body.withPublicipId("********-88fc-4a8c-86d0-************,********-16af-455d-8d54-************,********-3d69-4367-95ab-************");
            body.withTenantIps(listbodyTenantIps);
            body.withEnablePublicip(true);
            body.withProductId("c6.2u4g.cluster");
            body.withAvailableZones(listbodyAvailableZones);
            body.withSubnetId("********-8fbf-4438-ba71-************");
            body.withSecurityGroupId("********-e073-4aad-991f-************");
            body.withVpcId("********-9b4a-44c5-a964-************");
            body.withPassword("********");
            body.withAccessUser("********");
            body.withStorageSpace(300);
            body.withBrokerNum(3);
            body.withEngineVersion("2.7");
            body.withEngine(CreateInstanceByEngineReq.EngineEnum.fromValue("kafka"));
            body.withDescription("");
            body.withName("kafka-test");
            request.withBody(body);
            try {
                CreatePostPaidKafkaInstanceResponse response = client.createPostPaidKafkaInstance(request);
                System.out.println(response.toString());
            } catch (ConnectionException e) {
                e.printStackTrace();
            } catch (RequestTimeoutException e) {
                e.printStackTrace();
            } catch (ServiceResponseException e) {
                e.printStackTrace();
                System.out.println(e.getHttpStatusCode());
                System.out.println(e.getRequestId());
                System.out.println(e.getErrorCode());
                System.out.println(e.getErrorMsg());
            }
        }
    }
    
  • 创建一个包年包月的Kafka实例,版本为2.7,规格为2U4G*3,300GB的存储空间。

     1
     2
     3
     4
     5
     6
     7
     8
     9
    10
    11
    12
    13
    14
    15
    16
    17
    18
    19
    20
    21
    22
    23
    24
    25
    26
    27
    28
    29
    30
    31
    32
    33
    34
    35
    36
    37
    38
    39
    40
    41
    42
    43
    44
    45
    46
    47
    48
    49
    50
    51
    52
    53
    54
    55
    56
    57
    58
    59
    60
    61
    62
    63
    64
    65
    66
    67
    68
    69
    70
    71
    72
    73
    74
    75
    76
    77
    78
    79
    80
    81
    82
    83
    84
    85
    86
    87
    88
    89
    90
    91
    92
    93
    94
    95
    96
    97
    package com.huaweicloud.sdk.test;
    
    import com.huaweicloud.sdk.core.auth.ICredential;
    import com.huaweicloud.sdk.core.auth.BasicCredentials;
    import com.huaweicloud.sdk.core.exception.ConnectionException;
    import com.huaweicloud.sdk.core.exception.RequestTimeoutException;
    import com.huaweicloud.sdk.core.exception.ServiceResponseException;
    import com.huaweicloud.sdk.kafka.v2.region.KafkaRegion;
    import com.huaweicloud.sdk.kafka.v2.*;
    import com.huaweicloud.sdk.kafka.v2.model.*;
    
    import java.util.List;
    import java.util.ArrayList;
    
    public class CreatePostPaidKafkaInstanceSolution {
    
        public static void main(String[] args) {
            // The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
            // In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
            String ak = System.getenv("CLOUD_SDK_AK");
            String sk = System.getenv("CLOUD_SDK_SK");
            String projectId = "{project_id}";
    
            ICredential auth = new BasicCredentials()
                    .withProjectId(projectId)
                    .withAk(ak)
                    .withSk(sk);
    
            KafkaClient client = KafkaClient.newBuilder()
                    .withCredential(auth)
                    .withRegion(KafkaRegion.valueOf("<YOUR REGION>"))
                    .build();
            CreatePostPaidKafkaInstanceRequest request = new CreatePostPaidKafkaInstanceRequest();
            CreateInstanceByEngineReq body = new CreateInstanceByEngineReq();
            BssParam bssParambody = new BssParam();
            bssParambody.withChargingMode(BssParam.ChargingModeEnum.fromValue("prePaid"))
                .withIsAutoPay(true)
                .withPeriodType(BssParam.PeriodTypeEnum.fromValue("month"))
                .withPeriodNum(1);
            List<TagEntity> listbodyTags = new ArrayList<>();
            listbodyTags.add(
                new TagEntity()
                    .withKey("aaa")
                    .withValue("111")
            );
            List<CreateInstanceByEngineReq.SaslEnabledMechanismsEnum> listbodySaslEnabledMechanisms = new ArrayList<>();
            listbodySaslEnabledMechanisms.add(CreateInstanceByEngineReq.SaslEnabledMechanismsEnum.fromValue("SCRAM-SHA-512"));
            List<String> listbodyTenantIps = new ArrayList<>();
            listbodyTenantIps.add("127.xx.xx.x");
            listbodyTenantIps.add("127.xx.xx.x");
            listbodyTenantIps.add("127.xx.xx.x");
            List<String> listbodyAvailableZones = new ArrayList<>();
            listbodyAvailableZones.add("********706d4c1fb0eb72f0********");
            body.withBssParam(bssParambody);
            body.withVpcClientPlain(true);
            body.withArchType("X86");
            body.withTags(listbodyTags);
            body.withEnterpriseProjectId("0");
            body.withStorageSpecCode(CreateInstanceByEngineReq.StorageSpecCodeEnum.fromValue("dms.physical.storage.ultra.v2"));
            body.withEnableAutoTopic(true);
            body.withRetentionPolicy(CreateInstanceByEngineReq.RetentionPolicyEnum.fromValue("time_base"));
            body.withSaslEnabledMechanisms(listbodySaslEnabledMechanisms);
            body.withKafkaSecurityProtocol("SASL_SSL");
            body.withSslEnable(true);
            body.withPublicipId("********-88fc-4a8c-86d0-************,********-16af-455d-8d54-************,********-3d69-4367-95ab-************");
            body.withTenantIps(listbodyTenantIps);
            body.withEnablePublicip(true);
            body.withProductId("c6.2u4g.cluster");
            body.withAvailableZones(listbodyAvailableZones);
            body.withSubnetId("********-8fbf-4438-ba71-************");
            body.withSecurityGroupId("********-e073-4aad-991f-************");
            body.withVpcId("********-9b4a-44c5-a964-************");
            body.withPassword("********");
            body.withAccessUser("********");
            body.withStorageSpace(300);
            body.withBrokerNum(3);
            body.withEngineVersion("2.7");
            body.withEngine(CreateInstanceByEngineReq.EngineEnum.fromValue("kafka"));
            body.withDescription("");
            body.withName("kafka-test1");
            request.withBody(body);
            try {
                CreatePostPaidKafkaInstanceResponse response = client.createPostPaidKafkaInstance(request);
                System.out.println(response.toString());
            } catch (ConnectionException e) {
                e.printStackTrace();
            } catch (RequestTimeoutException e) {
                e.printStackTrace();
            } catch (ServiceResponseException e) {
                e.printStackTrace();
                System.out.println(e.getHttpStatusCode());
                System.out.println(e.getRequestId());
                System.out.println(e.getErrorCode());
                System.out.println(e.getErrorMsg());
            }
        }
    }
    
  • 创建一个按需付费的Kafka实例,版本为2.7,规格为2U4G*3,300GB的存储空间。

     1
     2
     3
     4
     5
     6
     7
     8
     9
    10
    11
    12
    13
    14
    15
    16
    17
    18
    19
    20
    21
    22
    23
    24
    25
    26
    27
    28
    29
    30
    31
    32
    33
    34
    35
    36
    37
    38
    39
    40
    41
    42
    43
    44
    45
    46
    47
    48
    49
    50
    51
    52
    53
    54
    55
    56
    57
    58
    59
    60
    61
    62
    63
    64
    65
    66
    67
    68
    69
    70
    71
    72
    73
    74
    75
    76
    77
    78
    # coding: utf-8
    
    import os
    from huaweicloudsdkcore.auth.credentials import BasicCredentials
    from huaweicloudsdkkafka.v2.region.kafka_region import KafkaRegion
    from huaweicloudsdkcore.exceptions import exceptions
    from huaweicloudsdkkafka.v2 import *
    
    if __name__ == "__main__":
        # The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
        # In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
        ak = os.environ["CLOUD_SDK_AK"]
        sk = os.environ["CLOUD_SDK_SK"]
        projectId = "{project_id}"
    
        credentials = BasicCredentials(ak, sk, projectId)
    
        client = KafkaClient.new_builder() \
            .with_credentials(credentials) \
            .with_region(KafkaRegion.value_of("<YOUR REGION>")) \
            .build()
    
        try:
            request = CreatePostPaidKafkaInstanceRequest()
            listTagsbody = [
                TagEntity(
                    key="aaa",
                    value="111"
                )
            ]
            listSaslEnabledMechanismsbody = [
                "SCRAM-SHA-512"
            ]
            listTenantIpsbody = [
                "127.xx.xx.x",
                "127.xx.xx.x",
                "127.xx.xx.x"
            ]
            listAvailableZonesbody = [
                "********706d4c1fb0eb72f0********"
            ]
            request.body = CreateInstanceByEngineReq(
                vpc_client_plain=True,
                arch_type="X86",
                tags=listTagsbody,
                enterprise_project_id="0",
                storage_spec_code="dms.physical.storage.ultra.v2",
                enable_auto_topic=True,
                disk_encrypted_key="********-b953-4875-a743-************",
                disk_encrypted_enable=True,
                retention_policy="time_base",
                sasl_enabled_mechanisms=listSaslEnabledMechanismsbody,
                kafka_security_protocol="SASL_SSL",
                ssl_enable=True,
                publicip_id="********-88fc-4a8c-86d0-************,********-16af-455d-8d54-************,********-3d69-4367-95ab-************",
                tenant_ips=listTenantIpsbody,
                enable_publicip=True,
                product_id="c6.2u4g.cluster",
                available_zones=listAvailableZonesbody,
                subnet_id="********-8fbf-4438-ba71-************",
                security_group_id="********-e073-4aad-991f-************",
                vpc_id="********-9b4a-44c5-a964-************",
                password="********",
                access_user="********",
                storage_space=300,
                broker_num=3,
                engine_version="2.7",
                engine="kafka",
                description="",
                name="kafka-test"
            )
            response = client.create_post_paid_kafka_instance(request)
            print(response)
        except exceptions.ClientRequestException as e:
            print(e.status_code)
            print(e.request_id)
            print(e.error_code)
            print(e.error_msg)
    
  • Create a yearly/monthly (prepaid) Kafka instance with version 2.7, 2U4G*3 specifications, and 300 GB of storage space (Python example).

    # coding: utf-8
    
    import os
    from huaweicloudsdkcore.auth.credentials import BasicCredentials
    from huaweicloudsdkkafka.v2.region.kafka_region import KafkaRegion
    from huaweicloudsdkcore.exceptions import exceptions
    from huaweicloudsdkkafka.v2 import *
    
    if __name__ == "__main__":
        # The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
        # In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
        ak = os.environ["CLOUD_SDK_AK"]
        sk = os.environ["CLOUD_SDK_SK"]
        projectId = "{project_id}"
    
        credentials = BasicCredentials(ak, sk, projectId)
    
        client = KafkaClient.new_builder() \
            .with_credentials(credentials) \
            .with_region(KafkaRegion.value_of("<YOUR REGION>")) \
            .build()
    
        try:
            request = CreatePostPaidKafkaInstanceRequest()
            bssParambody = BssParam(
                charging_mode="prePaid",
                is_auto_pay=True,
                period_type="month",
                period_num=1
            )
            listTagsbody = [
                TagEntity(
                    key="aaa",
                    value="111"
                )
            ]
            listSaslEnabledMechanismsbody = [
                "SCRAM-SHA-512"
            ]
            listTenantIpsbody = [
                "127.xx.xx.x",
                "127.xx.xx.x",
                "127.xx.xx.x"
            ]
            listAvailableZonesbody = [
                "********706d4c1fb0eb72f0********"
            ]
            request.body = CreateInstanceByEngineReq(
                bss_param=bssParambody,
                vpc_client_plain=True,
                arch_type="X86",
                tags=listTagsbody,
                enterprise_project_id="0",
                storage_spec_code="dms.physical.storage.ultra.v2",
                enable_auto_topic=True,
                retention_policy="time_base",
                sasl_enabled_mechanisms=listSaslEnabledMechanismsbody,
                kafka_security_protocol="SASL_SSL",
                ssl_enable=True,
                publicip_id="********-88fc-4a8c-86d0-************,********-16af-455d-8d54-************,********-3d69-4367-95ab-************",
                tenant_ips=listTenantIpsbody,
                enable_publicip=True,
                product_id="c6.2u4g.cluster",
                available_zones=listAvailableZonesbody,
                subnet_id="********-8fbf-4438-ba71-************",
                security_group_id="********-e073-4aad-991f-************",
                vpc_id="********-9b4a-44c5-a964-************",
                password="********",
                access_user="********",
                storage_space=300,
                broker_num=3,
                engine_version="2.7",
                engine="kafka",
                description="",
                name="kafka-test1"
            )
            response = client.create_post_paid_kafka_instance(request)
            print(response)
        except exceptions.ClientRequestException as e:
            print(e.status_code)
            print(e.request_id)
            print(e.error_code)
            print(e.error_msg)
    
  • Create a pay-per-use Kafka instance with version 2.7, 2U4G*3 specifications, and 300 GB of storage space (Go example).

    package main
    
    import (
    	"fmt"
    	"os"
    
    	"github.com/huaweicloud/huaweicloud-sdk-go-v3/core/auth/basic"
    	kafka "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2"
    	"github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/model"
    	region "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/region"
    )
    
    func main() {
        // The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
        // In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
        ak := os.Getenv("CLOUD_SDK_AK")
        sk := os.Getenv("CLOUD_SDK_SK")
        projectId := "{project_id}"
    
        auth := basic.NewCredentialsBuilder().
            WithAk(ak).
            WithSk(sk).
            WithProjectId(projectId).
            Build()
    
        client := kafka.NewKafkaClient(
            kafka.KafkaClientBuilder().
                WithRegion(region.ValueOf("<YOUR REGION>")).
                WithCredential(auth).
                Build())
    
        request := &model.CreatePostPaidKafkaInstanceRequest{}
    	keyTags:= "aaa"
    	valueTags:= "111"
    	var listTagsbody = []model.TagEntity{
            {
                Key: &keyTags,
                Value: &valueTags,
            },
        }
    	var listSaslEnabledMechanismsbody = []model.CreateInstanceByEngineReqSaslEnabledMechanisms{
            model.GetCreateInstanceByEngineReqSaslEnabledMechanismsEnum().SCRAM_SHA_512,
        }
    	var listTenantIpsbody = []string{
            "127.xx.xx.x",
    	    "127.xx.xx.x",
    	    "127.xx.xx.x",
        }
    	var listAvailableZonesbody = []string{
            "********706d4c1fb0eb72f0********",
        }
    	vpcClientPlainCreateInstanceByEngineReq:= true
    	archTypeCreateInstanceByEngineReq:= "X86"
    	enterpriseProjectIdCreateInstanceByEngineReq:= "0"
    	enableAutoTopicCreateInstanceByEngineReq:= true
    	diskEncryptedKeyCreateInstanceByEngineReq:= "********-b953-4875-a743-************"
    	diskEncryptedEnableCreateInstanceByEngineReq:= true
    	retentionPolicyCreateInstanceByEngineReq:= model.GetCreateInstanceByEngineReqRetentionPolicyEnum().TIME_BASE
    	kafkaSecurityProtocolCreateInstanceByEngineReq:= "SASL_SSL"
    	sslEnableCreateInstanceByEngineReq:= true
    	publicipIdCreateInstanceByEngineReq:= "********-88fc-4a8c-86d0-************,********-16af-455d-8d54-************,********-3d69-4367-95ab-************"
    	enablePublicipCreateInstanceByEngineReq:= true
    	passwordCreateInstanceByEngineReq:= "********"
    	accessUserCreateInstanceByEngineReq:= "********"
    	descriptionCreateInstanceByEngineReq:= ""
    	request.Body = &model.CreateInstanceByEngineReq{
    		VpcClientPlain: &vpcClientPlainCreateInstanceByEngineReq,
    		ArchType: &archTypeCreateInstanceByEngineReq,
    		Tags: &listTagsbody,
    		EnterpriseProjectId: &enterpriseProjectIdCreateInstanceByEngineReq,
    		StorageSpecCode: model.GetCreateInstanceByEngineReqStorageSpecCodeEnum().DMS_PHYSICAL_STORAGE_ULTRA,
    		EnableAutoTopic: &enableAutoTopicCreateInstanceByEngineReq,
    		DiskEncryptedKey: &diskEncryptedKeyCreateInstanceByEngineReq,
    		DiskEncryptedEnable: &diskEncryptedEnableCreateInstanceByEngineReq,
    		RetentionPolicy: &retentionPolicyCreateInstanceByEngineReq,
    		SaslEnabledMechanisms: &listSaslEnabledMechanismsbody,
    		KafkaSecurityProtocol: &kafkaSecurityProtocolCreateInstanceByEngineReq,
    		SslEnable: &sslEnableCreateInstanceByEngineReq,
    		PublicipId: &publicipIdCreateInstanceByEngineReq,
    		TenantIps: &listTenantIpsbody,
    		EnablePublicip: &enablePublicipCreateInstanceByEngineReq,
    		ProductId: "c6.2u4g.cluster",
    		AvailableZones: listAvailableZonesbody,
    		SubnetId: "********-8fbf-4438-ba71-************",
    		SecurityGroupId: "********-e073-4aad-991f-************",
    		VpcId: "********-9b4a-44c5-a964-************",
    		Password: &passwordCreateInstanceByEngineReq,
    		AccessUser: &accessUserCreateInstanceByEngineReq,
    		StorageSpace: int32(300),
    		BrokerNum: int32(3),
    		EngineVersion: "2.7",
    		Engine: model.GetCreateInstanceByEngineReqEngineEnum().KAFKA,
    		Description: &descriptionCreateInstanceByEngineReq,
    		Name: "kafka-test",
    	}
    	response, err := client.CreatePostPaidKafkaInstance(request)
    	if err == nil {
            fmt.Printf("%+v\n", response)
        } else {
            fmt.Println(err)
        }
    }
    
  • Create a yearly/monthly (prepaid) Kafka instance with version 2.7, 2U4G*3 specifications, and 300 GB of storage space (Go example).

    package main
    
    import (
    	"fmt"
    	"os"
    
    	"github.com/huaweicloud/huaweicloud-sdk-go-v3/core/auth/basic"
    	kafka "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2"
    	"github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/model"
    	region "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/kafka/v2/region"
    )
    
    func main() {
        // The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
        // In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
        ak := os.Getenv("CLOUD_SDK_AK")
        sk := os.Getenv("CLOUD_SDK_SK")
        projectId := "{project_id}"
    
        auth := basic.NewCredentialsBuilder().
            WithAk(ak).
            WithSk(sk).
            WithProjectId(projectId).
            Build()
    
        client := kafka.NewKafkaClient(
            kafka.KafkaClientBuilder().
                WithRegion(region.ValueOf("<YOUR REGION>")).
                WithCredential(auth).
                Build())
    
        request := &model.CreatePostPaidKafkaInstanceRequest{}
    	chargingModeBssParam:= model.GetBssParamChargingModeEnum().PRE_PAID
    	isAutoPayBssParam:= true
    	periodTypeBssParam:= model.GetBssParamPeriodTypeEnum().MONTH
    	periodNumBssParam:= int32(1)
    	bssParambody := &model.BssParam{
    		ChargingMode: &chargingModeBssParam,
    		IsAutoPay: &isAutoPayBssParam,
    		PeriodType: &periodTypeBssParam,
    		PeriodNum: &periodNumBssParam,
    	}
    	keyTags:= "aaa"
    	valueTags:= "111"
    	var listTagsbody = []model.TagEntity{
            {
                Key: &keyTags,
                Value: &valueTags,
            },
        }
    	var listSaslEnabledMechanismsbody = []model.CreateInstanceByEngineReqSaslEnabledMechanisms{
            model.GetCreateInstanceByEngineReqSaslEnabledMechanismsEnum().SCRAM_SHA_512,
        }
    	var listTenantIpsbody = []string{
            "127.xx.xx.x",
    	    "127.xx.xx.x",
    	    "127.xx.xx.x",
        }
    	var listAvailableZonesbody = []string{
            "********706d4c1fb0eb72f0********",
        }
    	vpcClientPlainCreateInstanceByEngineReq:= true
    	archTypeCreateInstanceByEngineReq:= "X86"
    	enterpriseProjectIdCreateInstanceByEngineReq:= "0"
    	enableAutoTopicCreateInstanceByEngineReq:= true
    	retentionPolicyCreateInstanceByEngineReq:= model.GetCreateInstanceByEngineReqRetentionPolicyEnum().TIME_BASE
    	kafkaSecurityProtocolCreateInstanceByEngineReq:= "SASL_SSL"
    	sslEnableCreateInstanceByEngineReq:= true
    	publicipIdCreateInstanceByEngineReq:= "********-88fc-4a8c-86d0-************,********-16af-455d-8d54-************,********-3d69-4367-95ab-************"
    	enablePublicipCreateInstanceByEngineReq:= true
    	passwordCreateInstanceByEngineReq:= "********"
    	accessUserCreateInstanceByEngineReq:= "********"
    	descriptionCreateInstanceByEngineReq:= ""
    	request.Body = &model.CreateInstanceByEngineReq{
    		BssParam: bssParambody,
    		VpcClientPlain: &vpcClientPlainCreateInstanceByEngineReq,
    		ArchType: &archTypeCreateInstanceByEngineReq,
    		Tags: &listTagsbody,
    		EnterpriseProjectId: &enterpriseProjectIdCreateInstanceByEngineReq,
    		StorageSpecCode: model.GetCreateInstanceByEngineReqStorageSpecCodeEnum().DMS_PHYSICAL_STORAGE_ULTRA,
    		EnableAutoTopic: &enableAutoTopicCreateInstanceByEngineReq,
    		RetentionPolicy: &retentionPolicyCreateInstanceByEngineReq,
    		SaslEnabledMechanisms: &listSaslEnabledMechanismsbody,
    		KafkaSecurityProtocol: &kafkaSecurityProtocolCreateInstanceByEngineReq,
    		SslEnable: &sslEnableCreateInstanceByEngineReq,
    		PublicipId: &publicipIdCreateInstanceByEngineReq,
    		TenantIps: &listTenantIpsbody,
    		EnablePublicip: &enablePublicipCreateInstanceByEngineReq,
    		ProductId: "c6.2u4g.cluster",
    		AvailableZones: listAvailableZonesbody,
    		SubnetId: "********-8fbf-4438-ba71-************",
    		SecurityGroupId: "********-e073-4aad-991f-************",
    		VpcId: "********-9b4a-44c5-a964-************",
    		Password: &passwordCreateInstanceByEngineReq,
    		AccessUser: &accessUserCreateInstanceByEngineReq,
    		StorageSpace: int32(300),
    		BrokerNum: int32(3),
    		EngineVersion: "2.7",
    		Engine: model.GetCreateInstanceByEngineReqEngineEnum().KAFKA,
    		Description: &descriptionCreateInstanceByEngineReq,
    		Name: "kafka-test1",
    	}
    	response, err := client.CreatePostPaidKafkaInstance(request)
    	if err == nil {
            fmt.Printf("%+v\n", response)
        } else {
            fmt.Println(err)
        }
    }
    

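For reference, the request body assembled by the pay-per-use SDK examples above serializes to roughly the following JSON. This is an illustrative sketch only: the field set mirrors the parameters used in the samples, and the masked IDs are placeholders that must be replaced with real resource IDs.

    {
        "name": "kafka-test",
        "description": "",
        "engine": "kafka",
        "engine_version": "2.7",
        "broker_num": 3,
        "storage_space": 300,
        "access_user": "********",
        "password": "********",
        "vpc_id": "********-9b4a-44c5-a964-************",
        "security_group_id": "********-e073-4aad-991f-************",
        "subnet_id": "********-8fbf-4438-ba71-************",
        "available_zones": ["********706d4c1fb0eb72f0********"],
        "product_id": "c6.2u4g.cluster",
        "storage_spec_code": "dms.physical.storage.ultra.v2",
        "ssl_enable": true,
        "kafka_security_protocol": "SASL_SSL",
        "sasl_enabled_mechanisms": ["SCRAM-SHA-512"],
        "enterprise_project_id": "0",
        "tags": [{"key": "aaa", "value": "111"}]
    }
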
For SDK sample code in more programming languages, see the Code Samples tab in API Explorer, which can automatically generate the corresponding SDK sample code.
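
Instance creation is asynchronous: the create call returns while provisioning continues in the background. The following minimal Python sketch (an assumption-level addition, not part of the official samples) reads the instance ID from the create response and polls the instance with ShowInstanceRequest until it leaves the CREATING state; verify the attribute names and status values against the SDK version in use.

    # coding: utf-8
    # Hedged sketch: wait for the new Kafka instance to finish provisioning.
    # Assumes `client` and `response` come from one of the create examples above,
    # and that huaweicloudsdkkafka exposes ShowInstanceRequest / show_instance.
    import time
    from huaweicloudsdkkafka.v2 import ShowInstanceRequest

    instance_id = response.instance_id              # ID returned by the create call
    while True:
        detail = client.show_instance(ShowInstanceRequest(instance_id=instance_id))
        if detail.status != "CREATING":             # e.g. RUNNING once the instance is ready
            print("instance %s is %s" % (instance_id, detail.status))
            break
        time.sleep(30)                              # provisioning usually takes several minutes
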

Status Codes

Status Code

Description

200

The instance is created successfully.

Error Codes

See Error Codes.
