Distributed Message Service for Kafka
Using spring-kafka
Updated on 2023-11-29 GMT+08:00
This section describes how to use spring-kafka to connect to a Huawei Cloud Kafka instance to produce and consume messages. Obtain the related code from kafka-springboot-demo.
The Kafka instance connection addresses, topic name, and user information used in the following examples are available in Collecting Connection Information.
Adding the spring-kafka Dependency to the pom.xml File
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
</dependency>
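The dependency above declares no version, which works only when the version is managed elsewhere, for example by the Spring Boot parent POM. That is an assumption here; check the actual kafka-springboot-demo project for its build setup. A hedged sketch of the surrounding pom.xml:

```xml
<!-- Assumption: a Spring Boot project whose parent manages the spring-kafka version. -->
<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <!-- Example version only; pick one matching your environment. -->
    <version>2.7.18</version>
</parent>
```

If the project does not use a parent that manages versions, add an explicit <version> element to the spring-kafka dependency instead.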
Configuring the application.properties File
#=============== Kafka ==========================
# Broker information of the Kafka instance. ip:port indicates the connection address and port number of the instance.
spring.kafka.bootstrap-servers=ip1:port1,ip2:port2,ip3:port3

#=============== Producer Configuration =======================
spring.kafka.producer.retries=0
spring.kafka.producer.batch-size=16384
spring.kafka.producer.buffer-memory=33554432
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.apache.kafka.common.serialization.StringSerializer

#=============== Consumer Configuration =======================
spring.kafka.consumer.group-id=test-consumer-group
spring.kafka.consumer.auto-offset-reset=earliest
spring.kafka.consumer.enable-auto-commit=true
spring.kafka.consumer.auto-commit-interval=100
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=org.apache.kafka.common.serialization.StringDeserializer

#======== SASL Configuration (Delete the following configuration if SASL is disabled.) =======
# Set the SASL authentication mechanism, username, and password.
# spring.kafka.properties.sasl.mechanism indicates the SASL authentication mechanism. username and password
# indicate the username and password of SASL_SSL. Obtain them by referring to "Collecting Connection Information."
# If the SASL mechanism is PLAIN, the configuration is as follows:
spring.kafka.properties.sasl.mechanism=PLAIN
spring.kafka.properties.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
    username="username" \
    password="password";
# If the SASL mechanism is SCRAM-SHA-512, the configuration is as follows:
spring.kafka.properties.sasl.mechanism=SCRAM-SHA-512
spring.kafka.properties.sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
    username="username" \
    password="password";

# Set spring.kafka.security.protocol. If the security protocol is SASL_SSL, the configuration is as follows:
spring.kafka.security.protocol=SASL_SSL
# spring.kafka.ssl.trust-store-location is the path for storing the SSL certificate. The following uses the
# Windows path format as an example. Change the path format based on the actual running environment.
spring.kafka.ssl.trust-store-location=E:\\temp\\client.truststore.jks
# spring.kafka.ssl.trust-store-password is the password of the server certificate. Do not change this password.
# It is used for accessing the JKS file generated by Java.
spring.kafka.ssl.trust-store-password=dms@kafka
# spring.kafka.properties.ssl.endpoint.identification.algorithm indicates whether to verify the certificate
# domain name. This parameter must be left blank, which disables domain name verification.
spring.kafka.properties.ssl.endpoint.identification.algorithm=
Producing Messages
package com.huaweicloud.dms.example.producer;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

import java.util.UUID;

/**
 * @author huaweicloud DMS
 */
@Component
public class DmsKafkaProducer {

    /**
     * Topic name. Use the actual topic name.
     */
    public static final String TOPIC = "test_topic";

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    /**
     * One message is produced every five seconds as scheduled.
     */
    @Scheduled(cron = "*/5 * * * * ?")
    public void send() {
        String message = String.format("{id:%s,timestamp:%s}", UUID.randomUUID().toString(), System.currentTimeMillis());
        kafkaTemplate.send(TOPIC, message);
        System.out.println("send finished, message = " + message);
    }
}
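The producer builds its payload with String.format, so the braces and keys are emitted literally: the payload is JSON-like, but its keys are unquoted. A minimal standalone check of this formatting (plain Java, no Kafka connection required; the class name is hypothetical):

```java
import java.util.UUID;

public class MessageFormatCheck {
    public static void main(String[] args) {
        // Same format string as DmsKafkaProducer.send(): braces are literal,
        // %s placeholders take the ID and the timestamp.
        String message = String.format("{id:%s,timestamp:%s}",
                UUID.randomUUID().toString(), System.currentTimeMillis());
        System.out.println(message);
    }
}
```

Also note that @Scheduled methods fire only when scheduling is enabled in the Spring context, typically by annotating a configuration class with @EnableScheduling.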
Consuming Messages
package com.huaweicloud.dms.example.consumer;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

import java.util.Optional;

/**
 * @author huaweicloud DMS
 */
@Component
public class DmsKafkaConsumer {

    /**
     * Topic name. Use the actual topic name.
     */
    private static final String TOPIC = "test_topic";

    @KafkaListener(topics = {TOPIC})
    public void listen(ConsumerRecord<String, String> record) {
        Optional<String> message = Optional.ofNullable(record.value());
        if (message.isPresent()) {
            System.out.println("consume finished, message = " + message.get());
        }
    }
}
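The configuration shown earlier commits offsets automatically (spring.kafka.consumer.enable-auto-commit=true), so a message can be marked consumed before the listener finishes processing it. If you need explicit commits, a sketch of the property changes, using standard Spring Boot properties rather than anything Huawei Cloud specific:

```properties
# Disable auto-commit and let the listener container acknowledge each record.
spring.kafka.consumer.enable-auto-commit=false
spring.kafka.listener.ack-mode=manual_immediate
```

With manual acknowledgment, the @KafkaListener method takes an additional org.springframework.kafka.support.Acknowledgment parameter and calls acknowledge() once the record has been processed.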