Migrating Kafka Services
Overview
Kafka service migration is the process of moving the message production and consumption clients that are connected to another Kafka service, together with any persisted message data, to ROMA Connect.
Services with high continuity requirements cannot tolerate long downtime, so they must be migrated to the cloud smoothly.
Preparations
- Ensure that the message production and consumption clients can connect to the MQS connection address of the ROMA Connect instance. You can view the MQS connection address on the Instance Information page of the ROMA Connect console.
- If a private IP address is used for the connection, the clients and the ROMA Connect instance must be in the same VPC. If the clients and the ROMA Connect instance are in different VPCs, you can create a VPC peering connection to enable communication between the two VPCs. For details, see VPC Peering Connection.
- If a public network address is used for connection, the clients must have the permission to access the public network.
- Ensure that the MQS specifications of the ROMA Connect instance are not lower than the Kafka specifications used by the original service. For details about MQS specifications, see MQS Specifications.
- In the ROMA Connect instance, create topics with the same configurations as in the original Kafka instance, including the topic name, number of replicas, number of partitions, message aging time, and whether synchronous replication and flushing are enabled (see the sketch after this list).
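Topic settings can also be mirrored programmatically. The following is a minimal sketch using the Kafka Java AdminClient; the broker address, topic name, partition and replica counts, and retention value are placeholder assumptions rather than values from this page, and authentication settings (if MQS SASL access is enabled) are omitted.

```java
import java.util.List;
import java.util.Map;
import java.util.Properties;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

public class CreateMigratedTopic {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Placeholder MQS connection address of the ROMA Connect instance.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "mqs.example.com:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // Mirror the original topic's settings: same name, partition count,
            // replica count, and message aging time (retention.ms).
            NewTopic topic = new NewTopic("orders", 3, (short) 3)
                    .configs(Map.of("retention.ms", "259200000")); // 72 hours, as an example
            admin.createTopics(List.of(topic)).all().get();
        }
    }
}
```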
Migration Scheme 1: Migrating the Production Service First
- Solution
In this solution, the message production service is migrated to ROMA Connect first, so that no new messages are produced to the original Kafka instance. After all remaining messages in the original Kafka instance have been consumed, the message consumption service is migrated to ROMA Connect to consume the new messages.
This is a common migration approach in the industry: the procedure is simple, the process is controlled entirely by the service side, and message order is preserved throughout. However, end-to-end latency may rise during the window in which you wait for all remaining data to be consumed.
This scheme suits services that require strict message ordering but are insensitive to end-to-end latency.
- Migration Process
- Change the Kafka connection address of the production client to the MQS connection address of the ROMA Connect instance.
- Restart the production service so that the producer can send new messages to the new ROMA Connect instance.
- Check the consumption progress of each consumer group in the original Kafka instance until all of its data has been consumed (see the sketch after this list).
- Change the Kafka connection address of the consumer client to the MQS connection address of the ROMA Connect instance.
- Restart the consumption service so that consumers can consume messages from the ROMA Connect instance.
- Check whether consumers consume messages properly from the ROMA Connect instance.
- The migration is complete.
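One way to perform the consumption-progress check in step 3 is to compare each consumer group's committed offsets against the partition end offsets, and migrate the consumers once every lag reaches 0. Below is a minimal sketch with the Kafka Java AdminClient; the broker address and consumer group ID are placeholder assumptions.

```java
import java.util.Map;
import java.util.Properties;
import java.util.stream.Collectors;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.ListOffsetsResult;
import org.apache.kafka.clients.admin.OffsetSpec;
import org.apache.kafka.clients.consumer.OffsetAndMetadata;
import org.apache.kafka.common.TopicPartition;

public class CheckGroupLag {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Placeholder address of the original Kafka instance.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "original-kafka.example.com:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // Committed offsets of one consumer group ("my-consumer-group" is a placeholder).
            Map<TopicPartition, OffsetAndMetadata> committed =
                    admin.listConsumerGroupOffsets("my-consumer-group")
                         .partitionsToOffsetAndMetadata().get();

            // Latest (end) offsets for the same partitions.
            Map<TopicPartition, OffsetSpec> specs = committed.keySet().stream()
                    .collect(Collectors.toMap(tp -> tp, tp -> OffsetSpec.latest()));
            Map<TopicPartition, ListOffsetsResult.ListOffsetsResultInfo> ends =
                    admin.listOffsets(specs).all().get();

            // Lag per partition; migrate the consumers once every value is 0.
            committed.forEach((tp, meta) ->
                    System.out.printf("%s lag=%d%n", tp, ends.get(tp).offset() - meta.offset()));
        }
    }
}
```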
Migration Scheme 2: Migrating the Production Service Later
- Solution
Use multiple consumers for the consumption service. Some consume messages from the original Kafka instance, and others consume messages from the ROMA Connect instance. Then, migrate the production service to the ROMA Connect instance so that all messages can be consumed in time.
In this scheme, the consumption service consumes messages from both the original Kafka instance and the ROMA Connect instance for a period of time. Because the consumers are already running on ROMA Connect before the production service is migrated, there is no end-to-end latency problem. However, while both instances are being consumed, messages may not be consumed in the order in which they were produced.
This scheme suits services that require low latency but do not require strict message ordering.
- Migration Process
- Start new consumer clients, set their Kafka connection address to the MQS connection address of the ROMA Connect instance, and consume data from the ROMA Connect instance (see the sketch after this list).
NOTE:
Original consumer clients must continue running. Messages are consumed from both the original Kafka instance and ROMA Connect instance.
- Modify the production client and change the Kafka connection address to the MQS connection address of the ROMA Connect instance.
- Restart the production client to migrate the production service to the ROMA Connect instance.
- After the production service is migrated, check whether the consumption service connected to the ROMA Connect instance is normal.
- After all data in the original Kafka is consumed, close the original consumption clients.
- The migration is complete.
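The new consumer clients in step 1 are ordinary Kafka consumers whose bootstrap address points at the ROMA Connect instance, as in the minimal sketch below. The address, group ID, and topic name are placeholder assumptions, and MQS authentication settings are omitted.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class RomaConnectConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder MQS connection address of the ROMA Connect instance.
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "mqs.example.com:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "my-consumer-group");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("orders")); // placeholder topic
            while (true) {
                // The original consumer clients keep draining the old instance in parallel.
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("offset=%d value=%s%n", record.offset(), record.value());
                }
            }
        }
    }
}
```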
Migrating Persistent Data
You can migrate persisted message data from the original Kafka instance to the ROMA Connect instance by using the open-source tool MirrorMaker, which consumes messages from the original Kafka instance and produces them to the ROMA Connect instance.
If a topic has one replica in the original Kafka instance but three replicas in the ROMA Connect instance, the storage space of the ROMA Connect instance should be three times that of the original Kafka instance.
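Conceptually, MirrorMaker pairs a consumer reading from the original instance with a producer writing to the target, as in the simplified sketch below. In practice you would run the MirrorMaker tooling shipped with Kafka rather than hand-rolling this loop; all addresses and the topic name here are placeholder assumptions.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.ByteArrayDeserializer;
import org.apache.kafka.common.serialization.ByteArraySerializer;

public class MiniMirror {
    public static void main(String[] args) {
        Properties cProps = new Properties();
        // Placeholder address of the original Kafka instance.
        cProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "original-kafka.example.com:9092");
        cProps.put(ConsumerConfig.GROUP_ID_CONFIG, "mirror-group");
        cProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest"); // copy from the beginning
        cProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, ByteArrayDeserializer.class.getName());
        cProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ByteArrayDeserializer.class.getName());

        Properties pProps = new Properties();
        // Placeholder MQS connection address of the ROMA Connect instance.
        pProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "mqs.example.com:9092");
        pProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class.getName());
        pProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class.getName());

        try (KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(cProps);
             KafkaProducer<byte[], byte[]> producer = new KafkaProducer<>(pProps)) {
            consumer.subscribe(List.of("orders")); // placeholder topic
            while (true) {
                for (ConsumerRecord<byte[], byte[]> rec : consumer.poll(Duration.ofSeconds(1))) {
                    // Re-publish each record to the same-named topic on ROMA Connect.
                    producer.send(new ProducerRecord<>(rec.topic(), rec.key(), rec.value()));
                }
            }
        }
    }
}
```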