Testing Connections in Batches

Function

This API is used to test the connections of DRS tasks in batches.

Debugging

You can debug this API in API Explorer, which supports automatic authentication. API Explorer can automatically generate SDK code examples and provides a function for debugging them.

Constraints

  • After a task is created, you can test its connection only when the task status is CONFIGURATION.
  • In the dual-active DR scenario, the connection of the backward task can be tested only when the forward task is in the INCRE_TRANSFER_STARTED state. This API cannot be called for the parent task.
  • A maximum of 10 connection tests can be included in one batch request. If you have more tasks, split them into batches of at most 10, as in the sketch after this list.
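
The following minimal sketch, assuming the third-party Python requests library, splits a larger job list into batches that respect the 10-jobs-per-request limit; the endpoint, token, and job payloads are placeholders, not verified values.

# Hypothetical helper: submit connection-test jobs in batches of at most 10,
# the per-request limit stated above, and collect all results.
import requests  # assumption: the "requests" library is installed

def test_connections_in_batches(endpoint, project_id, token, jobs, batch_size=10):
    url = f"{endpoint}/v3/{project_id}/jobs/batch-connection"
    headers = {"Content-Type": "application/json", "X-Auth-Token": token}
    results = []
    for i in range(0, len(jobs), batch_size):
        batch = jobs[i:i + batch_size]  # no more than 10 jobs per call
        resp = requests.post(url, json={"jobs": batch}, headers=headers)
        resp.raise_for_status()
        results.extend(resp.json().get("results", []))
    return results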

URI

POST /v3/{project_id}/jobs/batch-connection

Table 1 Path parameters

  • project_id (String, mandatory): Project ID of a tenant in a region. For details about how to obtain the project ID, see Obtaining a Project ID.

Request Parameters

Table 2 Request header parameters

  • Content-Type (String, mandatory): Content type. The default value is application/json.
  • X-Auth-Token (String, mandatory): User token obtained from IAM.
  • X-Language (String, optional): Request language type. Values: en-us, zh-cn. Default value: en-us.

Table 3 Request body parameters

  • jobs (Array of objects, mandatory): List of requests for testing connections in batches. For details, see Table 4.

Table 4 Data structure description of field jobs

  • id (String, mandatory): DRS task ID, which can be obtained from the task list or the task details page.
  • net_type (String, mandatory): Network type. Values: vpn, vpc, eip.
  • db_type (String, mandatory): Database type. Values: mysql (MySQL), mongodb (MongoDB), gaussdbv5 (GaussDB Distributed), taurus (TaurusDB), gaussdbv5ha (GaussDB Centralized), kafka (Kafka), postgresql (PostgreSQL), oracle (Oracle).
  • ip (String, mandatory): Database IP address.
  • db_port (Integer, optional): Database port number. This parameter must be set to 0 for MongoDB and DDS databases.
  • inst_id (String, optional): DB instance ID. This parameter is mandatory when the database is a cloud instance, for example, an RDS instance.
  • db_user (String, mandatory): Database account.
  • db_password (String, mandatory): Database password.
  • ssl_link (Boolean, optional): Whether SSL is enabled. If this parameter is set to true, the SSL certificate parameters below must also be set (see the encoding sketch after this table).
  • ssl_cert_key (String, optional): SSL certificate content encoded using Base64. This parameter is mandatory when ssl_link is set to true.
  • ssl_cert_name (String, optional): SSL certificate name. This parameter is mandatory when ssl_link is set to true.
  • ssl_cert_check_sum (String, optional): SHA-256 checksum of the SSL certificate content, which is used for backend verification. This parameter is mandatory when ssl_link is set to true.
  • ssl_cert_password (String, optional): SSL certificate password. A password is required if the certificate file name extension is .p12.
  • vpc_id (String, optional): ID of the VPC where the instance resides. This parameter is mandatory when the database is a cloud instance, for example, an RDS instance.
  • subnet_id (String, optional): ID of the subnet where the instance resides. This parameter is mandatory when the database is a cloud instance, for example, an RDS instance.
  • end_point_type (String, mandatory): Database role in the task. Values: so (source database), ta (destination database). Default value: so.
  • region (String, optional): Region where the DB instance is located. This parameter is mandatory when the database is a cloud instance, for example, an RDS instance.
  • project_id (String, optional): Project ID of the region where the user is located.
  • db_name (String, optional): Database name, which is the DDS authentication database or the service name of the Oracle database.
  • kafka_security_config (Object, optional): Kafka security authentication settings. This parameter is used only for Kafka security authentication. For details, see Table 5.
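
When ssl_link is set to true, ssl_cert_key, ssl_cert_name, and ssl_cert_check_sum must be supplied. The following minimal sketch prepares these values from a local certificate file; whether the SHA-256 checksum is computed over the raw file bytes or over the Base64 string, and whether it is a hex digest, are assumptions to verify for your environment.

# Hypothetical helper: build the SSL-related fields of a "jobs" entry
# from a local certificate file.
import base64
import hashlib
import os

def build_ssl_params(cert_path):
    with open(cert_path, "rb") as f:
        cert_bytes = f.read()
    return {
        "ssl_link": True,
        "ssl_cert_name": os.path.basename(cert_path),
        # Base64-encoded certificate content
        "ssl_cert_key": base64.b64encode(cert_bytes).decode("ascii"),
        # SHA-256 checksum used for backend verification
        # (assumed: hex digest over the raw certificate bytes)
        "ssl_cert_check_sum": hashlib.sha256(cert_bytes).hexdigest(),
    }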

Table 5 Data structure description of field kafka_security_config

  • type (String, optional): Security protocol. This parameter is mandatory when security authentication is used. The corresponding Kafka field is security.protocol. Values:
      • PLAINTEXT: No security authentication. Only an IP address and a port number are required.
      • SASL_PLAINTEXT: The SASL mechanism is used to connect to Kafka. SASL parameters must be configured.
      • SSL: SSL encryption is used to connect to Kafka. SSL parameters must be configured.
      • SASL_SSL: Both SASL authentication and SSL encryption are used. SASL and SSL parameters must be configured.
  • trust_store_key_name (String, optional): Certificate name. This parameter is mandatory when the security protocol is set to SSL or SASL_SSL.
  • trust_store_key (String, optional): Security certificate content encoded using Base64. This parameter is mandatory when the security protocol is set to SSL or SASL_SSL.
  • trust_store_password (String, optional): Certificate password. This parameter is mandatory when security authentication is used.
  • endpoint_algorithm (String, optional): Host name endpoint identification algorithm, which specifies how the server host name is verified against the server certificate. If this parameter is left blank, host name verification is disabled. The corresponding Kafka field is ssl.endpoint.identification.algorithm.
  • sasl_mechanism (String, optional): SASL mechanism used for the client connection. This parameter is mandatory when the security protocol is set to SASL_PLAINTEXT or SASL_SSL. The corresponding Kafka field is sasl.mechanism. Values: GSSAPI, PLAIN, SCRAM-SHA-256, SCRAM-SHA-512.
  • delegation_tokens (Boolean, optional): Whether to use token authentication. This parameter is valid only when the security protocol is set to SASL_SSL or SASL_PLAINTEXT and the SASL mechanism is set to SCRAM-SHA-256 or SCRAM-SHA-512.
  • enable_key_store (Boolean, optional): Whether to enable two-way SSL authentication (see the sketch after this table).
  • key_store_key (String, optional): Keystore certificate. This parameter is mandatory when two-way SSL authentication is enabled.
  • key_store_key_name (String, optional): Keystore certificate name. This parameter is mandatory when two-way SSL authentication is enabled.
  • key_store_password (String, optional): Keystore certificate password. This parameter is mandatory when a password is set for the certificate. The corresponding Kafka field is ssl.keystore.password.
  • set_private_key_password (Boolean, optional): Whether to set the keystore private key password. The default value is false.
  • key_password (String, optional): Keystore private key password. This parameter is mandatory when two-way SSL authentication is enabled and set_private_key_password is set to true. The corresponding Kafka field is ssl.key.password.
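
The example requests later in this section cover one-way SSL only. The following sketch, assembled from the field descriptions in Table 5, shows what a SASL_SSL configuration with two-way SSL authentication (enable_key_store set to true) might look like; all values are placeholders, not verified settings.

# Hypothetical kafka_security_config for SASL_SSL with two-way SSL authentication,
# written as a Python dict mirroring the JSON request body.
kafka_security_config = {
    "type": "SASL_SSL",
    "sasl_mechanism": "SCRAM-SHA-512",
    "trust_store_key_name": "client.truststore.jks",
    "trust_store_key": "<Base64-encoded truststore>",
    "trust_store_password": "<truststore password>",
    "endpoint_algorithm": "",            # empty string disables host name verification
    "enable_key_store": True,            # enable two-way SSL authentication
    "key_store_key_name": "client.keystore.jks",
    "key_store_key": "<Base64-encoded keystore>",
    "key_store_password": "<keystore password>",
    "set_private_key_password": True,
    "key_password": "<private key password>",
}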

Response Parameters

Status code: 200

Table 6 Response body parameters

  • results (Array of objects): Result set of the batch connection test. For details, see Table 7.
  • count (Integer): Total number of records.

Table 7 Data structure description of field results

  • id (String): Task ID.
  • status (String): Test result. Values: success (the connection test is successful), failed (the connection test fails).
  • error_code (String): Error code.
  • error_msg (String): Error message.
  • success (Boolean): Whether the request is successful.

Example Request

  • Testing connections for a DDS real-time migration task in which the destination database is DDS
    https://{endpoint}/v3/054ba152d480d55b2f5dc0069e7ddef0/jobs/batch-connection
    
    {
      "jobs" : [ {
        "id" : "140b5236-88ad-43c8-811c-1268453jb101",
        "ip" : "192.168.4.66:8635,192.168.4.83:8635",
        "net_type" : "eip",
        "db_type" : "mongodb",
        "db_port" : 0,
        "db_user" : "root",
        "db_password" : "********",
        "inst_id" : "3cadd5a0ef724f55ac7fa5bcb5f4fc5fin02",
        "project_id" : "0549a6a31000d4e82fd1c00c3d6f2d76",
        "region" : "cn-xianhz-1",
        "end_point_type" : "ta"
      } ]
    }
  • Testing connections for an RDS for MySQL real-time migration task in which the destination database is RDS for MySQL
    https://{endpoint}/v3/054ba152d480d55b2f5dc0069e7ddef0/jobs/batch-connection
    
    {
      "jobs" : [ {
        "id" : "140b5236-88ad-43c8-811c-1268453jb101",
        "ip" : "192.168.0.131",
        "net_type" : "eip",
        "db_type" : "mysql",
        "db_port" : 3306,
        "db_user" : "root",
        "db_password" : "********",
        "inst_id" : "e05a3679efe241d8b5dee80b17c1a863in01",
        "project_id" : "054ba152d480d55b2f5dc0069e7ddef0",
        "region" : "cn-xianhz-1",
        "end_point_type" : "ta"
      } ]
    }
  • Testing connections for a MySQL real-time migration task in which the source database is not RDS
    https://{endpoint}/v3/054ba152d480d55b2f5dc0069e7ddef0/jobs/batch-connection
    
    {
      "jobs" : [ {
        "id" : "140b5236-88ad-43c8-811c-1268453jb101",
        "ip" : "192.168.0.27",
        "net_type" : "eip",
        "db_type" : "mysql",
        "db_port" : 3306,
        "db_user" : "root",
        "db_password" : "********",
        "ssl_link" : false,
        "end_point_type" : "so"
      } ]
    }
  • Testing the connection for a real-time synchronization task from MySQL to Kafka in which the Kafka authentication mode is PLAINTEXT
    https://{endpoint}/v3/054ba152d480d55b2f5dc0069e7ddef0/jobs/batch-connection
    
    {
      "jobs": [
        {
          "id": "3bc38fe4-da50-4aad-903e-5db76d8jb20i",
          "ip": "xxxxxxx:xxxx",
          "net_type": "eip",
          "db_type": "kafka",
          "project_id": "5237e10fe9aa4ad5b16b6a5245248314",
          "end_point_type": "ta",
          "kafka_security_config": {
            "type": "PLAINTEXT"
          }
        }
      ]
    }
  • Testing the connection for a real-time synchronization task from MySQL to Kafka in which the Kafka authentication mode is SASL_PLAINTEXT
    https://{endpoint}/v3/054ba152d480d55b2f5dc0069e7ddef0/jobs/batch-connection
    
    {
      "jobs": [
        {
          "id": "3bc38fe4-da50-4aad-903e-5db76d8jb20i",
          "ip": "xxxxxxx:xxxx",
          "net_type": "eip",
          "db_type": "kafka",
          "db_user": "xxxxxxx",
          "db_password": "xxxxxxx",
          "project_id": "5237e10fe9aa4ad5b16b6a5245248314",
          "end_point_type": "ta",
          "kafka_security_config": {
            "type": "SASL_PLAINTEXT",
            "sasl_mechanism": "PLAIN"
          }
        }
      ]
    }
  • Testing the connection for a real-time synchronization task from MySQL to Kafka in which the Kafka authentication mode is SSL
    https://{endpoint}/v3/054ba152d480d55b2f5dc0069e7ddef0/jobs/batch-connection
    
    {
      "jobs": [
        {
          "id": "3bc38fe4-da50-4aad-903e-5db76d8jb20i",
          "ip": "xxxxxxx:xxxx",
          "net_type": "eip",
          "db_type": "kafka",
          "project_id": "5237e10fe9aa4ad5b16b6a5245248314",
          "end_point_type": "ta",
          "kafka_security_config": {
            "type": "SSL",
            "trust_store_key_name": "client.truststore.jks",
            "trust_store_key": "xxxxxx",
            "trust_store_password": "xxxxxx",
            "endpoint_algorithm": "",
            "enable_key_store": false
          }
        }
      ]
    }
  • Testing the connection for a real-time synchronization task from MySQL to Kafka in which the Kafka authentication mode is SASL_SSL
    https://{endpoint}/v3/054ba152d480d55b2f5dc0069e7ddef0/jobs/batch-connection
    
    {
      "jobs": [
        {
          "id": "3bc38fe4-da50-4aad-903e-5db76d8jb20i",
          "ip": "xxxxxxx:xxxx",
          "net_type": "eip",
          "db_type": "kafka",
          "db_user": "xxxxxxx",
          "db_password": "xxxxxxx",
          "project_id": "5237e10fe9aa4ad5b16b6a5245248314",
          "end_point_type": "ta",
          "kafka_security_config": {
            "type": "SSL",
            "trust_store_key_name": "client.truststore.jks",
            "trust_store_key": "xxxxxx",
            "trust_store_password": "xxxxxx",
            "endpoint_algorithm": "",
            "enable_key_store": false
          }
        }
      ]
    }

Example Response

Status code: 200

OK

{
  "results" : [ {
    "success" : true,
    "id" : "140b5236-88ad-43c8-811c-1268453jb101",
    "status" : "success"
  } ],
  "count" : 1
}
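
A minimal sketch of evaluating the response described in Table 7 follows; response is assumed to be the parsed JSON body returned by the call.

# Hypothetical helper: report the outcome of each connection test.
def report_results(response):
    for result in response.get("results", []):
        if result.get("status") == "success":
            print(f"Task {result.get('id')}: connection test succeeded")
        else:
            print(f"Task {result.get('id')}: connection test failed "
                  f"({result.get('error_code')}: {result.get('error_msg')})")
    print(f"{response.get('count', 0)} record(s) in total")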

Status Code

  • 200: OK
  • 400: Bad Request

Error Code

For details, see Error Code.
