Creating Links

Scenario

Before creating a data migration job, create a link to enable the CDM cluster to read data from and write data to a data source. A migration job requires a source link and a destination link. For details on the data sources that can be exported (source links) and imported (destination links) in different migration modes (table/file migration or scenario-based migration), see Supported Data Sources.

The link configurations depend on the data source. This section describes how to create these links.

Prerequisites

Procedure

  1. Log in to the CDM management console.
  2. In the left navigation pane, click Cluster Management. Locate the target cluster, choose Job Management > Link Management > Create Link, and select a connector. See Figure 1.

    The connectors are classified based on the type of the data source to be connected. All supported data source types are displayed.

    Figure 1 Selecting a connector

  3. Select a data source and click Next. The following describes how to create a MySQL link.

    Figure 2 Creating a MySQL Link
    The link parameters of different data sources vary. Table 1 describes the link parameters.
    Table 1 Link parameters

    In the table below, each connector (or group of connectors) is followed by its description.

    • Data Warehouse Service
    • RDS for MySQL
    • RDS for PostgreSQL
    • RDS for SQL Server
    • MySQL
    • PostgreSQL
    • Microsoft SQL Server
    • Oracle
    • IBM Db2
    • FusionInsight LibrA
    • Derecho (GaussDB)
    • NewSQL (GaussDB)
    • SAP HANA
    • MYCAT
    • Dameng database
    • Sharding

    Because the JDBC drivers used to connect to these relational databases are the same, the parameters to be configured are also the same and are described in Link to Relational Databases.

    • When importing data to DWS, specify the COPY or GDS import mode to improve import performance. You can set the Import Mode parameter when creating a DWS link.
    • When importing data to RDS for MySQL, enable the MySQL LOAD DATA function to accelerate the import. You can enable it by configuring Use Local API when creating a MySQL link.
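Because the relational-database connectors above share the same JDBC drivers, their link parameters reduce to the usual JDBC connection details. The sketch below illustrates how those details map to JDBC URLs for a few of the listed databases; the function and parameter names are illustrative, not CDM's actual configuration fields.

```python
# Illustrative sketch only: parameter names (host, port, database) are
# hypothetical and not the exact field names shown in the CDM console.

JDBC_URL_TEMPLATES = {
    "mysql": "jdbc:mysql://{host}:{port}/{database}",
    "postgresql": "jdbc:postgresql://{host}:{port}/{database}",
    "sqlserver": "jdbc:sqlserver://{host}:{port};DatabaseName={database}",
    "oracle": "jdbc:oracle:thin:@{host}:{port}/{database}",
}

def build_jdbc_url(db_type: str, host: str, port: int, database: str) -> str:
    """Build the JDBC URL a relational-database link would use."""
    try:
        template = JDBC_URL_TEMPLATES[db_type]
    except KeyError:
        raise ValueError(f"unsupported database type: {db_type}")
    return template.format(host=host, port=port, database=database)

print(build_jdbc_url("mysql", "192.168.0.10", 3306, "sales"))
# jdbc:mysql://192.168.0.10:3306/sales
```

The same host/port/database triple drives every connector in this group; only the URL template differs per database type.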

    HUAWEI CLOUD OBS

    If the data source is OBS, see Link to OBS.

    Alibaba Cloud OSS

    If the data source is OSS on Alibaba Cloud, see Link to OSS on Alibaba Cloud.

    Currently, data can only be exported from OSS to OBS.

    Qiniu Cloud Object Storage (KODO)

    Tencent Cloud COS

    If the data source is KODO or COS, see Link to KODO/COS.

    Currently, data can only be exported from KODO/COS to OBS.

    Amazon S3

    If the data source is Amazon S3, see Link to Amazon S3.

    Currently, objects can only be exported from Amazon S3 to OBS.

    • MRS HDFS
    • FusionInsight HDFS
    • Apache HDFS

    If the data source is HDFS of MRS, Apache Hadoop, or FusionInsight HD, see Link to HDFS.

    NOTE:

    If Run Mode is set to Standalone, CDM can migrate data between HDFSs of multiple MRS clusters.

    • MRS HBase
    • FusionInsight HBase
    • Apache HBase

    If the data source is HBase of MRS, Apache Hadoop, or FusionInsight HD, see Link to HBase.

    • MRS Hive
    • FusionInsight Hive
    • Apache Hive

    If the data source is Hive of MRS, see Link to Hive.

    CloudTable Service

    If the data source is CloudTable, see Link to CloudTable.

    • FTP
    • SFTP

    If the data source is an FTP or SFTP server, see Link to an FTP or SFTP Server.

    • HTTP
    • HTTPS

    These connectors are used to read files from an HTTP/HTTPS URL, for example, public files stored on a third-party object storage system or a web disk.

    When creating an HTTP link, you only need to configure the link name. The URL is configured during job creation.
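The split described above (the link carries only a name, while the URL belongs to the job) can be modeled as follows. The class and field names are illustrative only, not CDM's actual data model.

```python
# Hedged sketch: an HTTP link holds only a name; the file URL is supplied
# when the migration job is created. Names here are hypothetical.
from dataclasses import dataclass

@dataclass
class HttpLink:
    name: str          # the only attribute set when the link is created

@dataclass
class MigrationJob:
    link: HttpLink
    source_url: str    # configured at job creation time, not on the link

link = HttpLink(name="http_link_demo")
job = MigrationJob(link=link, source_url="https://example.com/data/file.csv")
print(job.link.name, job.source_url)
```

One HTTP link can therefore be reused by many jobs, each pointing at a different URL.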

    • NAS
    • SFS Turbo

    If the data source is a NAS server, see Link to NAS/SFS.

    CIFS, SMB, and NFS are supported. CDM can connect to dedicated file servers, Windows file sharing servers, Linux Samba servers, and cloud services such as SFS that support the CIFS, SMB, or NFS file system protocols.

    • MongoDB
    • Document Database Service

    If the data source is a local MongoDB or DDS, see Link to MongoDB.

    • Redis
    • Distributed Cache Service

    If the data source is a local Redis database or DCS, see Link to Redis/DCS.

    Currently, data can be imported to but cannot be exported from DCS. Data can be imported to and exported from the open source Redis.

    Apache Kafka

    If the data source is the open source Kafka, see Link to Kafka.

    Currently, data can only be exported from Kafka to CSS, DIS, or DMS Kafka.

    Data Ingestion Service

    If the data source is DIS, see Link to DIS.

    Currently, data can only be exported from DIS to CSS, Apache Kafka, or DMS Kafka.

    • Cloud Search Service
    • Elasticsearch

    If the data source is CSS or Elasticsearch, see Link to Elasticsearch/CSS.

    Data Lake Insight

    If the data source is DLI, see Link to DLI.

    Currently, data can be imported to but cannot be exported from DLI.

    OpenTSDB

    If the data source is OpenTSDB, see Link to CloudTable OpenTSDB.

    DMS Kafka

    If the data source is DMS Kafka, see Link to DMS Kafka.

    Currently, data can only be exported from DMS Kafka to CSS, Apache Kafka, DIS, or DMS Kafka.
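The export and import restrictions scattered through Table 1 can be summarized as a small validity check. The capability sets below simply paraphrase the notes in the table; they are not an official CDM compatibility matrix.

```python
# Illustrative summary of the direction restrictions stated in Table 1.
EXPORT_ONLY_TO = {
    "OSS": {"OBS"},
    "KODO": {"OBS"},
    "COS": {"OBS"},
    "Amazon S3": {"OBS"},
    "Apache Kafka": {"CSS", "DIS", "DMS Kafka"},
    "DIS": {"CSS", "Apache Kafka", "DMS Kafka"},
    "DMS Kafka": {"CSS", "Apache Kafka", "DIS", "DMS Kafka"},
}
IMPORT_ONLY = {"DCS", "DLI"}  # can be a destination but not a source

def direction_allowed(source: str, destination: str) -> bool:
    """Check a (source, destination) pair against the restrictions above."""
    if source in IMPORT_ONLY:
        return False                     # e.g. data cannot be exported from DLI
    if source in EXPORT_ONLY_TO:
        return destination in EXPORT_ONLY_TO[source]
    return True                          # no restriction listed in the table

print(direction_allowed("OSS", "OBS"))   # True
print(direction_allowed("DLI", "OBS"))   # False
```

A check like this can catch an unsupported source/destination pairing before the source and destination links are created.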

  4. After configuring the link parameters, click Test to check whether the link is available. Alternatively, click Save; the system then checks the link automatically when saving.

    If network conditions are poor or the data source is very large, the link test may take 30 to 60 seconds.
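Because a test can take 30 to 60 seconds, a script driving link creation may want to retry the check rather than fail on the first slow attempt. The sketch below shows that pattern with a stand-in test function; it is not a real CDM API call.

```python
# Sketch of client-side patience for a slow link test. test_link is any
# callable returning True when the link is reachable; it stands in for
# whatever check a real client would perform.
import time

def wait_for_link(test_link, attempts=3, delay_s=1.0):
    """Call test_link() up to `attempts` times, pausing between tries."""
    for i in range(attempts):
        if test_link():
            return True
        if i < attempts - 1:
            time.sleep(delay_s)
    return False

# Stand-in that succeeds on the second call, mimicking a slow first test.
calls = {"n": 0}
def flaky_test():
    calls["n"] += 1
    return calls["n"] >= 2

print(wait_for_link(flaky_test, attempts=3, delay_s=0.01))  # True
```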