
Typical Scenario: Migrating Data from GaussDB(DWS) to DLI

This section describes how to use the data synchronization function of CDM to migrate data from GaussDB(DWS) to DLI.

Prerequisites

  • You have created a DLI SQL queue. For how to create a DLI queue, see Creating a Queue.

    Set Type to For SQL when buying a queue.

  • You have created a GaussDB(DWS) cluster. For how to create a GaussDB(DWS) cluster, see Creating a Cluster.
  • You have created a CDM cluster. For how to create a CDM cluster, see Creating a CDM Cluster.
    • To connect the cluster to an on-premises database as the destination data source, you can use either the Internet or Direct Connect. If the Internet is used, make sure that an EIP has been bound to the CDM cluster, the security group of the CDM cluster allows outbound traffic to the host where the on-premises data source is located, the host can access the Internet, and the connection port has been enabled in the firewall rules.
    • If the data source is GaussDB(DWS) or MRS, the network must meet the following requirements:

      i. If the CDM cluster and the cloud service are in different regions, they must be connected through either the Internet or Direct Connect. If the Internet is used, make sure that an EIP has been bound to the CDM cluster, the host where the data source is located can access the Internet, and the port has been enabled in the firewall rules.

      ii. If the CDM cluster and the cloud service are in the same region, instances in the same VPC, subnet, and security group can communicate with each other by default. If the CDM cluster and the cloud service are in the same VPC but in different subnets or security groups, you must configure routing and security group rules.

      For how to configure routing rules, see Configure routes. For how to configure security group rules, see Security Group Configuration Examples.

      iii. The cloud service instance and the CDM cluster must belong to the same enterprise project. If they do not, change the enterprise project of the workspace.

    In this example, the VPC, subnet, and security group of the CDM cluster match those of the GaussDB(DWS) cluster.

Step 1: Prepare Data

  • Create a database and table in the GaussDB(DWS) cluster.
    1. Connect to the existing GaussDB(DWS) cluster by referring to Using the gsql CLI Client to Connect to a Cluster.
    2. Connect to the default database gaussdb of the GaussDB(DWS) cluster.
      gsql -d gaussdb -h Connection address of the GaussDB(DWS) cluster -U dbadmin -p 8000 -W password -r
      • gaussdb: Default database of the GaussDB(DWS) cluster.
      • Connection address of the GaussDB(DWS) cluster: If a public address is used, set it to Public Network Address or Public Network Access Domain Name. If a private address is used, set it to Private Network Address or Private Network Access Domain Name. For details, see Obtaining the Cluster Connection Address. If an ELB is used, set it to the ELB address.
      • dbadmin: Default administrator username used during cluster creation.
      • -W: Default password of the administrator.
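      For example, assuming a hypothetical private network address of 192.168.0.86 for the cluster (replace it with your actual connection address and password), the command would look like this:
      gsql -d gaussdb -h 192.168.0.86 -U dbadmin -p 8000 -W password -r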
    3. Create the testdwsdb database.
      CREATE DATABASE testdwsdb;
    4. Exit the gaussdb database and connect to testdwsdb.
      \q
      gsql -d testdwsdb -h Connection address of the GaussDB(DWS) cluster -U dbadmin -p 8000 -W password -r
    5. Create a table and import data into it.
      Create a table.
      CREATE TABLE table1(id int, a char(6), b varchar(6), c varchar(6));
      Insert data into the table.
      INSERT INTO table1 VALUES(1,'123','456','789');
      INSERT INTO table1 VALUES(2,'abc','efg','hif');
    6. Query the table data to verify that the data is inserted.
      select * from table1;
      Figure 1 Querying data in the table
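      If the inserts succeeded, the query returns the two rows added above. The output should look roughly like the following (exact column spacing depends on the char/varchar column types):
       id |   a    |  b  |  c
      ----+--------+-----+-----
        1 | 123    | 456 | 789
        2 | abc    | efg | hif
      (2 rows)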
  • Create a database and table on DLI.
    1. Log in to the DLI management console. In the navigation pane on the left, choose SQL Editor. On the displayed page, set Engine to Spark and Queues to the created SQL queue.

      Create a database, for example, testdb. For the syntax to create a DLI database, see Creating a Database.

      create database testdb;
    2. On the SQL Editor page, set Databases to testdb and run the following table creation statement to create a table in the database. For the table creation syntax, see Creating a DLI Table Using the DataSource Syntax.
      create table tabletest(id INT, name1 string, name2 string, name3 string);
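      Optionally, you can verify that the database and table exist before configuring the migration job. The following is a minimal check, assuming the standard Spark SQL statements accepted by the DLI SQL editor:
      show tables in testdb;
      describe testdb.tabletest;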

Step 2: Migrate Data

  1. Create a CDM connection.
    1. Create a connection to the GaussDB(DWS) database.
      1. Log in to the CDM console. In the navigation pane on the left, choose Cluster Management. On the displayed page, locate the created CDM cluster and click Job Management in the Operation column.
      2. On the Job Management page, click the Links tab, and click Create Link. On the displayed page, select Data Warehouse Service and click Next.
      3. Configure the connection as follows:
        Table 1 GaussDB(DWS) data source configuration

        Name: Name of the GaussDB(DWS) data source, for example, source_dws.

        Database Server: Click Select next to the text box and select the name of the created GaussDB(DWS) cluster.

        Port: Port number of the GaussDB(DWS) database, which is 8000 by default.

        Database Name: Name of the GaussDB(DWS) database to be migrated. In this example, the testdwsdb database created in Create a database and table in the GaussDB(DWS) cluster is used.

        Username: Username used for accessing the database. This user must have the permissions to read and write data tables and metadata. In this example, the default administrator dbadmin specified during GaussDB(DWS) cluster creation is used.

        Password: Password of the GaussDB(DWS) database user.

        Figure 2 Configuring the GaussDB(DWS) connection

        For other parameters, retain the default values. For details, see Link to Relational Databases. Click Save.

    2. Create a connection to DLI.
      1. Log in to the CDM console. In the navigation pane on the left, choose Cluster Management. On the displayed page, locate the created CDM cluster and click Job Management in the Operation column.
      2. On the Job Management page, click the Links tab, and click Create Link. On the displayed page, select Data Lake Insight and click Next.
        Figure 3 Selecting the DLI connector
      3. Create a connection to link CDM to DLI. For details about parameter settings, see Link to DLI.
        Figure 4 Selecting the DLI connector

        Click Save.

  2. Create a CDM migration job.
    1. Log in to the CDM console. In the navigation pane on the left, choose Cluster Management. On the displayed page, locate the created CDM cluster and click Job Management in the Operation column.
    2. On the Job Management page, click the Table/File Migration tab. On the displayed tab, click Create Job.
    3. On the Create Job page, set job parameters.
      Figure 5 Configuring the migration job
      1. Job Name: Name of the data migration job, for example, test.
      2. Set the parameters in the Source Job Configuration area as follows:
        Table 2 Source job parameters

        Source Link Name: Select the name of the GaussDB(DWS) data source created in 1.a.

        Use SQL Statement: If set to Yes, enter an SQL statement, and CDM exports data based on that statement. In this example, set it to No.

        Schema/Table Space: Name of the schema or tablespace from which data will be extracted. This parameter is available when Use SQL Statement is set to No. Click the icon next to the text box to select a schema or tablespace, or enter one directly. In this example, set this parameter to the default value public, as no schema is created in Create a database and table in the GaussDB(DWS) cluster. If no schemas or tablespaces are available, check whether the account has the permission to query metadata.

        NOTE: The value can contain the wildcard character (*), which lets you export all databases whose names start with, end with, or contain a specified string. For example:

        SCHEMA* indicates that all databases whose names start with SCHEMA are exported.

        *SCHEMA indicates that all databases whose names end with SCHEMA are exported.

        *SCHEMA* indicates that all databases whose names contain SCHEMA are exported.

        Table Name: Name of the table to be migrated. In this example, table1 created in Create a database and table in the GaussDB(DWS) cluster is used.

        For details about parameter settings, see From a Relational Database.

      3. Set the parameters in the Destination Job Configuration area as follows:
        Table 3 Destination job parameters

        Destination Link Name: Select the DLI data source connection.

        Resource Queue: Select the created DLI SQL queue.

        Database Name: Select the created DLI database. In this example, the testdb database created in Create a database and table on DLI is used.

        Table Name: Select the name of a table in the database. In this example, the tabletest table created in Create a database and table on DLI is used.

        Clear data before import: Whether to clear data in the destination table before the import. If set to Yes, data in the destination table is cleared before the job starts. In this example, set it to No.

        For details about parameter settings, see To DLI.

      4. Click Next. The Map Field page is displayed. CDM automatically matches the source and destination fields. In this example, the source fields id, a, b, and c should map to the destination fields id, name1, name2, and name3, respectively.
        • You can drag any unmatched fields to match them.
        • If the destination table is automatically created during the migration, you need to configure the type and name of each field.
        • CDM allows for field conversion during migration. For details, see Field Conversion.
          Figure 6 Field mapping
      5. Click Next and set task parameters. Typically, retain the default values for all parameters.

        In this step, you can configure the following optional features:

        • Retry Upon Failure: If the job fails to be executed, you can determine whether to automatically retry. Retain the default value Never.
        • Group: Select the group to which the job belongs. The default group is DEFAULT. On the Job Management page, jobs can be displayed, started, or exported by group.
        • Scheduled Execution: For how to configure scheduled execution, see Scheduling Job Execution. Retain the default value No.
        • Concurrent Extractors: Enter the number of extractors to be concurrently executed. Retain the default value 1.
        • Write Dirty Data: Set this parameter if data that fails to be processed or filtered out during job execution needs to be written to OBS. Before writing dirty data, create an OBS link. You can view the data on OBS later. Retain the default value No, meaning dirty data is not recorded.
      6. Click Save and Run. On the Job Management page, you can view the job execution progress and result.
        Figure 7 Job progress and execution result

Step 3: Query Results

After the migration job is complete, check whether the data in the GaussDB(DWS) table has been migrated to the DLI table tabletest. To do so, log in to the DLI management console and choose SQL Editor. On the displayed page, set Engine to Spark, Queues to the created SQL queue, and Databases to testdb (the database created in Create a database and table on DLI). Then, run the following query statement:
select * from tabletest;
Figure 8 Querying data in the table
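If the migration succeeded, the query returns the two rows originally inserted into table1. As an additional quick check, you can also count the migrated rows; the following statement is a minimal example and should return 2 in this scenario:
select count(*) from tabletest;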