Updated on 2024-11-29 GMT+08:00

Using CDL from Scratch

Scenario

CDL supports data synchronization or comparison tasks in multiple scenarios. This section describes how to import data from PgSQL to Kafka on the CDLService web UI of a cluster with Kerberos authentication enabled.

Prerequisites

  • The CDL and Kafka services have been installed in a cluster and are running properly.
  • The write-ahead log policy for the PostgreSQL database has been modified by referring to Modifying the Write-Ahead Log Policy for the PostgreSQL Database.
  • You have created a human-machine user, for example, cdluser, added the user to user groups cdladmin (primary group), hadoop, kafka, and associated the user with the System_administrator role on FusionInsight Manager.
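The write-ahead log policy referenced above hinges on enabling logical decoding in PostgreSQL. A minimal sketch of the relevant postgresql.conf settings follows; the values are illustrative only, and Modifying the Write-Ahead Log Policy for the PostgreSQL Database remains the authoritative reference.

```conf
# postgresql.conf -- illustrative values only; follow the product guide
# for the authoritative write-ahead log policy settings.
wal_level = logical            # required for logical decoding (CDC capture)
max_replication_slots = 10     # must cover the replication slots CDL will use
max_wal_senders = 10           # must cover concurrent WAL sender processes
```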

Procedure

  1. Log in to FusionInsight Manager as user cdluser (change the password upon the first login) and choose Cluster > Services > CDL. On the Dashboard page, click the hyperlink next to CDLService UI to go to the native CDL page.
  2. Choose Link Management and click Add Link. In the displayed dialog box, set parameters to add the pgsql and kafka links by referring to the following tables. For details about the data link parameters, see Creating a Database Link.

    Table 1 PgSQL data link parameters

    Parameter    Example Value
    -----------  -------------------------
    Link Type    pgsql
    Name         pgsqllink
    Host         10.10.10.10
    Port         5432
    DB Name      testDB
    User         user
    Password     Password of the user user
    Description  -

    Table 2 Kafka data link parameters

    Parameter    Example Value
    -----------  -------------
    Link Type    kafka
    Name         kafkalink
    Description  -

  3. After the parameters are configured, click Test to check whether the data link is normal.

    After the test is successful, click OK.

  4. Choose Job Management > Data synchronization task and click Add Job. In the displayed dialog box, set the following parameters and click Next.

    Parameter  Example Value
    ---------  ----------------
    Name       job_pgsqltokafka
    Desc       xxx

  5. Configure PgSQL job parameters.

    1. On the Job Management page, drag the pgsql icon on the left to the editing area on the right and double-click the icon to go to the PgSQL job configuration page. Configure the job parameters by referring to Table 3. For details, see Creating a CDL Data Synchronization Job.

      Table 3 PgSQL job parameters

      Parameter              Example Value
      ---------------------  --------------------------
      Link                   pgsqllink
      Tasks Max              1
      Mode                   insert, update, and delete
      Schema                 public
      dbName Alias           cdc
      Slot Name              test_slot
      Slot Drop              No
      Connect With Hudi      No
      Use Exist Publication  Yes
      Kafka Message Format   CDL Json
      Publication Name       test

    2. Click the plus sign (+) to display more parameters.

      • WhiteList: Enter the name of the table in the database, for example, myclass.
      • Topic Table Mapping: Enter a table name in the first text box, for example, test. Enter a topic name in the second text box, for example, test_topic; the topic must correspond to the table entered in the first text box. Enter the data filtering time in the third text box.
    3. Click OK. The PgSQL job parameters are configured.
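    Because Use Exist Publication is set to Yes with Publication Name test, the publication must already exist in the source database before the job starts. A hedged sketch of the SQL a privileged PostgreSQL user might run follows; the table name myclass and publication name test mirror the examples above, so verify them against your environment.

    ```sql
    -- Create the publication the CDL job will reuse (the name must match
    -- the Publication Name job parameter, "test" in this example).
    CREATE PUBLICATION test FOR TABLE public.myclass;

    -- After the job starts, the replication slot named in Slot Name
    -- ("test_slot" in this example) should appear here:
    SELECT slot_name, plugin, active FROM pg_replication_slots;
    ```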

  6. Configure Kafka job parameters.

    1. On the Job Management page, drag the kafka icon on the left to the editing area on the right and double-click the icon to go to the Kafka job configuration page. Configure parameters based on Table 4.
      Table 4 Kafka job parameter

      Parameter  Example Value
      ---------  -------------
      Link       kafkalink

    2. Click OK.

  7. After the job parameters are configured, drag the two icons to associate the job parameters and click Save. The job configuration is complete.

  8. In the job list on the Job Management page, locate the created jobs, click Start in the Operation column, and wait until the jobs are started.

    Check whether the data transmission takes effect. For example, insert data into the table in the PgSQL database, and then go to the Kafka UI to check whether data is generated in the Kafka topic. For details, see Managing Topics on Kafka UI.
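As a quick sanity check of what arrives on the topic, the sketch below parses one change event. Note that this guide does not show the exact CDL Json field layout; the op/before/after envelope used here is an assumption (Debezium-style capture), so inspect a real message consumed from test_topic to confirm the actual schema.

```python
import json

# Hypothetical CDL Json change event for an INSERT into public.myclass.
# The op/before/after envelope is an assumption (Debezium-style capture);
# verify it against a real message consumed from test_topic.
sample_message = json.dumps({
    "op": "c",                      # c = create/insert, u = update, d = delete
    "before": None,
    "after": {"id": 1, "name": "alice"},
    "source": {"schema": "public", "table": "myclass"},
})

def describe_change(raw: str) -> str:
    """Return a one-line summary of a change event."""
    event = json.loads(raw)
    ops = {"c": "INSERT", "u": "UPDATE", "d": "DELETE"}
    action = ops.get(event["op"], event["op"])
    src = event["source"]
    return f"{action} on {src['schema']}.{src['table']}: {event['after']}"

print(describe_change(sample_message))
# prints: INSERT on public.myclass: {'id': 1, 'name': 'alice'}
```

A real consumer would read `raw` from the Kafka topic (for example, with a Kafka client library) instead of using the inline sample.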