Updated on 2023-01-11 GMT+08:00

Configuring a Traditional Data Source

Scenario

This section describes how to add a Hive data source on HSConsole.

Currently, HetuEngine supports data sources in the following traditional data formats: AVRO, TEXT, RCTEXT, Binary, ORC, Parquet, and SequenceFile.

Prerequisites

  • The domain name of the cluster where the data source is located must be different from the HetuEngine cluster domain name.
  • The nodes of the cluster where the data source is located and the nodes of the HetuEngine cluster can communicate with each other.
  • In the /etc/hosts file of all nodes in the cluster where HetuEngine is located, add the mappings between the host names and IP addresses of the cluster where the data source to be connected is located, and also add the mapping 10.10.10.10 hadoop.<system domain name> (for example, 10.10.10.10 hadoop.hadoop.com); see the example command after this list. Otherwise, HetuEngine cannot use host names to connect to nodes outside its own cluster.
  • A HetuEngine compute instance has been created.
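
A minimal sketch of adding such a mapping on a HetuEngine node, assuming 10.10.10.10 and hadoop.hadoop.com are the IP address and host name from the example above (replace them with the values of your environment):

  echo "10.10.10.10 hadoop.hadoop.com" >> /etc/hosts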

Procedure

  1. Obtain the hdfs-site.xml and core-site.xml configuration files of the Hive data source cluster.

    1. Log in to FusionInsight Manager of the cluster where the Hive data source is located.
    2. Choose Cluster > Dashboard.
    3. Choose More > Download Client and download the client file to the local computer.
    4. Decompress the downloaded client file package and obtain the core-site.xml and hdfs-site.xml files in the FusionInsight_Cluster_1_Services_ClientConfig/HDFS/config directory.
    5. Check whether the core-site.xml file contains the fs.trash.interval configuration item. If it does not, add the following configuration item (the check commands at the end of this step show one way to verify this):
      <property>
        <name>fs.trash.interval</name>
        <value>2880</value>
      </property>
    6. In the hdfs-site.xml file, change the value of dfs.client.failover.proxy.provider.<NameService name> to org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider.

      For example, if the NameService name is hacluster, the configuration is as follows:

      <property>
        <name>dfs.client.failover.proxy.provider.hacluster</name>
        <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
      </property>

      If the Hive data source to be interconnected is in the same Hadoop cluster as HetuEngine, you can log in to the HDFS client and run the following commands to obtain the hdfs-site.xml and core-site.xml configuration files. For details, see Using the HDFS Client.

      hdfs dfs -get /user/hetuserver/fiber/restcatalog/hive/core-site.xml

      hdfs dfs -get /user/hetuserver/fiber/restcatalog/hive/hdfs-site.xml
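
      The following commands are one way to check the two settings above after decompressing the client package; the directory name FusionInsight_Cluster_1_Services_ClientConfig and the NameService name hacluster are taken from the examples in this step and may differ in your environment:

      cd FusionInsight_Cluster_1_Services_ClientConfig/HDFS/config
      # No output means fs.trash.interval is missing and must be added as shown in 5.
      grep -A1 "fs.trash.interval" core-site.xml
      # The value should be org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider.
      grep -A1 "dfs.client.failover.proxy.provider.hacluster" hdfs-site.xml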

  2. Obtain the user.keytab and krb5.conf files of the proxy user of the Hive data source.

    1. Log in to FusionInsight Manager of the cluster where the Hive data source is located.
    2. Choose System > Permission > User.
    3. Locate the row that contains the target data source user, click More in the Operation column, and select Download Authentication Credential.
    4. Decompress the downloaded package to obtain the user.keytab and krb5.conf files.

      The proxy user of the Hive data source must be associated with at least the hive user group.
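
      Optionally, after decompressing the downloaded package, you can confirm which principal the user.keytab file belongs to. The archive name below is only a placeholder (this sketch assumes a .tar download); use the name of the package you actually downloaded:

      tar -xvf <downloaded credential package>.tar
      klist -kt user.keytab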

  3. Obtain the MetaStore URL and the Principal of the server.

    1. Decompress the client package of the cluster where the Hive data source is located and obtain the hive-site.xml file from the FusionInsight_Cluster_1_Services_ClientConfig/Hive/config directory.
    2. Open the hive-site.xml file and search for hive.metastore.uris; its value is the MetaStore URL. Then search for hive.server2.authentication.kerberos.principal; its value is the Principal of the server.
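
      For example, the following commands print both values from the decompressed client package; the directory name comes from the example above:

      cd FusionInsight_Cluster_1_Services_ClientConfig/Hive/config
      # MetaStore URL
      grep -A1 "hive.metastore.uris" hive-site.xml
      # Server principal
      grep -A1 "hive.server2.authentication.kerberos.principal" hive-site.xml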

  4. Log in to FusionInsight Manager as a HetuEngine administrator and choose Cluster > Services > HetuEngine. The HetuEngine service page is displayed.
  5. In the Basic Information area on the Dashboard page, click the link next to HSConsole WebUI. The HSConsole page is displayed.
  6. Choose Data Source and click Add Data Source. Configure parameters on the Add Data Source page.

    1. Configure parameters in the Basic Information area. For details, see Table 1.
      Table 1 Basic Information

      • Name: Name of the data source to be connected. The value can contain only letters, digits, and underscores (_) and must start with a letter. Example value: hive_1
      • Data Source Type: Type of the data source to be connected. Select Hive. Example value: Hive
      • Mode: Mode of the current cluster. The default value is Security Mode.
      • Description: Description of the data source. The value can contain only letters, digits, commas (,), periods (.), underscores (_), spaces, and line breaks.

    2. Configure parameters in the Hive Configuration area. For details, see Table 2.
      Table 2 Hive Configuration

      • Driver: The default value is fi-hive-hadoop. Example value: fi-hive-hadoop
      • hdfs-site File: Select the hdfs-site.xml configuration file obtained in 1. The file name is fixed.
      • core-site File: Select the core-site.xml configuration file obtained in 1. The file name is fixed.
      • krb5 File: Configure this parameter when the security mode is enabled. It is the configuration file used for Kerberos authentication. Select the krb5.conf file obtained in 2. Example value: krb5.conf
      • Enable Data Source Authentication: Whether to use the permission policy of the Hive data source for authentication. If Ranger is disabled for the HetuEngine service, select Yes. If Ranger is enabled, select No. Example value: No

    3. Configure parameters in the MetaStore Configuration area. For details, see Table 3.
      Table 3 MetaStore Configuration

      • Metastore URL: URL of the MetaStore of the data source. For details, see 3. Example value: thrift://10.92.8.42:21088,thrift://10.92.8.43:21088,thrift://10.92.8.44:21088
      • Security Authentication Mechanism: After the security mode is enabled, the default value is KERBEROS. Example value: KERBEROS
      • Server Principal: Configure this parameter when the security mode is enabled. It specifies the username with domain name used by meta to access MetaStore. For details, see 3. Example value: hive/hadoop.hadoop.com@HADOOP.COM
      • Client Principal: Configure this parameter when the security mode is enabled. The parameter format is as follows: Username for accessing MetaStore@domain name (uppercase).COM. Username for accessing MetaStore is the user to which the user.keytab file obtained in 2 belongs. Example value: admintest@HADOOP.COM
      • Keytab File: Configure this parameter when the security mode is enabled. It specifies the keytab credential file of the MetaStore user name. The file name is fixed. Select the user.keytab file obtained in 2. Example value: user.keytab

    4. Configure parameters in the Connection Pool Configuration area. For details, see Table 4.
      Table 4 Connection Pool Configuration

      • Enable Connection Pool: Whether to enable the connection pool when accessing Hive MetaStore. Example value: Yes/No
      • Maximum Connections: Maximum number of connections in the connection pool when accessing Hive MetaStore. Example value: 50

    5. Configure parameters in Hive User Information Configuration. For details, see Table 5.
      Hive User Information Configuration and HetuEngine-Hive User Mapping Configuration must be used together. When HetuEngine connects to the Hive data source, the user mapping gives a HetuEngine user the same permissions as the Hive data source user it is mapped to. Multiple HetuEngine users can be mapped to one Hive user.
      Table 5 Hive User Information Configuration

      • Data Source User: Data source user information. The value can contain only letters, digits, underscores (_), hyphens (-), and periods (.), and must start with a letter or underscore (_). The minimum length is 2 characters and the maximum length is 100 characters. If the data source user is set to hiveuser1, a HetuEngine user mapped to hiveuser1 must exist; for example, create hetuuser1 and map it to hiveuser1.
      • Keytab File: Obtain the authentication credential of the user corresponding to the data source. Example value: hiveuser1.keytab

    6. Configure parameters in the HetuEngine-Hive User Mapping Configuration area. For details, see Table 6.
      Table 6 HetuEngine-Hive User Mapping Configuration

      • HetuEngine User: HetuEngine user information. The value can contain only letters, digits, underscores (_), hyphens (-), and periods (.), and must start with a letter or underscore (_). The minimum length is 2 characters and the maximum length is 100 characters. Example value: hetuuser1
      • Data Source User: Data source user information. The value can contain only letters, digits, underscores (_), hyphens (-), and periods (.), and must start with a letter or underscore (_). The minimum length is 2 characters and the maximum length is 100 characters. Example value: hiveuser1 (the data source user configured in Table 5)

    7. Modify custom configurations.
      • You can click Add to add custom configuration parameters by referring to Table 7.
        Table 7 Custom parameters

        • hive.metastore.connection.pool.maxTotal: Maximum number of connections in the connection pool. Example value: 50 (value range: 0-200)
        • hive.metastore.connection.pool.maxIdle: Maximum number of idle threads in the connection pool. When the number of idle threads reaches the maximum number, new threads are not released. Default value: 10. Example value: 10 (the value ranges from 0 to 200 and cannot exceed the maximum number of connections)
        • hive.metastore.connection.pool.minIdle: Minimum number of idle threads in the connection pool. When the number of idle threads reaches the minimum number, the thread pool does not create new threads. Default value: 10. Example value: 10 (the value ranges from 0 to 200 and cannot exceed the value of hive.metastore.connection.pool.maxIdle)

      • You can click Delete to delete custom configuration parameters.
        • You can add the prefixes coordinator. and worker. to the preceding custom configuration items to configure coordinators and workers separately. For example, if worker.hive.metastore.connection.pool.maxTotal is set to 50, a maximum of 50 connections is allowed for workers to access Hive MetaStore. If no prefix is added, the configuration item applies to both coordinators and workers.
        • By default, the maximum number of connections for coordinators to access Hive MetaStore is 5, and the maximum and minimum numbers of idle data source connections are both 10. For workers, the maximum number of connections is 20, and the maximum and minimum numbers of idle data source connections are both 0.
    8. Click OK.

  7. Log in to the node where the cluster client is located and run the following commands to switch to the client installation directory and authenticate the user:

    cd /opt/client

    source bigdata_env

    kinit <user performing HetuEngine operations> (Skip this command if the cluster is in normal mode.)
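
    For example, assuming hetuuser1 (the example HetuEngine user in Table 6) is the user performing HetuEngine operations:

    kinit hetuuser1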

  8. Run the following command to log in to the catalog of the data source:

    hetu-cli --catalog Data source name --schema default

    For example, run the following command:

    hetu-cli --catalog hive_1 --schema default

  9. Run the following command to view the database table:

    show tables;
      Table  
    ---------
     hivetb   
    (1 rows)
    
    Query 20210730_084524_00023_u3sri@default@HetuEngine, FINISHED, 3 nodes
    Splits: 36 total, 36 done (100.00%)
    0:00 [2 rows, 47B] [7 rows/s, 167B/s]
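
    To further confirm that data can be read through the new catalog, you can query the table listed above. hivetb is only the example table from the output shown here; substitute one of your own tables:

    select * from hivetb limit 10;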