
Creating a Foreign Table in a Directory (Read and Execute Permission Granted)

Scenario

When creating a Hive foreign table, the current user must normally be the owner of the directory specified as the table location. If the Hive parameter hive.restrict.create.grant.external.table is set to true, users and user groups that have the read and execute permissions on the directory can create Hive foreign tables there without a check on whether the user owns the directory. The location directory of a foreign table still cannot be the default warehouse directory, and the permissions of the directory must not be changed during foreign table authorization.
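
For example, before creating a foreign table, you can check the owner and permissions of the directory you plan to use as the table location (a minimal check; /user/test is the example directory used in the procedure below):

hdfs dfs -ls -d /user/test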

You can create or authorize a foreign table only if you have the required permissions on its location directory. This mechanism effectively prevents unauthorized users from accessing sensitive data paths. In production environments, especially when multiple users share a cluster, you are advised to use HDFS permission management to improve data security.
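
For instance, an administrator could use an HDFS ACL to grant a specific user read and execute permission on a directory and then verify it (a hedged sketch; the user test1 and directory /user/test match the examples in this section, and HDFS ACLs must be enabled in the cluster):

hdfs dfs -setfacl -m user:test1:r-x /user/test
hdfs dfs -getfacl /user/test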

Enabling this function significantly changes the behavior of foreign tables. Exercise caution when deciding whether to enable it.

Notes and Constraints

  • This section applies only to clusters with Kerberos authentication enabled.
  • This section applies only to the scenario where Ranger authentication is not enabled for Hive.

Procedure

  1. Log in to FusionInsight Manager and choose System > Permission > User. On the displayed page, create two users with the same permissions, for example, test and test1, and add both users to the hive and hadoop user groups.
  2. Log in to the node where the client is installed as the client installation user.

    For details about how to download and install the cluster client, see Installing an MRS Cluster Client.

  3. Go to the client installation directory, configure environment variables, and authenticate the user.

    1. Go to the client installation directory.
      cd /opt/hadoopclient
    2. Configure environment variables.
      source bigdata_env
    3. Authenticate the user. Skip this step if Kerberos authentication is disabled for the cluster (in normal mode).
      kinit Component service user

      Example:

      kinit test

  4. Create an HDFS directory for storing Hive foreign tables:

    hdfs dfs -mkdir /user/test

    View the HDFS directory.
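
    For example, you can list the parent directory to check the owner and permissions of the new directory (a minimal check; the path matches the directory created above):

    hdfs dfs -ls /user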

  5. Log in to the Hive client as the other user and create a Hive foreign table whose location is the HDFS directory created in 4.

    1. Authenticate the user. Skip this step if Kerberos authentication is disabled for the cluster (in normal mode).
      kinit Component service user

      Example:

      kinit test1
    2. Log in to the Hive client.
      beeline
    3. Create a Hive foreign table.
      create external table test(name string) location "hdfs://hacluster/user/test";

      After the command is executed, an error message is displayed, indicating that the user does not have the required permission, because test1 is not the owner of the /user/test directory and the function has not been enabled yet.

      Figure 1 Insufficient permissions reported

  6. Log in to FusionInsight Manager, choose Cluster > Services > Hive, click Configurations, and click All Configurations.
  7. Choose HiveServer(Role) > Customization, add a customized parameter to the hive-site.xml parameter file, set Name to hive.restrict.create.grant.external.table, and set Value to true.
  8. Choose MetaStore(Role) > Customization, add a customized parameter to the hivemetastore-site.xml parameter file, set Name to hive.restrict.create.grant.external.table, and set Value to true.
  9. Click Save to save the settings. Click Instances, select all Hive instances, choose More > Restart Instance, enter the user password, and click OK to restart all Hive instances.
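
    After the restart, you can optionally confirm that HiveServer has picked up the new value by querying the parameter in Beeline (a hedged verification; the set command simply prints the currently configured value):

    beeline
    set hive.restrict.create.grant.external.table;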
  10. Determine whether to enable this function on the Spark/Spark2x client.

    • If yes, download and install the Spark/Spark2x client again.
    • If no, no further action is required.

  11. Repeat 5 to create the Hive foreign table again. This time, the table is created successfully. You can run the following command to view the details of the Hive table, including its location and table type:

    desc formatted test;
    Figure 2 Viewing Hive table details

    Run the following command to create a Hive foreign table in the /user/hive/warehouse directory. An error message is displayed, indicating that the location directory of the foreign table cannot be in the default warehouse directory.

    create external table test1(name string) location "hdfs://hacluster/user/hive/warehouse";
    Figure 3 Table creation error
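
    If you want to remove the test objects afterwards, a possible cleanup is sketched below (hedged; the table and directory names come from this section, and dropping a foreign table does not delete the data in its location directory). Run the drop statement in Beeline, and then delete the directory from the HDFS client:

    drop table test;

    hdfs dfs -rm -r /user/test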