Interconnecting Hive with OBS
Before performing the following operations, ensure that you have configured a storage-compute decoupled cluster by referring to Configuring a Storage-Compute Decoupled Cluster (Agency) or Configuring a Storage-Compute Decoupled Cluster (AK/SK).
Setting the Table Location to an OBS Path When Creating a Table
- Log in to the client installation node as the client installation user.
- Run the following command to initialize environment variables:
source ${client_home}/bigdata_env
- For a cluster in security mode, run the following command to authenticate the user (the user must have permission to perform Hive operations). Skip this step if Kerberos authentication is not enabled for the cluster.
kinit User performing Hive operations
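For example, assuming the default client installation directory /opt/Bigdata/client and a hypothetical user named hiveuser, the two commands would be:
# hiveuser is a hypothetical username; replace it with your actual user
source /opt/Bigdata/client/bigdata_env
kinit hiveuser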
- Log in to FusionInsight Manager and choose Cluster > Services > Hive > Configurations > All Configurations.
In the left navigation tree, choose Hive > Customization. In the customized configuration items, add dfs.namenode.acls.enabled to the hdfs.site.customized.configs parameter and set its value to false.
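The customized item is delivered to the service as a standard HDFS configuration property. Conceptually, it corresponds to the following hdfs-site.xml entry (shown for illustration only; FusionInsight Manager generates the actual file):
<property>
  <!-- disables HDFS ACL support on the NameNode for the decoupled storage scenario -->
  <name>dfs.namenode.acls.enabled</name>
  <value>false</value>
</property>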
- Click Save. On the Dashboard tab, choose More > Restart Service. In the displayed Verify Identity dialog box, enter the password of the current user and click OK. In the Restart Service dialog box, select Restart upper-layer services and click OK to restart Hive.
- Log in to the beeline client and set Location to the OBS file system path when creating a table.
beeline
For example, run the following command to create the table test in obs://OBS parallel file system name/user/hive/warehouse/Database name/Table name:
create table test(name string) location "obs://OBS parallel file system name/user/hive/warehouse/Database name/Table name";
If Ranger authentication is enabled, add the user who performs Hive operations to the URL policy in Ranger, set the URL to the complete path of the object on OBS, and select the Read and Write permissions.
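For reference, the following is the same statement with hypothetical names filled in; mybucket and testdb are examples only and must be replaced with your actual parallel file system and database names:
-- mybucket and testdb are hypothetical names
create table test(name string) location "obs://mybucket/user/hive/warehouse/testdb/test";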
Setting the Default Location of the Created Hive Table to the OBS Path
- Log in to FusionInsight Manager and choose Cluster > Services > Hive > Configurations > All Configurations.
- In the left navigation tree, choose MetaStore > Customization. Add hive.metastore.warehouse.dir to the hive.metastore.customized.configs parameter and set it to the OBS path.
Figure 1 Configuration of hive.metastore.warehouse.dir for MetaStore
- In the left navigation tree, choose HiveServer > Customization. Add hive.metastore.warehouse.dir to the hive.metastore.customized.configs parameter and set it to the OBS path.
Figure 2 Configuration of hive.metastore.warehouse.dir for HiveServer
- Save the configurations and restart Hive.
- Update the client configuration file.
- Run the following command to open hivemetastore-site.xml in the Hive configuration file directory on the client:
vim /opt/Bigdata/client/Hive/config/hivemetastore-site.xml
- Change the value of hive.metastore.warehouse.dir to the corresponding OBS path.
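After the change, the property in hivemetastore-site.xml should look similar to the following (the OBS path shown is hypothetical; use your own parallel file system):
<property>
  <name>hive.metastore.warehouse.dir</name>
  <!-- hypothetical warehouse path on OBS -->
  <value>obs://mybucket/user/hive/warehouse</value>
</property>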
- Log in to the beeline client, create a table, and check whether the location is the OBS path.
beeline
create table test(name string);
desc formatted test;
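If the configuration has taken effect, the desc formatted test output contains a Location row pointing to the OBS warehouse path, similar to the following (the path shown is hypothetical):
Location: obs://mybucket/user/hive/warehouse/test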
If the database location points to HDFS, tables created in that database without an explicit location also point to HDFS. To change this default table creation behavior, change the database location to OBS by performing the following operations:
- Run the following command to query the location of the database:
show create database obs_test;
- Run the following command to change the database location:
alter database obs_test set location 'obs://OBS parallel file system name/user/hive/warehouse/Database name';
Run the show create database obs_test command to check whether the database location points to OBS.
- Run the following command to change the table location:
alter table user_info set location 'obs://OBS parallel file system name/user/hive/warehouse/Database name/Table name';
If the table already contains data, migrate the original data files to the new location, for example by using DistCp as sketched below.
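The following is a minimal migration sketch. It assumes the client can access both HDFS and OBS; hacluster is the default HDFS name service in MRS clusters, and mybucket and testdb are hypothetical names to be replaced with your own:
# hacluster, mybucket, and testdb are assumptions; substitute your actual paths
hadoop distcp hdfs://hacluster/user/hive/warehouse/testdb.db/user_info obs://mybucket/user/hive/warehouse/testdb.db/user_info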
- Run the show create table user_info command to check whether the table location points to OBS.