Interconnecting Hive with OBS
When creating a table, set the table location to an OBS path so that the table data is stored in OBS.
- Log in to the client installation node as the client installation user.
- Run the following command to initialize environment variables:
source ${client_home}/bigdata_env
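For example, if the client is installed in /opt/hadoopclient (a hypothetical path; replace it with your actual client installation directory), the command is:
source /opt/hadoopclient/bigdata_env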
- For a security cluster, run the following command to authenticate the user (the user must have the permission to perform Hive operations). If Kerberos authentication is not enabled for the current cluster, skip this step.
kinit User performing Hive operations
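For example, assuming a hypothetical user named hiveuser that has Hive operation permissions:
kinit hiveuser
Enter the user's password when prompted.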
- Log in to FusionInsight Manager and choose Cluster > Services > Hive > Configurations > All Configurations.
In the left navigation tree, choose Hive > Customization. In the customized configuration items, add dfs.namenode.acls.enabled to the hdfs.site.customized.configs parameter and set its value to false.
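For reference, this customized item injects the following property into the hdfs-site.xml configuration used by Hive (a minimal sketch of the resulting entry):
<property>
  <name>dfs.namenode.acls.enabled</name>
  <value>false</value>
</property>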
- Save the configurations and restart Hive.
- Log in to the beeline client and set Location to the OBS file system path when creating a table.
beeline
create table test(name string) location "obs://OBS parallel file system name/user/hive/warehouse/test";
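To confirm that the table data is stored in OBS, you can write a row, read it back, and check the table location; a quick check using the test table created above:
insert into table test values("obs_test");
select * from test;
desc formatted test;
The Location field in the desc formatted output should show the obs:// path specified when the table was created.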
If Ranger authentication is enabled, you need to add the user who performs operations on the component to a URL policy in Ranger. Set the URL to the complete path of the object on OBS and select the Read and Write permissions.
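For example, for the test table created above, the URL in the Ranger policy would be the table's full OBS path (the file system name below is a placeholder):
obs://OBS parallel file system name/user/hive/warehouse/test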