Commissioning a Hive HCatalog Application on Linux
- Run the mvn package command to generate a JAR file, for example, hive-examples-1.0.jar, and obtain it from the target directory in the project directory.
- Upload the hive-examples-1.0.jar file generated in the previous step to a directory on the Linux node, for example, /opt/hive_examples (referred to as $HCAT_CLIENT below), and ensure that the Hive client has been installed on that node.
export HCAT_CLIENT=/opt/hive_examples/
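The build and upload steps can be run from the development machine's shell; a minimal sketch, assuming the Linux node is reachable as root@linux-node (a placeholder host and user):
# Build the example project; the JAR is written to the target directory.
mvn clean package
# Copy the JAR to the client node; adjust the host, user, and path as needed.
scp target/hive-examples-1.0.jar root@linux-node:/opt/hive_examples/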
- Run the following commands to configure environment variables (the client installation path /opt/client is used as an example):
export HADOOP_HOME=/opt/client/HDFS/hadoop
export HIVE_HOME=/opt/client/Hive/Beeline
export HCAT_HOME=$HIVE_HOME/../HCatalog
export LIB_JARS=$HCAT_HOME/lib/hive-hcatalog-core-1.3.0.jar,$HCAT_HOME/lib/hive-metastore-1.3.0.jar,$HIVE_HOME/lib/hive-exec-1.3.0.jar,$HCAT_HOME/lib/libfb303-0.9.3.jar,$HCAT_HOME/lib/slf4j-api-1.7.5.jar,$HCAT_HOME/lib/antlr-2.7.7.jar,$HCAT_HOME/lib/jdo-api-3.0.1.jar,$HCAT_HOME/lib/antlr-runtime-3.4.jar,$HCAT_HOME/lib/datanucleus-api-jdo-3.2.6.jar,$HCAT_HOME/lib/datanucleus-core-3.2.10.jar,$HCAT_HOME/lib/datanucleus-rdbms-3.2.9.jar
export HADOOP_CLASSPATH=$HCAT_HOME/lib/hive-hcatalog-core-1.3.0.jar:$HCAT_HOME/lib/hive-metastore-1.3.0.jar:$HIVE_HOME/lib/hive-exec-1.3.0.jar:$HCAT_HOME/lib/libfb303-0.9.3.jar:$HADOOP_HOME/etc/hadoop:$HCAT_HOME/conf:$HCAT_HOME/lib/slf4j-api-1.7.5.jar:$HCAT_HOME/lib/antlr-2.7.7.jar:$HCAT_HOME/lib/jdo-api-3.0.1.jar:$HCAT_HOME/lib/antlr-runtime-3.4.jar:$HCAT_HOME/lib/datanucleus-api-jdo-3.2.6.jar:$HCAT_HOME/lib/datanucleus-core-3.2.10.jar:$HCAT_HOME/lib/datanucleus-rdbms-3.2.9.jar
Before importing the preceding environment variables, verify that each listed JAR file exists on the client; the version numbers may differ by release and can be obtained from the Hive lib directories on the client.
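For example, the following commands list the versions actually shipped with the client, so the file names above can be adjusted if necessary:
# List the Hive and HCatalog JARs whose versions appear in LIB_JARS and HADOOP_CLASSPATH.
ls $HIVE_HOME/lib | grep hive-exec
ls $HCAT_HOME/lib | grep -E 'hive-hcatalog-core|hive-metastore|datanucleus'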
- Prepare to run the client.
- If Kerberos authentication is enabled for the current cluster, run the following command to authenticate the user; if it is disabled, skip this step. The current user is the development user added in Preparing a Hive Application Development User.
Human-machine user: kinit <MRS cluster user>
For example: kinit hiveuser
Machine-machine user: kinit -kt <user.keytab path> <MRS cluster user>
For example: kinit -kt /opt/hive_examples/conf/user.keytab hiveuser
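After running kinit, you can confirm that a valid ticket was obtained:
# Show the cached Kerberos ticket for the authenticated user.
klist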
When connecting to a security cluster, add the following parameter to the HCatalog configuration file on the Hive client (for example, /opt/client/Hive/HCatalog/conf/hive-site.xml):
<property>
  <name>hive.metastore.sasl.enabled</name>
  <value>true</value>
</property>
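To confirm that the parameter is present in the client configuration, you can check the file directly, for example:
grep -A 1 'hive.metastore.sasl.enabled' /opt/client/Hive/HCatalog/conf/hive-site.xml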
- Use the Hive client to create source table t1 in beeline: create table t1(col1 int);
Run insert into t1(col1) values(X); to insert data into t1, where X is the value to insert, until t1 contains the following data:
+----------+--+
| t1.col1  |
+----------+--+
| 1        |
| 1        |
| 1        |
| 2        |
| 2        |
| 3        |
+----------+--+
- Create destination table t2: create table t2(col1 int,col2 int);
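The table preparation above can also be scripted with beeline's -e option; a minimal sketch (connection options omitted), whose insert values produce exactly the sample data shown for t1:
# Create the source table, load the sample rows, and create the destination table.
beeline -e "create table t1(col1 int);"
beeline -e "insert into t1(col1) values(1),(1),(1),(2),(2),(3);"
beeline -e "create table t2(col1 int, col2 int);"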
- If Kerberos authentication is enabled for the current cluster, authenticate the user again before submitting the task (see the kinit commands above); if it is disabled, skip this step.
- Use the YARN client to submit a task.
yarn --config $HADOOP_HOME/etc/hadoop jar $HCAT_CLIENT/hive-examples-1.0.jar com.huawei.bigdata.hive.example.HCatalogExample -libjars $LIB_JARS t1 t2
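While the job runs, the standard YARN CLI can be used to track it, for example (the application ID is a placeholder taken from the submit output):
# List running applications, then fetch the logs of a finished application.
yarn application -list
yarn logs -applicationId <application ID>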
- View the running result. The data in t2 is as follows:
0: jdbc:hive2://192.168.1.18:24002,192.168.1.> select * from t2;
+----------+----------+--+
| t2.col1  | t2.col2  |
+----------+----------+--+
| 1        | 3        |
| 2        | 2        |
| 3        | 1        |
+----------+----------+--+