Why Are Some Functions Not Available When Another JDBCServer Is Connected?
Question
Scenario 1
I loaded the JAR files for my functions using the add jar statement. After Beeline connects to a different JDBCServer, or after the JDBCServer restarts, I have to run the add jar statement again.
Scenario 2
The show functions statement lists a function, but the function cannot actually be used, because the connected JDBCServer node does not contain the JAR file on the corresponding path. After I add the corresponding JAR file, the function works as expected.
Answer
Scenario 1
The add jar statement loads a JAR file only into the jarClassLoader of the currently connected JDBCServer. JARs added this way are not shared across JDBCServer instances. In addition, a new jarClassLoader is created when the JDBCServer restarts, so the add jar statement must be run again after a restart.
There are two ways to add a JAR file: pass it when starting the Spark SQL process, for example spark-sql --jars /opt/test/two_udfs.jar, or run add jar /opt/test/two_udfs.jar after the Spark SQL process has started. The path in the add jar statement can be either a local path or an HDFS path.
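The two methods above can be sketched as follows. This is a minimal illustration assuming a cluster where /opt/test/two_udfs.jar exists locally and the same JAR has been uploaded to HDFS; the HDFS path and host names are placeholders, not values from this document.

```shell
# Method 1: supply the JAR when the Spark SQL process starts.
# Every session of this process can then see classes in the JAR.
spark-sql --jars /opt/test/two_udfs.jar

# Method 2: add the JAR after the process has started,
# from within an already running spark-sql or Beeline session.
# A local path works only on the node where the server runs:
#   ADD JAR /opt/test/two_udfs.jar;
# An HDFS path is reachable from any JDBCServer node (placeholder URI):
#   ADD JAR hdfs://hacluster/tmp/two_udfs.jar;
```

Because an add jar issued in one session affects only the JDBCServer it is connected to, the HDFS-path form is generally more convenient in multi-instance deployments: any instance can fetch the JAR from the shared file system when the statement is replayed.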
Scenario 2
The show functions statement obtains the list of all functions registered in the current database from the external catalog. When a function is actually used in SQL, the Thrift JDBC server loads the JAR files associated with that function.
If those JAR files do not exist on the connected node, the function cannot be resolved, even though it appears in the show functions output. Therefore, the corresponding JAR files must be added first.
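The behavior above can be reproduced with a permanent function whose JAR is stored on HDFS, so that any JDBCServer instance can load it on demand. This is a hedged sketch: the database, function name, implementing class, and HDFS path below are hypothetical placeholders.

```sql
-- Register a permanent function whose JAR lives on HDFS.
-- Spark records the JAR location in the metastore, so any
-- JDBCServer that later resolves the function can fetch it.
CREATE FUNCTION mydb.str_len
  AS 'com.example.udf.StrLen'          -- hypothetical UDF class
  USING JAR 'hdfs://hacluster/udf/str_len.jar';  -- placeholder path

-- The catalog lists the function on every JDBCServer:
SHOW FUNCTIONS LIKE 'mydb.str_len';

-- Invoking it triggers the JAR load; this fails only if the
-- recorded JAR path is unreachable from the connected node.
SELECT mydb.str_len('hello');
```

Registering the function with an HDFS JAR path, rather than relying on a session-level add jar with a local path, avoids both symptoms described in this FAQ: the registration is stored in the shared catalog, and the JAR is reachable from every node.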