Support for Third-party JAR Packages on x86 and TaiShan Platforms
Question
How do I enable Spark2x to support third-party JAR packages (for example, custom UDF packages) when these packages have two versions, one for x86 and one for TaiShan?
Answer
If the third-party JAR packages (for example, custom UDF packages) have two versions (x86 and TaiShan), use the following hybrid solution:
1. Log in to any SparkResource node in the cluster (the cluster may have multiple SparkResource nodes) and go to the installation directory of the Spark2x SparkResource.
2. Prepare the JAR packages, for example, an xx.jar package for each of the x86 and TaiShan platforms. In the current directory, create two folders named x86 and arm, and copy the x86 and TaiShan versions of xx.jar to them respectively, as shown in the sketch below.
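For example, assuming the two builds of the package are available locally as xx-x86.jar and xx-arm.jar (hypothetical names and paths), the folders can be prepared as follows:
# Create one folder per platform in the current directory.
mkdir -p x86 arm
# Copy each platform's build into its folder; both copies keep the
# same file name (xx.jar) so the two archives differ only in content.
cp /path/to/xx-x86.jar x86/xx.jar
cp /path/to/xx-arm.jar arm/xx.jar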
3. Run the following commands in the current directory to compress the JAR packages:
zip -qDj spark-archive-2x-x86.zip x86/*
zip -qDj spark-archive-2x-arm.zip arm/*
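Optionally, verify that each archive contains the expected package before uploading, for example:
# List the contents of each archive to confirm xx.jar is included.
unzip -l spark-archive-2x-x86.zip
unzip -l spark-archive-2x-arm.zip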
4. Run the following command to check the JAR packages on HDFS on which Spark2x depends:
hdfs dfs -ls /user/spark2x/jars/8.1.0.1
Change the version number 8.1.0.1 as required.
Run the following commands to move the existing package files out of this directory to another HDFS directory, for example, /tmp:
hdfs dfs -mv /user/spark2x/jars/8.1.0.1/spark-archive-2x-arm.zip /tmp
hdfs dfs -mv /user/spark2x/jars/8.1.0.1/spark-archive-2x-x86.zip /tmp
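To confirm the move, you can list both locations, for example:
# The version directory should no longer contain the two archives,
# and /tmp should now contain them.
hdfs dfs -ls /user/spark2x/jars/8.1.0.1
hdfs dfs -ls /tmp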
5. Run the following commands to upload the spark-archive-2x-arm.zip and spark-archive-2x-x86.zip packages created in 3 to the /user/spark2x/jars/8.1.0.1 directory of HDFS:
hdfs dfs -put spark-archive-2x-arm.zip /user/spark2x/jars/8.1.0.1/
hdfs dfs -put spark-archive-2x-x86.zip /user/spark2x/jars/8.1.0.1/
After the upload is complete, delete the local files spark-archive-2x-arm.zip and spark-archive-2x-x86.zip.
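A sample cleanup command, run in the directory where the archives were created, could be:
# Remove the local copies now that the archives are stored on HDFS.
rm -f spark-archive-2x-arm.zip spark-archive-2x-x86.zip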
6. Repeat 1 to 2 on the other SparkResource nodes.
7. Log in to the web UI and restart the JDBCServer instance of Spark2x.
8. After the restart, update the client configuration. Based on the client server type (x86 or TaiShan), copy the xx.jar of the corresponding platform to the Spark2x installation directory ${install_home}/Spark2x/spark/jars on the client, where ${install_home} is the actual client installation path. For example, if the client is installed in /opt/hadoopclient, copy the corresponding xx.jar to the /opt/hadoopclient/Spark2x/spark/jars folder.
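As a sketch, assuming the client is installed in /opt/hadoopclient and using the hypothetical package names from 2, the copy can be driven by the client server's architecture:
# uname -m prints x86_64 on x86 servers and aarch64 on TaiShan (Arm) servers.
if [ "$(uname -m)" = "aarch64" ]; then
    cp /path/to/xx-arm.jar /opt/hadoopclient/Spark2x/spark/jars/xx.jar
else
    cp /path/to/xx-x86.jar /opt/hadoopclient/Spark2x/spark/jars/xx.jar
fi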