An Error Is Reported When a Spark Job Is Running
Issue
The specified class cannot be found when a Spark job is running.
Symptom
The specified class cannot be found when a Spark job is running. The error message is as follows:
Exception encountered | org.apache.spark.internal.Logging$class.logError(Logging.scala:91) org.apache.hadoop.hbase.DoNotRetryIOException: java.lang.ClassNotFoundException: org.apache.phoenix.filter.SingleCQKeyValueComparisonFilter
Cause Analysis
The executor classpath configured by the user (spark.executor.extraClassPath) is incorrect, so the executor cannot load the required class (in this case, the Phoenix filter class).
Procedure
- Log in to any Master node.
- Modify the configuration file in the Spark client directory.
Run the vim /opt/client/Spark/spark/conf/spark-defaults.conf command to open the spark-defaults.conf file, and set spark.executor.extraClassPath to ${PWD}/* so that each executor loads JAR files from its own working directory.
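The edit above can also be made non-interactively. The sketch below is a minimal demonstration on a temporary copy of the file; on a real Master node the file is /opt/client/Spark/spark/conf/spark-defaults.conf as described in the procedure, and the sample pre-existing entries are hypothetical.

```shell
# Work on a temporary copy for demonstration; substitute the real
# client path (/opt/client/Spark/spark/conf/spark-defaults.conf) on a Master node.
CONF="$(mktemp)"

# Hypothetical existing contents, including an incorrect classpath entry.
printf 'spark.driver.memory=2g\nspark.executor.extraClassPath=/wrong/path/*\n' > "$CONF"

# Remove any existing spark.executor.extraClassPath setting,
# then append the corrected value. ${PWD}/* is written inside single
# quotes so that Spark, not this shell, expands it at runtime.
sed -i '/^spark\.executor\.extraClassPath/d' "$CONF"
echo 'spark.executor.extraClassPath=${PWD}/*' >> "$CONF"

# Show the resulting setting.
grep '^spark.executor.extraClassPath' "$CONF"
```

After saving the change on the actual node, resubmit the Spark job so that executors pick up the new classpath.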