
Error "ClassNotFoundException" Is Reported When a Spark Task Is Submitted

Updated on 2024-12-18 GMT+08:00

Symptom

When a Spark task is executed, an error message is displayed indicating that a class cannot be found (ClassNotFoundException).

The error message is as follows:

Exception encountered | org.apache.spark.internal.Logging$class.logError(Logging.scala:91)
org.apache.hadoop.hbase.DoNotRetryIOException: java.lang.ClassNotFoundException: org.apache.phoenix.filter.SingleCQKeyValueComparisonFilter

Cause Analysis

The executor classpath configured by the user is incorrect, so the executors cannot load the required class (in this example, the Phoenix filter class) when the task runs.

Procedure

  1. Log in to any Master node.
  2. Modify the configuration file in the Spark client directory.

    Run the vim Client installation directory/Spark/spark/conf/spark-defaults.conf command to open the spark-defaults.conf file, and set spark.executor.extraClassPath to ${PWD}/* (see the example after this procedure).

  3. Submit the task again.
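
For reference, after the modification the spark-defaults.conf file should contain a line similar to the following. The client installation directory /opt/client shown here is only an example; substitute your actual installation directory.

# Excerpt from /opt/client/Spark/spark/conf/spark-defaults.conf
spark.executor.extraClassPath = ${PWD}/*

With this setting, each executor adds all JAR files in its container working directory (which ${PWD} resolves to at run time) to its classpath, so the JAR containing the missing class can be found. Alternatively, the same value can usually be passed for a single submission without editing the file, for example:

spark-submit --conf spark.executor.extraClassPath='${PWD}/*' --class <main class> <application JAR>

The single quotes prevent the local shell from expanding ${PWD} before the task is submitted.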