Updated on 2022-02-22 GMT+08:00

An Error Is Reported During Spark Running

Issue

The specified class cannot be found when a Spark job is running.

Symptom

The specified class cannot be found when a Spark job is running. The error message is as follows:

Exception encountered | org.apache.spark.internal.Logging$class.logError(Logging.scala:91)
org.apache.hadoop.hbase.DoNotRetryIOException: java.lang.ClassNotFoundException: org.apache.phoenix.filter.SingleCQKeyValueComparisonFilter

Cause Analysis

The classpath configured for Spark executors is incorrect: the JAR that contains org.apache.phoenix.filter.SingleCQKeyValueComparisonFilter is not on the executor classpath, so the class cannot be loaded at run time and the job fails with ClassNotFoundException.

Procedure

  1. Log in to any Master node.
  2. Modify the configuration file in the Spark client directory.

    Run the vim /opt/client/Spark/spark/conf/spark-defaults.conf command to open the spark-defaults.conf file, and set spark.executor.extraClassPath to ${PWD}/*. The value must be written literally: each executor expands ${PWD} to its own container working directory, where the JARs submitted with the job (including the Phoenix JAR) are localized, so the missing class becomes loadable.
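
The edit in step 2 amounts to adding or updating one property. A minimal sketch, assuming the client path from the procedure; it operates on a demo copy of the file so it can be run safely anywhere, and the ${PWD}/* value is kept literal (single-quoted) rather than expanded on the client:

```shell
SRC=/opt/client/Spark/spark/conf/spark-defaults.conf  # client path from the procedure
CONF=spark-defaults.conf.demo                         # demo copy for illustration

# Start from the real file if it exists, otherwise from an empty demo file.
if [ -f "$SRC" ]; then cp "$SRC" "$CONF"; else : > "$CONF"; fi

# Add the property, or update it if it is already present, keeping ${PWD}/*
# literal so each executor resolves it in its own container working directory.
if grep -q '^spark.executor.extraClassPath' "$CONF"; then
  sed -i 's|^spark.executor.extraClassPath.*|spark.executor.extraClassPath=${PWD}/*|' "$CONF"
else
  echo 'spark.executor.extraClassPath=${PWD}/*' >> "$CONF"
fi

# Show the resulting setting.
grep '^spark.executor.extraClassPath' "$CONF"
```

Running the sketch prints the resulting line, spark.executor.extraClassPath=${PWD}/*. The setting takes effect for jobs submitted after the file is saved.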