Updated on 2024-10-11 GMT+08:00

Class Cannot Be Found After Flume Submits Jobs to Spark Streaming

Issue

After Flume submits jobs to Spark Streaming, the class cannot be found.

Symptom

After the Spark Streaming code is packed into a JAR file and submitted to the cluster, an error message is displayed indicating that the class cannot be found. Neither of the following methods resolves the issue:

  1. When submitting the Spark job, use the --jars option to reference the JAR file containing the class.
  2. Package the JAR file containing the class into the Spark Streaming JAR file.
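For reference, a submission using the first method might look like the sketch below. The class name, paths, and version number are illustrative placeholders, not values from the original report; even with --jars set this way, the job can still fail with java.lang.ClassNotFoundException at runtime.

```shell
# Hypothetical example: submitting a Spark Streaming job that consumes
# events from Flume. All paths and the class name are placeholders.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.FlumeStreamingJob \
  --jars /opt/libs/flume-ng-sdk-1.9.0.jar \
  streaming-job.jar
```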

Cause Analysis

Some JAR files cannot be loaded during Spark job execution, so the class cannot be found.

Procedure

  1. Use the --jars option to load the flume-ng-sdk-{version}.jar dependency package.
  2. Modify the following two configuration items in the spark-defaults.conf file:

    spark.driver.extraClassPath=$PWD/*:{original value}

    spark.executor.extraClassPath=$PWD/*

  3. Submit the job again. If a class-not-found error is still reported, identify which JAR file failed to load and repeat steps 1 and 2 for it.
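The steps above can be sketched end to end as follows. The class name, JAR paths, and version number are placeholders, and {original value} stands for whatever spark.driver.extraClassPath was set to before the change:

```shell
# Step 2: in spark-defaults.conf, prepend $PWD/* so JARs placed in the
# container's working directory are added to the class path:
#
#   spark.driver.extraClassPath=$PWD/*:{original value}
#   spark.executor.extraClassPath=$PWD/*

# Step 1 and 3: resubmit the job, shipping the Flume SDK with --jars so it
# lands in each container's working directory (placeholder paths/version):
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.FlumeStreamingJob \
  --jars /opt/libs/flume-ng-sdk-1.9.0.jar \
  streaming-job.jar
```

With $PWD/* on the extraClassPath, any JAR distributed via --jars is picked up from the container's working directory on both the driver and the executors, which is why the two settings work together.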