
An Error Is Reported When Spark Is Used

Issue

When Spark is used, the job fails to run on the cluster.

Symptom

When Spark is used, the job fails to run on the cluster.

Cause Analysis

  • Invalid characters are introduced when the command is entered (see the check sketched after this list).
  • The owner and owner group of the uploaded JAR file are incorrect.
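One way to confirm the first cause is to look for non-ASCII characters in the submitted command. The following is a minimal sketch, not part of the original procedure; it assumes GNU grep is available and uses the hypothetical file name /tmp/cmd.txt.

    # Paste the exact command that was executed into a temporary file
    # (/tmp/cmd.txt is a hypothetical name), then press Ctrl+D.
    cat > /tmp/cmd.txt
    # Flag any non-ASCII characters, a common source of invalid-character
    # failures after copy and paste.
    grep -nP '[^\x00-\x7F]' /tmp/cmd.txt && echo "Invalid characters found"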

Procedure

  1. Run ./bin/spark-submit --class cn.interf.Test --master yarn-client Client installation directory/Spark/spark1-1.0-SNAPSHOT.jar and check whether the command contains invalid characters.
  2. If invalid characters are found, correct them and run the command again.
  3. If other errors occur after the command is run again, check the owner and owner group of the JAR file. In this case, both are root.
  4. Change the owner and owner group of the JAR file to omm:wheel, as shown in the sketch after this list.
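The following sketch combines steps 1, 3, and 4. It assumes the shell is already in the Spark client directory that contains bin/spark-submit, and CLIENT_HOME (/opt/client here) is a hypothetical placeholder for the actual client installation directory.

    # CLIENT_HOME is a hypothetical placeholder; replace it with the actual
    # client installation directory.
    CLIENT_HOME=/opt/client
    JAR="$CLIENT_HOME/Spark/spark1-1.0-SNAPSHOT.jar"

    # Check the current owner and owner group of the uploaded JAR file (step 3).
    ls -l "$JAR"

    # Change the owner and owner group to omm:wheel (step 4). This usually
    # requires root privileges.
    chown omm:wheel "$JAR"

    # Retype and rerun the submission command so that no invalid characters
    # are carried over from copy and paste (steps 1 and 2).
    ./bin/spark-submit --class cn.interf.Test --master yarn-client "$JAR"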