Help Center > MapReduce Service > Troubleshooting > Using Spark > Job Status Is error After a Spark Job Is Submitted Through an API
Updated on 2023-11-30 GMT+08:00

Job Status Is error After a Spark Job Is Submitted Through an API

Issue

After a Spark job is submitted using an API, the job status is displayed as error.

Symptom

After the log level in /opt/client/Spark/spark/conf/log4j.properties is changed and a job is submitted using API V1.1, the job status is displayed as error.

Cause Analysis

The executor determines the job execution result by monitoring the job's log output. Because the log level was changed, the expected result output is no longer printed to the log, so the executor cannot detect the execution result and marks the job status as error once the job times out.

Procedure

Restore the log level in the /opt/client/Spark/spark/conf/log4j.properties file to INFO.
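For reference, a minimal log4j.properties fragment with the root logger restored to INFO, based on Spark's default log4j.properties.template, looks like the following (appender names in your client installation may differ, so adjust only the level rather than replacing the whole file):

```properties
# Log everything at INFO level to the console so the executor
# can detect the job's result output
log4j.rootCategory=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```

Only the level in log4j.rootCategory matters for this issue; the appender settings are shown for completeness.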

Summary and Suggestions

You are advised to use the V2 API to submit jobs.
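As a sketch of a V2 submission, the request below shows the general shape of a V2 job-execution request body. The field names and values here are illustrative assumptions; confirm the exact endpoint, job_type values, and parameters against the MRS V2 API reference before use.

```json
{
  "job_name": "spark_example_job",
  "job_type": "SparkSubmit",
  "arguments": [
    "--class", "org.example.SparkApp",
    "/opt/client/app/spark-app.jar"
  ]
}
```

A request of this form is typically sent as POST to the cluster's job-executions endpoint; unlike the V1.1 flow, the V2 API reports the job result through the job-execution resource rather than by scanning log output, so it is not affected by client-side log-level changes.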