
Error "ERROR 500" Is Displayed When Job Logs Are Queried on the Yarn Web UI

Symptom

After a Spark Streaming job submitted on MRS has been running for a period of time, a user views the full logs on the Yarn web UI and the following error message is displayed: "HTTP ERROR 500 org.apache.http.ConnectionClosedException: Premature end of chunk coded message body: closing chunk expected".

Cause Analysis

Because the job has been running for a long time, the log that the Yarn web UI needs to display is too large. To resolve this, reduce the size of the aggregated log files generated by the job so that the logs can be displayed in segments.

Procedure

  1. Log in as user root to the node where the Spark2x/Spark client is installed.
  2. Run the following command to edit the file:

    vim $SPARK_HOME/conf/log4j-executor.properties

  3. Set log4j.appender.sparklog.MaxFileSize to a smaller value, for example, 20MB. This parameter specifies the maximum size of a single log file; its default value is 50MB.
  4. Set log4j.appender.sparklog.MaxBackupIndex to a smaller value, for example, 5. This parameter specifies the maximum number of log files kept in rolling mode; its default value is 10, which means the eleventh file overwrites the first one. (See the example settings after this procedure.)
  5. Save the file.
  6. Submit the job again. The job runs properly, and its logs can be queried on the Yarn web UI.
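
The following is a minimal sketch of the two modified lines in log4j-executor.properties, using the example values from steps 3 and 4. It assumes the appender is named sparklog, as in the steps above; the rest of the file (appender class, layout, and other properties) is left unchanged and varies with the Spark client version.

    # Example settings after the change (values from steps 3 and 4)
    log4j.appender.sparklog.MaxFileSize=20MB
    log4j.appender.sparklog.MaxBackupIndex=5

With these values, each executor log file rolls over at 20MB and at most five rolled files are kept, so the aggregated log shown on the Yarn web UI is split into smaller segments.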