Updated on 2022-09-14 GMT+08:00

Why Does DataArtsStudio Occasionally Fail to Schedule Spark Jobs, and Why Does Rescheduling Also Fail?

Symptom

DataArtsStudio occasionally fails to schedule Spark jobs, and rescheduling also fails. The following error message is displayed:

Caused by: org.apache.spark.SparkException: Application application_1619511926396_2586346 finished with failed status

Solution

This error indicates that the Spark application exited with a failed status on YARN, which in this scenario is typically caused by insufficient driver memory. Log in to the node where the Spark client is installed as user root and increase the value of the spark.driver.memory parameter in the spark-defaults.conf file of the client, then resubmit the job.
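The edit can be sketched as below. This is a minimal illustration, not the exact procedure from the product documentation: the config path and the memory values (2g, 4g) are assumptions, and the sample file is created here only so the snippet is self-contained; on a real client node, point CONF at the existing spark-defaults.conf and choose a value that fits your node.

```shell
# Assumed location of the client config; substitute your actual
# Spark client installation directory on the real node.
CONF="${SPARK_CONF_DIR:-/tmp/demo-spark-conf}/spark-defaults.conf"

# For illustration only: create a sample file with a low driver memory value.
mkdir -p "$(dirname "$CONF")"
echo "spark.driver.memory 2g" > "$CONF"

# Increase spark.driver.memory (here to 4g, an assumed example value);
# append the line if the parameter is not yet set in the file.
if grep -q '^spark.driver.memory' "$CONF"; then
  sed -i 's/^spark\.driver\.memory.*/spark.driver.memory 4g/' "$CONF"
else
  echo "spark.driver.memory 4g" >> "$CONF"
fi

# Verify the new setting.
grep '^spark.driver.memory' "$CONF"
```

Settings in spark-defaults.conf apply to jobs subsequently submitted from this client, so the change takes effect the next time DataArtsStudio schedules the job through it.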