Help Center/ MapReduce Service/ FAQs/ Job Management/ What Can I Do If DataArts Studio Occasionally Fails to Schedule Spark Jobs?
Updated on 2024-08-16 GMT+08:00

What Can I Do If DataArts Studio Occasionally Fails to Schedule Spark Jobs?

Symptom

DataArts Studio occasionally fails to schedule Spark jobs, and rescheduling the jobs also fails. The following error information is displayed:

Caused by: org.apache.spark.SparkException: Application application_1619511926396_2586346 finished with failed status

Solution

Log in as user root to the node where the Spark client is installed, and increase the value of the spark.driver.memory parameter in the spark-defaults.conf file. Then submit the job again.
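The change can be sketched as follows. The client installation path and the memory value below are assumptions; use the actual path of your Spark client and a value appropriate for your jobs.

```properties
# File: <Spark client installation directory>/Spark/spark/conf/spark-defaults.conf
# (the path varies by installation; /opt/client is a common default)
#
# Increase the driver memory. 4g here is only an example; raise it
# gradually until the job no longer fails, without exceeding the
# memory available on the node.
spark.driver.memory    4g
```

The new value takes effect for jobs submitted after the file is saved; jobs already running are not affected.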