Updated on 2024-08-16 GMT+08:00

How Do I Configure Spark Jobs to Automatically Obtain More Resources During Execution?

Question:

How Do I Configure Spark Jobs to Automatically Obtain More Resources During Execution?

Answer:

Resources are a key factor affecting Spark execution efficiency. When a long-running service (such as the JDBCServer) holds multiple executors but has no tasks to run, while other applications are short of resources, those idle executors are wasted and cannot be reassigned.

Dynamic resource scheduling adds or removes an application's executors in real time based on the task load, so that cluster resources are dynamically allocated to the applications that need them.

You can use the following method to enable dynamic resource scheduling. For other related configurations, see Configuring Dynamic Resource Scheduling in Yarn Mode.

Log in to FusionInsight Manager, choose Cluster > Services > Spark, and then choose Configurations > All Configurations. Enter spark.dynamicAllocation.enabled in the search box and set its value to true to enable dynamic resource scheduling.
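Beyond spark.dynamicAllocation.enabled, dynamic allocation is usually tuned with a few related properties. The fragment below is a sketch using standard open-source Spark property names and example values, not MRS-specific defaults; on YARN, dynamic allocation conventionally relies on the external shuffle service. Verify the exact properties and defaults for your cluster version in Configuring Dynamic Resource Scheduling in Yarn Mode.

```properties
# Enable dynamic allocation of executors.
spark.dynamicAllocation.enabled              true

# The external shuffle service lets shuffle data outlive removed
# executors, which dynamic allocation on YARN typically requires.
spark.shuffle.service.enabled                true

# Bounds for scaling the executor count (example values).
spark.dynamicAllocation.minExecutors         0
spark.dynamicAllocation.initialExecutors     2
spark.dynamicAllocation.maxExecutors         20

# Release an executor after it has been idle for this long.
spark.dynamicAllocation.executorIdleTimeout  60s

# Request more executors once tasks have been pending this long.
spark.dynamicAllocation.schedulerBacklogTimeout  1s
```

Tightening executorIdleTimeout frees resources faster for other applications, at the cost of re-requesting executors more often when load returns.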