Updated on 2025-08-15 GMT+08:00

Setting Queue Properties

Scenario

DLI allows you to set properties for queues.

You can currently set the following property parameters:

  • Spark driver parameters: tune them to improve the scheduling efficiency of a queue.
  • Set Result Saving Policy: determines whether job results of a queue are saved to a DLI job bucket.
  • Spark Native operator optimization: enables the Spark Native engine to improve Spark SQL job performance and reduce CPU and memory consumption.

This section describes how to set queue properties on the management console.

Notes and Constraints

  • Queue properties can only be set for SQL queues of the Spark engine in an elastic resource pool of the standard edition.
  • You cannot set queue properties in batches.
  • The constraints on different queue properties vary. For details, see Table 1.
    Table 1 Constraints on different queue properties

    • Spark driver parameters of a queue
      • Phase where you can set this property: only after a queue is created.
      • Constraint: For a queue in an elastic resource pool, if the minimum number of CUs of the queue is less than 16, neither Max. Spark Driver Instances nor Max. Prestart Spark Driver Instances set in the queue properties takes effect.
      • Helpful link: Procedure
    • Job result saving policy
      • Phase where you can set this property: only after a queue is created.
      • Constraint: After Set Result Saving Policy is enabled (that is, job results are saved to a DLI job bucket), you must configure the DLI job bucket before submitting SQL jobs. Otherwise, SQL jobs may fail to be submitted.
      • Helpful link: Procedure
    • Spark Native operator optimization
      • Phase where you can set this property:
        • When creating a queue in an elastic resource pool
        • After a queue is created, by setting queue properties
      • Constraints:
        • For an existing queue, if you enable or disable Spark Native through the DLI management console or API, you must restart the queue for the change to take effect.
        • To enable the Spark Native engine for a queue in an elastic resource pool, all of the following conditions must be met: the elastic resource pool is of the standard edition, the queue type is For SQL, and the Spark version is 3.3.1 or later.
        • For the default queue, Spark Native is disabled by default even when Spark 3.3.1 or later is used.
        • To disable Spark Native for a single job, set spark.gluten.enabled=false in the job parameters.
      • Helpful link: Enabling Spark Native Operator Optimization
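The job-level switch mentioned above, spark.gluten.enabled=false, is passed with the other job parameters when a job is submitted. A minimal sketch of assembling such job-level configuration — only the configuration key comes from this document; the helper function and surrounding payload shape are illustrative assumptions:

```python
# Only the key "spark.gluten.enabled" comes from the DLI documentation;
# build_job_conf and the dict shape are hypothetical, for illustration.
def build_job_conf(disable_spark_native: bool = False) -> dict:
    """Assemble job-level parameters for a DLI SQL job submission."""
    conf = {}
    if disable_spark_native:
        # Disables the Spark Native engine for this job only;
        # the queue-level setting is left unchanged.
        conf["spark.gluten.enabled"] = "false"
    return conf

print(build_job_conf(disable_spark_native=True))
```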

Procedure

  1. In the navigation pane on the left of the DLI management console, choose Resources > Queue Management.
  2. Locate the queue for which you want to set properties, click More in the Operation column, and select Set Property.
  3. Go to the queue property setting page and set property parameters. Table 2 describes the property parameters.
    Table 2 Queue properties

    • Property type: Spark driver parameters
      • Max. Spark Driver Instances
        • API parameter: computeEngine.maxInstance
        • Description: Maximum number of Spark drivers that can be started on this queue, including prestarted Spark drivers and Spark drivers that run jobs.
        • Value range:
          • For a 16-CU queue, the value is 2.
          • For a queue with more than 16 CUs, the value range is [2, queue CUs/16].
          • If the minimum number of CUs of the queue is less than 16, this configuration item does not take effect.
      • Max. Prestart Spark Driver Instances
        • API parameter: computeEngine.maxPrefetchInstance
        • Description: Maximum number of Spark drivers that can be prestarted on this queue. When the number of jobs running on a Spark driver exceeds Max. Concurrency per Instance, new jobs are allocated to prestarted Spark drivers.
        • Value range:
          • For a 16-CU queue, the value range is [0, 1].
          • For a queue with more than 16 CUs, the value range is [2, queue CUs/16].
          • If the minimum number of CUs of the queue is less than 16, this configuration item does not take effect.
      • Max. Concurrency per Instance
        • API parameter: job.maxConcurrent
        • Description: Maximum number of jobs that a single Spark driver can execute concurrently. When the number of jobs exceeds this value, the excess jobs are allocated to other Spark drivers.
        • Value range: 1–32

    • Property type: Job result saving policy
      • Set Result Saving Policy
        • API parameter: job.saveJobResultToJobBucket
        • Description: Whether to save job results to a DLI job bucket.
          • This parameter is available only for Spark SQL queues.
          • Once enabled, this feature cannot be disabled, and job results of the queue are consistently saved to the DLI job bucket you configure.
          • Before enabling this feature, make sure you have configured a DLI job bucket. For details, see Configuring a DLI Job Bucket.
          • To check whether saving SQL job results to a DLI job bucket is enabled for a queue, see How Do I Check if Job Result Saving to a DLI Job Bucket Is Enabled for a SQL Queue?
          • You are advised to enable this feature to manage and store SQL job query results more effectively.
        • Value range: N/A

    • Property type: Spark Native operator optimization
      • DLI Spark Native Acceleration
        • API parameter: computeEngine.spark.nativeEnabled
        • Description: Enabling Spark Native improves the performance of Spark SQL jobs and reduces CPU and memory consumption. For more information, see Enabling Spark Native Operator Optimization.
        • Value range: Enabled or Disabled

  4. Click OK.
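The properties in Table 2 can also be managed programmatically. The sketch below derives the allowed Max. Spark Driver Instances range from the queue's CUs and assembles a property-update request using the API parameter names from Table 2. The endpoint path, request body shape, and example URL are assumptions for illustration — verify them against the DLI API reference before use:

```python
# The property keys come from Table 2; the URL path and body shape are
# assumptions (check the DLI API reference before relying on them).
def driver_instance_cap(queue_cus: int):
    """Upper bound for computeEngine.maxInstance per Table 2, or None when
    the setting does not take effect (queues below 16 CUs)."""
    if queue_cus < 16:
        return None
    return max(2, queue_cus // 16)  # 16 CUs -> 2; more -> queue CUs/16

def build_properties_request(endpoint: str, project_id: str,
                             queue_name: str, properties: dict):
    """Return the (assumed) PUT URL and JSON body for a property update."""
    url = f"{endpoint}/v3/{project_id}/queues/{queue_name}/properties"
    body = {"properties": [properties]}
    return url, body

# Example: a 64-CU queue allows up to 64/16 = 4 driver instances.
props = {
    "computeEngine.maxInstance": driver_instance_cap(64),
    "computeEngine.maxPrefetchInstance": 2,
    "job.maxConcurrent": 8,  # must be within 1-32
}
url, body = build_properties_request("https://dli.example.com",
                                     "my-project-id", "my-queue", props)
```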

How Do I Check if Job Result Saving to a DLI Job Bucket Is Enabled for a SQL Queue?

  • Method 1: View the result path on the SQL job details page.
    1. Log in to the DLI console. In the navigation pane on the left, choose Job Management > SQL Jobs.
    2. Locate the desired SQL job and click the expand button next to the queue name to expand the job details.
    3. Check the result path in the job details.
      • If the result path displays a custom DLI job bucket, the feature of saving job results to a DLI job bucket is enabled for the queue running the job.
      • If the result path is not displayed in the job details, the feature is not enabled for the queue running the job.
  • Method 2: Check whether the feature of saving job results to a job bucket is enabled in the SQL queue properties.
    1. Log in to the DLI management console. In the navigation pane on the left, choose Resources > Queue Management.
    2. Locate the queue for which you want to set properties, click More in the Operation column, and select Set Property.
    3. In the Set Property dialog box that appears, check whether Set Result Saving Policy is enabled.
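If you query queue properties through the DLI API rather than the console, Method 2 amounts to inspecting the job.saveJobResultToJobBucket value in the response. A minimal sketch, assuming a response payload shaped like {"properties": [{...}]} — the payload shape is an assumption, not confirmed by this document:

```python
# Sketch: checking job.saveJobResultToJobBucket in a queue-properties
# response. The assumed payload shape ({"properties": [{...}]}) is
# illustrative -- confirm it against the DLI API reference.
def is_result_saving_enabled(properties_response: dict) -> bool:
    for prop in properties_response.get("properties", []):
        value = prop.get("job.saveJobResultToJobBucket")
        # The API may return the flag as a string or a boolean.
        if str(value).lower() == "true":
            return True
    return False
```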