
How Do I Allocate Queue Resources for Running Spark Jobs If I Have Purchased 64 CUs?

Updated on 2023-03-21 GMT+08:00

In DLI, 1 CU corresponds to 1 core and 4 GB of memory, so 64 CUs = 64 cores and 256 GB of memory.

In a Spark job, if the driver occupies 4 cores and 16 GB of memory, the executors can use the remaining 60 cores and 240 GB of memory.
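The arithmetic above can be sketched as a small helper. This is an illustrative example, not part of the DLI API; the function name and the 1 CU = 1 core + 4 GB conversion constants are written out here only to make the calculation explicit.

```python
# Conversion assumed from the text: in DLI, 1 CU = 1 core and 4 GB of memory.
CORES_PER_CU = 1
MEM_GB_PER_CU = 4

def remaining_executor_resources(total_cus, driver_cores, driver_mem_gb):
    """Return (cores, memory in GB) left for executors after the driver's share."""
    total_cores = total_cus * CORES_PER_CU
    total_mem_gb = total_cus * MEM_GB_PER_CU
    return total_cores - driver_cores, total_mem_gb - driver_mem_gb

# 64 CUs with a driver using 4 cores and 16 GB leaves 60 cores and 240 GB.
print(remaining_executor_resources(64, 4, 16))  # → (60, 240)
```

The same subtraction applies to any driver size: whatever the driver reserves comes out of the queue's total, and the rest is available for executors.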
