
Why Is a Job Running Timeout Reported When a Spark Job Runs a Large Amount of Data?

Updated on 2023-05-19 GMT+08:00

When a Spark job accesses a large amount of data, for example, data in a GaussDB(DWS) database, the job can exceed its running timeout. You are advised to set the number of concurrent tasks and enable multi-task processing so that the read is split across multiple parallel tasks instead of a single one.
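The advice above corresponds to Spark's standard partitioned JDBC read options (`partitionColumn`, `lowerBound`, `upperBound`, `numPartitions`), which control how many concurrent tasks read from the database. Below is a minimal sketch; the column name `id`, the bounds, the partition count, the table name, and the connection URL are all illustrative assumptions, not values from this page:

```python
# Sketch: build the Spark JDBC options that control read concurrency.
# Spark splits the [lowerBound, upperBound] range of partitionColumn into
# numPartitions ranges and reads each range in a separate task, so a large
# GaussDB(DWS) table is fetched in parallel rather than by one long task.

def jdbc_partition_options(partition_column, lower_bound, upper_bound, num_partitions):
    """Return JDBC read options as strings, the form Spark expects."""
    return {
        "partitionColumn": partition_column,
        "lowerBound": str(lower_bound),
        "upperBound": str(upper_bound),
        "numPartitions": str(num_partitions),
    }

opts = jdbc_partition_options("id", 1, 10_000_000, 32)

# These options would be passed to a DataFrame read, e.g. (assumed URL/table):
# df = (spark.read.format("jdbc")
#       .option("url", dws_jdbc_url)       # hypothetical DWS JDBC URL
#       .option("dbtable", "my_table")     # hypothetical table name
#       .options(**opts)
#       .load())
```

Choose `numPartitions` to match the parallelism the cluster and the database can sustain; too many concurrent connections can overload the GaussDB(DWS) instance.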
