
Spark Jobs

Does DLI Spark Support Scheduled Periodic Jobs?

DLI Spark does not provide built-in job scheduling. To run jobs periodically, use a scheduling service such as DataArts Studio, or call the DLI APIs or SDKs to implement your own scheduling.
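
For example, you can run a small JVM program on any scheduler host that triggers the submission on a fixed interval. In this sketch, submitSparkJob is a hypothetical placeholder that you would wire to a DLI SDK or REST call (see the request sketch under the public-network API question below):

  import java.util.concurrent.{Executors, TimeUnit}

  object PeriodicDliTrigger {
    // Hypothetical placeholder: replace the body with a DLI SDK or
    // REST call that submits your Spark job.
    def submitSparkJob(): Unit =
      println("submitting DLI Spark job...")

    def main(args: Array[String]): Unit = {
      val scheduler = Executors.newSingleThreadScheduledExecutor()
      // Trigger the submission every 24 hours, starting immediately.
      scheduler.scheduleAtFixedRate(() => submitSparkJob(), 0, 24, TimeUnit.HOURS)
    }
  }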

Can I Define the Primary Key When I Create a Table with a Spark SQL Statement?

No. The Spark SQL syntax does not support primary key definitions.
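
If you need key-like uniqueness, you can approximate it at write time instead, for example by deduplicating on the would-be key column. A minimal Scala sketch, assuming hypothetical table and column names (users, id):

  import org.apache.spark.sql.SparkSession

  object NoPrimaryKeyDemo {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder().appName("no-pk-demo").getOrCreate()

      // A PRIMARY KEY clause here would fail to parse; define plain columns.
      spark.sql("CREATE TABLE IF NOT EXISTS users (id BIGINT, name STRING) USING parquet")

      // Deduplicate on the would-be key column before writing.
      spark.table("users")
        .dropDuplicates("id")
        .write.mode("overwrite")
        .saveAsTable("users_deduped")

      spark.stop()
    }
  }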

Can DLI Spark Jar Jobs Access GaussDB(DWS) Datasource Tables?

Yes.

For details, see Connecting to GaussDB(DWS) and Accessing SQL Database Tables.
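
As a rough illustration, a Spark Jar job can read a DWS table over JDBC, since GaussDB(DWS) is PostgreSQL-compatible. The host, port, database, table, and credentials below are assumptions, and network connectivity between DLI and the DWS cluster (typically an enhanced datasource connection) must already be in place as described in that guide:

  import org.apache.spark.sql.SparkSession

  object DwsReadDemo {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder().appName("dws-read-demo").getOrCreate()

      // Hypothetical connection details; DWS speaks the PostgreSQL protocol.
      val df = spark.read.format("jdbc")
        .option("url", "jdbc:postgresql://dws-host:8000/gaussdb")
        .option("dbtable", "public.sales")
        .option("user", "dbadmin")
        .option("password", sys.env("DWS_PASSWORD"))
        .load()

      df.show(10)
      spark.stop()
    }
  }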

How Do I Check the Version of the Spark Built-in Dependency Package?

DLI's built-in dependencies are provided by the platform by default. To avoid conflicts, do not bundle them when packaging the JAR files for Spark or Flink Jar jobs.

For details about how to check the version, see Built-in Dependency Package.
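
For example, in an sbt build you can give the artifacts DLI already ships a provided scope so they are excluded from the assembled job JAR. The Spark and Scala versions below are assumptions; match them to the versions listed in Built-in Dependency Package:

  // Minimal build.sbt sketch: built-in artifacts are compile-time only.
  name := "dli-spark-job"
  scalaVersion := "2.12.15"

  libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % "3.1.1" % "provided",
    "org.apache.spark" %% "spark-sql"  % "3.1.1" % "provided"
  )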

Can I Download Packages on the Package Management Page?

No, the packages cannot be downloaded.

How Do I Use an API to Access DLI Through a Public Network?

To access DLI from the public network, use the domain name dli.{regionid}.myhuaweicloud.com, where {regionid} is the ID of the region you are accessing.
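
A minimal request sketch using the Java 11+ HTTP client from Scala. The region (cn-north-4), project ID, and queue-listing path are assumptions; Huawei Cloud APIs authenticate with an IAM token passed in the X-Auth-Token header:

  import java.net.URI
  import java.net.http.{HttpClient, HttpRequest, HttpResponse}

  object DliPublicApiDemo {
    def main(args: Array[String]): Unit = {
      // Hypothetical region and project ID; the token comes from IAM.
      val url = "https://dli.cn-north-4.myhuaweicloud.com/v1.0/your-project-id/queues"
      val request = HttpRequest.newBuilder()
        .uri(URI.create(url))
        .header("X-Auth-Token", sys.env("HUAWEICLOUD_TOKEN"))
        .GET()
        .build()
      val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())
      println(s"HTTP ${response.statusCode()}: ${response.body()}")
    }
  }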

In Which Path Should Third-Party Dependency JAR Files Be Stored for a Custom Spark 3.1.1 Image?

Store third-party dependency JAR files in the /opt/spark/jars directory of the image.
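
A minimal Dockerfile sketch; the base image and JAR file names below are placeholders, so substitute the DLI-provided Spark 3.1.1 base image for your region and your own dependency:

  # Placeholder base image: use the DLI Spark 3.1.1 base image for your region.
  FROM swr.example-region.myhuaweicloud.com/dli-public/spark:3.1.1

  # Copy third-party dependency JAR files into the directory Spark scans.
  COPY my-dependency.jar /opt/spark/jars/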