Spark Jobs
Does DLI Spark Support Scheduled Periodic Jobs?
DLI Spark does not support scheduled periodic jobs natively. To run jobs on a schedule, use another service such as DataArts Studio, or implement your own scheduling with the DLI APIs or SDKs.
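As a minimal sketch of the API approach, the loop below submits a Spark batch job at a fixed interval. The endpoint, project ID, token, OBS paths, and queue name are placeholders, and the /v2.0/{project_id}/batches request path and body fields are assumptions on my part; verify them against the Data Lake Insight API Reference before use.

```python
import time

import requests

# All values below are placeholders; see the Data Lake Insight
# API Reference for the exact request path and body fields.
ENDPOINT = "https://dli.your-region-id.myhuaweicloud.com"  # public-network domain
PROJECT_ID = "your-project-id"
IAM_TOKEN = "your-iam-token"  # obtained from IAM in advance


def submit_spark_batch_job():
    """Submit one Spark batch job via the DLI REST API (assumed path)."""
    resp = requests.post(
        f"{ENDPOINT}/v2.0/{PROJECT_ID}/batches",
        headers={"X-Auth-Token": IAM_TOKEN},
        json={
            "file": "obs://your-bucket/jobs/your-job.jar",  # job package in OBS
            "className": "com.example.YourMainClass",
            "queue": "your-queue",
        },
        timeout=30,
    )
    resp.raise_for_status()
    print("Submitted:", resp.json())


# Naive periodic trigger: submit the job once every 24 hours.
while True:
    submit_spark_batch_job()
    time.sleep(24 * 60 * 60)
```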
Can I Define a Primary Key When Creating a Table with a Spark SQL Statement?
No. Spark SQL syntax does not support primary key definitions.
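As a quick illustration (a sketch with made-up table names; the storage format clause may differ in your environment), a plain CREATE TABLE succeeds, while a PRIMARY KEY clause is rejected by the Spark SQL parser:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("pk-demo").getOrCreate()

# Supported: a table definition without constraints.
spark.sql("CREATE TABLE IF NOT EXISTS demo_tbl (id INT, name STRING) USING parquet")

# Not supported: Spark SQL has no primary key syntax, so the
# statement below fails with a parse error.
try:
    spark.sql("CREATE TABLE demo_pk (id INT PRIMARY KEY, name STRING) USING parquet")
except Exception as err:
    print("Primary key definition rejected:", err)
```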
Can DLI Spark Jar Jobs Access GaussDB(DWS) Datasource Tables?
Yes.
For details, see Connecting to GaussDB(DWS) and Accessing SQL Database Tables.
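As a rough sketch of what such access looks like in job code: GaussDB(DWS) is PostgreSQL-compatible, so a Spark job can read a table over JDBC. The host, port, database, table, and credentials below are placeholders, and on DLI you also need a datasource connection to the DWS cluster as described in the linked guide.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dws-read").getOrCreate()

# Placeholder connection details for a GaussDB(DWS) cluster.
df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:postgresql://dws-host:8000/your_db")
    .option("dbtable", "public.your_table")
    .option("user", "your_user")
    .option("password", "your_password")
    .option("driver", "org.postgresql.Driver")
    .load()
)
df.show(5)
```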
How Do I Check the Version of the Spark Built-in Dependency Package?
DLI built-in dependencies are provided by the platform by default. To avoid conflicts, do not bundle them when packaging the JAR files for Spark or Flink Jar jobs.
For details about how to check the version, see Built-in Dependency Package.
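If you want to confirm the versions at runtime rather than from the documentation, a small job can print the Spark version and list the JARs shipped in the runtime. This sketch assumes the usual SPARK_HOME layout; the /opt/spark fallback is a guess.

```python
import glob
import os

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dep-versions").getOrCreate()

# Spark version bundled in the runtime.
print("Spark version:", spark.version)

# Built-in dependency JARs live under $SPARK_HOME/jars; their file
# names carry the version numbers (e.g. fastjson-x.y.z.jar).
spark_home = os.environ.get("SPARK_HOME", "/opt/spark")
for jar in sorted(glob.glob(os.path.join(spark_home, "jars", "*.jar"))):
    print(os.path.basename(jar))
```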
Can I Download Packages on the Package Management Page?
No. Packages uploaded to DLI cannot be downloaded from the Package Management page.
How Do I Use an API to Access DLI Through a Public Network?
To access DLI from the public network, use the domain name dli.{regionid}.myhuaweicloud.com. A minimal request example follows the references below.
- For details about DLI endpoints, see Endpoints.
- For details about DLI APIs, see Data Lake Insight API Reference.
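The sketch below lists DLI queues over the public network using a pre-issued IAM token. The region ID, project ID, and token are placeholders, and the /v1.0/{project_id}/queues path is an assumption; confirm it against the Data Lake Insight API Reference.

```python
import requests

REGION_ID = "your-region-id"
PROJECT_ID = "your-project-id"
IAM_TOKEN = "your-iam-token"  # obtained from IAM in advance

# Public-network domain name from the answer above.
url = f"https://dli.{REGION_ID}.myhuaweicloud.com/v1.0/{PROJECT_ID}/queues"

resp = requests.get(url, headers={"X-Auth-Token": IAM_TOKEN}, timeout=30)
resp.raise_for_status()
print(resp.json())
```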
In Which Path Should Third-Party Dependency JAR Files Be Stored for a Custom Spark 3.1.1 Image?
Store third-party dependency JAR files in the /opt/spark/jars directory of the image.
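For instance, the Dockerfile of the custom image can copy the dependency into that directory; the base image tag and file name below are placeholders.

```dockerfile
# Base image tag is a placeholder; build from your Spark 3.1.1 image.
FROM your-registry/spark:3.1.1

# Place third-party dependencies where the Spark runtime picks them up.
COPY your-dependency.jar /opt/spark/jars/
```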