Updated on 2024-09-29 GMT+08:00
Flink OpenSource SQL 1.15 Usage
When switching from Flink 1.12 to Flink 1.15 for job execution, note the following when using Flink OpenSource SQL 1.15:
- Flink SQL jobs are submitted through the SQL client. In Flink 1.15, you configure the SQL client with the SET 'key'='value' statement in the SQL script, instead of the optimization parameters used in Flink 1.12. For details about the syntax, see SQL Client Configuration; a brief example follows this list.
- The following Flink connectors are added to Flink 1.15: Doris Connector and Hive Connector. For details, see Overview.
- In Flink 1.15, you need to configure a custom agency on the tenant plane and specify the agency information in the job. The permissions included in the agency should be configured based on the specific service scenario of the job. For details, see DLI Custom Agency.
- Methods to manage credentials for Flink 1.15 jobs:
  - You are advised to use DEW to manage access credentials, such as passwords and keys, in Flink OpenSource SQL. For details, see Flink OpenSource SQL Jobs Using DEW to Manage Access Credentials.
  - Manage the fixed AKs/SKs used by Flink Jar jobs to access OBS, the temporary AKs/SKs used by Flink Jar jobs to obtain agencies, and the temporary AKs/SKs used by Flink SQL UDFs to obtain agencies. For details, see Flink Job Agencies.
- Flink 1.15 Jar jobs read custom configuration files differently from Flink 1.12. For details, see Writing Data to OBS Using Flink Jar.
- Flink 1.15 Jar programs use the child-first (reverse) class loading mechanism. To have specific dependency packages loaded by the parent class loader instead, set the parent.first.classloader.jars parameter to the names of those JAR files, for example, test1.jar,test2.jar.
- To check the list of JAR files built into Flink 1.15, obtain the Flink 1.15 dependency information from the Flink job logs:
  - Check the logs of the Flink job.
    - Log in to the DLI management console. In the navigation pane on the left, choose Job Management > Flink Jobs.
    - Click the name of the desired job. On the displayed page, click the Run Log tab.
    - Check the latest run logs. For more logs, check the OBS bucket where the job logs are stored.
  - Search for dependency information in the logs: search for Classpath: in the logs to view the dependencies.
- Flink 1.15 no longer supports DLI package management. To use dependency packages and files, select their OBS paths directly when editing the job.
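The following is a minimal sketch of the SET 'key'='value' configuration style mentioned in the first item above. The option keys (pipeline.name, parallelism.default, table.exec.state.ttl) and the datagen table are standard Flink examples chosen for illustration only and are not prescribed by DLI; replace them with the options and tables your job actually needs.

```sql
-- Minimal sketch: in Flink 1.15, configuration goes into the SQL script itself
-- via SET 'key'='value' instead of the Flink 1.12 optimization parameters.
-- The keys below are ordinary Flink options, used here for illustration only.
SET 'pipeline.name' = 'my-flink-115-job';   -- job name
SET 'parallelism.default' = '2';            -- default parallelism
SET 'table.exec.state.ttl' = '1 h';         -- idle state retention

-- Regular SQL statements follow the SET statements, for example:
CREATE TABLE source_table (
  id   INT,
  name STRING
) WITH (
  'connector' = 'datagen'                   -- illustrative built-in connector
);
```

Unlike Flink 1.12, where such settings were supplied as job optimization parameters, these SET statements travel with the SQL script itself.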
Parent topic: Flink OpenSource SQL 1.15 Syntax Reference