What Data Formats and Data Sources Are Supported by DLI Flink Jobs?
DLI Flink jobs support the following data formats:
Avro, Avro_merge, BLOB, CSV, EMAIL, JSON, ORC, Parquet, and XML.
DLI Flink jobs support data from the following data sources:
CloudTable HBase, CloudTable OpenTSDB, CSS Elasticsearch, DCS, DDS, DIS, DMS, GaussDB(DWS), EdgeHub, MRS HBase, MRS Kafka, open-source Kafka, file systems, OBS, RDS, and SMN.
Data Format | Supported Source Stream | Supported Sink Stream
---|---|---
Avro | - | √
Avro_merge | - | √
BLOB | √ | -
CSV | √ | √
EMAIL | √ | -
JSON | √ | √
ORC | - | √
Parquet | - | √
XML | √ | -
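As a concrete illustration of how a source-stream format and a sink-stream format are selected, the sketch below shows a minimal DLI Flink SQL job that reads CSV records from a DIS channel and writes JSON records to another DIS channel. The stream names, field names, region, and channel names are hypothetical placeholders, not values from this document:

```sql
/* Hypothetical sketch: CSV decode on the source stream, JSON encode
   on the sink stream. Replace region and channel values with your own. */
CREATE SOURCE STREAM car_infos (
  car_id STRING,
  car_speed INT
) WITH (
  type = "dis",
  region = "xx-xxxx-1",     -- placeholder region
  channel = "dis-input",    -- hypothetical DIS channel
  encode = "csv",           -- CSV is supported for source streams
  field_delimiter = ","
);

CREATE SINK STREAM speeding_cars (
  car_id STRING,
  car_speed INT
) WITH (
  type = "dis",
  region = "xx-xxxx-1",     -- placeholder region
  channel = "dis-output",   -- hypothetical DIS channel
  encode = "json"           -- JSON is supported for sink streams
);

INSERT INTO speeding_cars
SELECT car_id, car_speed
FROM car_infos
WHERE car_speed > 120;
```

Note that the `encode` value must match the table above for the stream's direction: for example, `encode = "avro"` is valid only on a sink stream, while `encode = "email"` is valid only on a source stream.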