Creating a FlinkServer Stream Table Source
Data tables define the basic attributes and parameters of source tables, dimension tables, and output tables.
Procedure
- Access the Flink web UI. For details, see Accessing the FlinkServer Web UI.
- Click Table Management. The table management page is displayed.
- Click Create Stream Table. On the stream table creation page, set the parameters by referring to Table 1 and click OK. After the stream table is created, you can edit or delete it in the Operation column.
Figure 1 Creating a stream table
Table 1 Parameters for creating a stream table

| Parameter | Description | Remarks |
| --- | --- | --- |
| Stream/Table Name | Name of the stream or table. | Example: flink_sink |
| Description | Description of the stream or table. | - |
| Mapping Table Type | Flink SQL does not provide the data storage function; creating a table actually creates a mapping to an external data table or storage (see the DDL sketch after this table). The value can be Kafka or HDFS. | - |
| Type | Whether the table is a data source table (Source) or a data result table (Sink). The types available for each mapping table type are as follows: Kafka: Source and Sink; HDFS: Source and Sink. | - |
| Data Connection | Name of the data connection. | - |
| Topic | Kafka topic to be read. Multiple Kafka topics can be read; separate them with the specified separator. This parameter is available only when Mapping Table Type is set to Kafka. | - |
| File Path | HDFS directory or single file path to be read or written. This parameter is available only when Mapping Table Type is set to HDFS. | Example: /user/sqoop/ or /user/sqoop/example.csv |
| Code | Encoding formats supported by each mapping table type are as follows: Kafka: CSV and JSON; HDFS: CSV. | - |
| Prefix | When Mapping Table Type is set to Kafka, Type is set to Source, and Code is set to JSON, this parameter specifies the hierarchical prefixes of multi-layer nested JSON, separated by commas (,). | For example, data,info indicates that the content under data and info in the nested JSON file is used as the data input in JSON format. |
| Separator | Separator of the specified CSV fields. This parameter is available only when Code is set to CSV. | Example: comma (,) |
| Row Separator | Line break in the file, such as \r, \n, or \r\n. This parameter is available only when Mapping Table Type is set to HDFS. | - |
| Column Separator | Field separator in the file. This parameter is available only when Mapping Table Type is set to HDFS. | Example: comma (,) |
| Stream Table Structure | Structure of the stream or table, including the field Name and Type. | - |
| Proctime | System time, which is irrelevant to the data timestamp; that is, the time when the calculation is completed in the Flink operators. This parameter is available only when Type is set to Source. | - |
| Event Time | Time when an event is generated, that is, the timestamp generated when the data itself was produced. This parameter is available only when Type is set to Source. | - |
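A stream table configured on this page corresponds to a Flink SQL CREATE TABLE statement. The following is a minimal sketch of the DDL that a Kafka source table with JSON encoding, a Proctime column, and an Event Time column might map to. The table name, field names, topic, and broker address are placeholders, and the connector options assume the standard open-source Flink Kafka connector; the DDL that FlinkServer actually generates may differ.

```sql
-- Hypothetical sketch: Kafka source table with JSON encoding.
-- Table/field names, topic, and broker address are placeholders.
CREATE TABLE flink_source (
  `user_id`   STRING,
  `amount`    DOUBLE,
  `event_ts`  TIMESTAMP(3),
  `proc_time` AS PROCTIME(),                                    -- Proctime: system time in the Flink operator
  WATERMARK FOR `event_ts` AS `event_ts` - INTERVAL '5' SECOND  -- Event Time: timestamp carried in the data
) WITH (
  'connector' = 'kafka',
  'topic' = 'input_topic',
  'properties.bootstrap.servers' = 'broker1:9092',
  'properties.group.id' = 'flink_group',
  'format' = 'json',
  'scan.startup.mode' = 'latest-offset'
);
```

Similarly, an HDFS mapping table with CSV encoding roughly corresponds to a filesystem table whose path and field delimiter match the File Path and Column Separator parameters. Again, this is only a sketch using the open-source filesystem connector, with placeholder names.

```sql
-- Hypothetical sketch: HDFS sink table with CSV encoding.
-- 'path' corresponds to File Path, 'csv.field-delimiter' to Column Separator.
CREATE TABLE flink_sink (
  `user_id` STRING,
  `amount`  DOUBLE
) WITH (
  'connector' = 'filesystem',
  'path' = 'hdfs:///user/sqoop/',
  'format' = 'csv',
  'csv.field-delimiter' = ','
);
```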