Development Plan of Flink Stream SQL Join
Assume that a Flink service (service 1) receives a message every second, and each message records a user's basic information, including the name, gender, and age. Another Flink service (service 2) receives messages irregularly, and each message records a user's name and career information.
To meet service requirements, a Flink application is developed to implement the following function: use the username recorded in each message received by service 2 as the keyword to join and query the data of service 1.
Data Planning
- The data of service 1 is stored in the Kafka component. Data is sent to and received from Kafka by a user with the required Kafka permissions. For details about the Kafka configuration, see Data Planning.
- Service 2 receives messages over a socket. You can run the netcat command to provide a simulated data source (a minimal connectivity sketch follows this list).
- Run the Linux command netcat -l -p <port> to start a simple text server.
- After starting the application that connects to the port listened on by netcat, enter the data on the netcat terminal.
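The following is a minimal sketch of the service 2 side that only verifies the simulated data source works. The host localhost, the port 9000, and the comma-separated name,career input format are assumptions; they must match the port passed to netcat and the data you enter.

```java
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class SocketSourceCheck {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Connect to the text server started with "netcat -l -p 9000".
        // The host and port are assumptions; use the port passed to netcat.
        env.socketTextStream("localhost", 9000)
                .print();

        env.execute("Socket data source check");
    }
}
```

After the job starts, every line entered on the netcat terminal (for example, Alice,engineer) is printed by the job, confirming that the socket data source is reachable.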
Development Guidelines
- Start the Flink Kafka Producer application to send data to Kafka.
- Start the Flink Kafka Consumer application to receive data from Kafka and create Table1. Ensure that the topic of the Kafka Consumer is the same as that of the Kafka Producer.
- Read data from the socket and create Table2.
- Use Flink SQL to join Table1 and Table2 and print the result (see the sketch after this list).
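The following is a condensed sketch of steps 2 to 4, assuming the Flink Kafka connector (KafkaSource) and the Table API bridge are on the classpath. The broker address kafka-broker:9092, the topic user_info, the socket host localhost and port 9000, the comma-separated record formats, and the column names are illustrative assumptions, not the shipped sample code.

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.api.java.tuple.Tuple3;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

public class StreamSqlJoinSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);

        // Step 2: receive the user info sent by the Kafka Producer application.
        // The broker address, topic, and group ID are assumptions.
        KafkaSource<String> kafkaSource = KafkaSource.<String>builder()
                .setBootstrapServers("kafka-broker:9092")
                .setTopics("user_info")
                .setGroupId("stream-sql-join")
                .setStartingOffsets(OffsetsInitializer.latest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        // Assume each Kafka record is a comma-separated "name,gender,age" line.
        DataStream<Tuple3<String, String, Integer>> userInfo = env
                .fromSource(kafkaSource, WatermarkStrategy.noWatermarks(), "user-info-source")
                .map(line -> {
                    String[] f = line.split(",");
                    return Tuple3.of(f[0].trim(), f[1].trim(), Integer.parseInt(f[2].trim()));
                })
                .returns(Types.TUPLE(Types.STRING, Types.STRING, Types.INT));

        // Step 3: read "name,career" lines from the socket served by netcat.
        // The host and port are assumptions.
        DataStream<Tuple2<String, String>> userCareer = env
                .socketTextStream("localhost", 9000)
                .map(line -> {
                    String[] f = line.split(",");
                    return Tuple2.of(f[0].trim(), f[1].trim());
                })
                .returns(Types.TUPLE(Types.STRING, Types.STRING));

        // Register the two streams as Table1 and Table2 with readable column names.
        Table table1 = tEnv.fromDataStream(userInfo).as("name", "gender", "age");
        Table table2 = tEnv.fromDataStream(userCareer).as("name", "career");
        tEnv.createTemporaryView("Table1", table1);
        tEnv.createTemporaryView("Table2", table2);

        // Step 4: join the two tables on the username and print the result.
        Table result = tEnv.sqlQuery(
                "SELECT t1.name, t1.gender, t1.age, t2.career "
              + "FROM Table1 t1 JOIN Table2 t2 ON t1.name = t2.name");
        tEnv.toChangelogStream(result).print();

        env.execute("Flink Stream SQL Join sketch");
    }
}
```

Because a regular join keeps both inputs in Flink state, a Kafka record that arrives later still matches socket records entered earlier, and vice versa. The result is converted with toChangelogStream so it can be printed regardless of how the planner classifies the join output.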