Real-Time Alarm Platform Construction
This practice shows you how to set up a simple real-time alarm platform by using the job editing and scheduling functions of DataArts Factory together with other cloud services.
Consider a sample scenario: a customer has deployed many applications in a data center and requires a unified O&M system that receives alarm information in real time and meets the following requirements:
- When the alarm severity is Major or higher, a message is sent to the user.
- An O&M report is generated every day to collect statistics on the alarm severity of each application.
The following solution is developed to meet the preceding requirements:
The general procedure is as follows:
- Real-time data import: Data Ingestion Service (DIS) is used to import alarm data from the data center to Data Lake Insight (DLI) in real time.
- Data cleansing and preprocessing: DLI cleanses and preprocesses alarm data.
- Alarm information sending: When the alarm severity exceeds the specified level, an SMS message is sent to the user.
- Data export and storage: The cleansed data is exported from DLI to DIS, and DIS then dumps the alarm data to buckets in Object Storage Service (OBS). The dump directories are created based on the dump time.
- Alarm statistic table output: The DLI SQL script is used to create an alarm statistic table.
- Data migration: After the alarm statistic table is created, the data is exported to RDS for MySQL using Cloud Data Migration (CDM).
Environment preparations
- OBS has been enabled and buckets have been created, for example, obs://dlfexample/alarm_info and obs://dlfexample/alarm_count_info, which are used to store the raw alarm table and alarm statistic table, respectively.
- DataArts Studio has been enabled, and the cdm-alarm cluster is available for use in Creating a CDM Job.
- DLI has been enabled.
- Simple Message Notification (SMN) has been enabled.
Data Preparation
The raw alarm table records the real-time data of the data center, including alarm IDs and alarm severity. Table 1 shows the sample data.
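Each record in the raw alarm table is a CSV line carrying an alarm ID (string) and an alarm severity (integer), matching the source stream definition used later in this practice. Two hypothetical records might look as follows:

alarm0001,3
alarm0002,9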
Creating a DIS Stream
Create two DIS streams on the DIS console, one for inputting real-time data to DLI and one for outputting real-time data to OBS.
- Create a stream for inputting real-time data to DLI. The stream is named dis-alarm-input.
Figure 2 Creating an input stream
- Create a stream for outputting real-time data to OBS. The stream is named dis-alarm-output.
Figure 3 Creating an output stream
Configure a dump task for dis-alarm-output to dump the data in the stream to the obs://dlfexample/alarm_info directory of OBS, with subdirectories created based on the dump time.
Figure 4 Configuring a dump task for an output stream
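With this configuration, the dump files land under time-based subdirectories of the bucket. For example, data dumped on 2018-10-09 would be written under a path such as obs://dlfexample/alarm_info/2018/10/09/; the exact directory layout depends on the dump task settings.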
Creating an SMN Topic
This section describes how to create an SMN topic, add a subscription to the topic, and add the users who need to receive alarm notifications as subscription endpoints.
- Create an SMN topic named alarm_over.
Figure 5 Creating an SMN topic
- Add a subscription to the topic, specifying alarm types and users who need to receive alarm notifications.
Figure 6 Adding a subscription
The key parameters are as follows:
- Protocol: Select SMS. An SMS notification is then sent when the alarm severity reaches the specified value.
- Endpoint: Enter the mobile number of the user who needs to receive alarm notifications.
Using DLI to Construct an Alarm Notification Project
After creating DIS streams and SMN topics, you can set up an alarm notification project in DLI. For details about how to create a DIS stream and SMN topic, see Creating a DIS Stream and Creating an SMN Topic, respectively.
- Create a Flink job in DLI. The job is named test.
Figure 7 Creating a Flink SQL Job
- Edit the Flink SQL job and enter statements in the SQL editor.
Figure 8 Editing a Flink SQL job
The SQL statements perform the following functions:
- Ingest real-time data from the dis-alarm-input stream created in Creating a DIS Stream into DLI.
- Send an SMS notification when the alarm severity reaches the specified value (alarm_type > 8 in the statements below).
- Output the processed real-time data from DLI to OBS through dis-alarm-output.
CREATE SOURCE STREAM alarm_info (
  alarm_id STRING,
  alarm_type INT
)
WITH (
  type = "dis",
  region = "cn-south-1",
  channel = "dis-alarm-input",
  partition_count = "1",
  encode = "csv",
  field_delimiter = ","
);

CREATE SINK STREAM over_alarm (
  alarm_over STRING /* alarm message */
)
WITH (
  type = "smn",
  region = "cn-south-1",
  topic_urn = "urn:smn:cn-south-1:6f2bf33af5104f45ab85de31d7841f5a:alarm_over",
  message_subject = "alarm",
  message_column = "alarm_over"
);

INSERT INTO over_alarm
SELECT "your alarm over (" || CAST(alarm_type as CHAR(20)) || ") ."
FROM alarm_info
WHERE alarm_type > 8;

CREATE SINK STREAM alarm_info_output (
  alarm_id STRING,
  alarm_type INT
)
WITH (
  type = "dis",
  region = "cn-south-1",
  channel = "dis-alarm-output",
  PARTITION_KEY = "alarm_type",
  encode = "csv",
  field_delimiter = ","
);

INSERT INTO alarm_info_output
SELECT *
FROM alarm_info
WHERE alarm_type > 0;
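As a worked example of these statements: if a hypothetical record alarm0001,9 arrives on dis-alarm-input, the condition alarm_type > 8 holds, so SMN sends an SMS with subject alarm and body "your alarm over (9) ."; because alarm_type > 0 also holds, the record is additionally written to dis-alarm-output and from there dumped to OBS.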
- After the Flink SQL job is created, save and start the job.
Using a DLI SQL Script to Develop and Construct an Alarm Table
This section describes how to create two DLI SQL scripts: one that creates OBS tables on DLI for storing the data tables, and one that collects alarm statistics.
- On the DataArts Studio Management Center page, create a data connection named dli to DLI.
- On the DataArts Factory page, create a database named dlitest in DLI to store data tables.
- Create a DLI SQL script that creates the alarm_info and alarm_count_info tables by entering SQL statements in the editor.
alarm_info and alarm_count_info are OBS tables that store the raw alarm table and the alarm statistic table, respectively.
Description of key operations:
- The script development area in Figure 9 is a temporary debugging area. After you close the script tab, the development area is cleared. To retain the SQL script, click the save button to save the script to a specific directory.
The key parameters are as follows:
- Data Connection: DLI data connection created in Step 1.
- Database: database created in Step 2.
- Resource Queue: the default resource queue default provided by DLI.
- SQL statements:
create table alarm_info (
  alarm_time string,
  alarm_id string,
  alarm_type int
)
using csv
options(path 'obs://dlfexample/alarm_info')
partitioned by (alarm_time);

create table alarm_count_info (
  alarm_time string,
  alarm_type int,
  alarm_count int
)
using csv
options(path 'obs://dlfexample/alarm_count_info');
- Click the run button to execute the script, which creates the alarm_info and alarm_count_info data tables.
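To check the result, you can run a quick verification in the same editor. This is a minimal sketch assuming the dlitest database created in Step 2; note that alarm_info returns no rows until data has been dumped to OBS and a partition has been added:

SHOW TABLES;
SELECT * FROM alarm_count_info LIMIT 10;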
- Clear the editor and enter SQL statements again.
ALTER TABLE alarm_info ADD PARTITION (alarm_time = ${dayParam})
LOCATION 'obs://dlfexample/alarm_info/${obsPathYear}';

insert into alarm_count_info
select alarm_time, alarm_type, count(alarm_type)
from alarm_info
where alarm_time = ${dayParam}
group by alarm_time, alarm_type;
The SQL statements perform the following functions:
- Add a partition to the alarm_info table for the previous day's directory in obs://dlfexample/alarm_info. For example, if the current date is 2018-10-10, a partition with the value 20181009 is added, pointing to the obs://dlfexample/alarm_info/2018/10/09 directory that stores the data generated on 2018-10-09.
- Collect statistics grouped by alarm partition time and alarm type, and insert the results into the alarm_count_info table.
The key parameters are as follows (a substituted example follows the list):
- ${dayParam}: dayParam indicates the partition value of the alarm_info table. Enter $getCurrentTime(@@yyyyMMdd@@,-24*60*60) in the lower part of the script editor; it resolves to the previous day in yyyyMMdd format.
- ${obsPathYear}: obsPathYear indicates the directory of the OBS partition. Enter $getCurrentTime(@@yyyy/MM/dd@@,-24*60*60) in the lower part of the script editor; it resolves to the previous day in yyyy/MM/dd format.
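As a concrete illustration, if the script runs on 2018-10-10, ${dayParam} resolves to 20181009 and ${obsPathYear} resolves to 2018/10/09, so the statements are executed as:

ALTER TABLE alarm_info ADD PARTITION (alarm_time = 20181009)
LOCATION 'obs://dlfexample/alarm_info/2018/10/09';

insert into alarm_count_info
select alarm_time, alarm_type, count(alarm_type)
from alarm_info
where alarm_time = 20181009
group by alarm_time, alarm_type;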
- After the script is debugged, save it as dli_partition_count. In subsequent operations, the script is scheduled to run periodically so that the alarm statistic table is exported every day. For details, see Exporting Alarm Statistics Tables Periodically.
Creating a CDM Job
This section describes how to use CDM to migrate the alarm statistic table from OBS to RDS for MySQL.
The key parameters are as follows:
- Job Name: obs_rds. In subsequent operations, the job is scheduled to run periodically so that data is migrated every day. For details, see Exporting Alarm Statistics Tables Periodically.
- Source Job Configuration: OBS directory for storing alarm statistic tables. The source link obs_link must be created in CDM in advance.
- Destination Job Configuration: RDS MySQL space for storing alarm statistic tables. The destination link mysql_link must be created in CDM in advance.
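If the destination table is created in advance in RDS for MySQL, a minimal sketch of a matching definition might look as follows; the table name and MySQL column types here are assumptions derived from the DLI alarm_count_info schema, not values mandated by this practice:

-- Hypothetical destination table in RDS for MySQL.
-- Name and column types are assumptions matching the DLI alarm_count_info schema.
CREATE TABLE alarm_count_info (
  alarm_time  VARCHAR(16),  -- partition date, for example 20181009
  alarm_type  INT,          -- alarm severity
  alarm_count INT           -- number of alarms of this type on that day
);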
Exporting Alarm Statistics Tables Periodically
After the alarm statistic table script and data migration CDM job are created, you can create a job in DataArts Factory and execute the job periodically so that you can export alarm statistic tables and migrate data periodically.
- Create a batch job named job_alarm.
Figure 10 Creating a DLF Job
- On the job development page, drag the DLI SQL and CDM Job nodes to the canvas, connect the nodes, and then click them to configure node properties.
Figure 11 Connecting nodes and configuring node properties
Key notes:
- dli_partition_count (DLI SQL node): In Node Properties, associate the node with the DLI SQL script dli_partition_count developed in Using a DLI SQL Script to Develop and Construct an Alarm Table.
- obs_rds (CDM Job node): In Node Properties, associate the node with the CDM job obs_rds created in Creating a CDM Job.
- After configuring the job, click the test button to test it.
- If the log shows that the job runs properly, click Scheduling Setup in the right pane and configure the scheduling policy for the job.
Figure 12 Configuring scheduling type
Parameter descriptions:
- 2018/10/10 to 2018/11/09: During this period, the job is executed at 02:00 a.m. every day.
- Save and submit the job, and then click the execution button so that the job runs automatically every day.