METRIC Dashboard Template
You can create metric generation rules on the LTS console to produce statistical reports as required. A rule can use a single log filter, or combine multiple filters through associations and groups, to retain only the logs that match. Statistics on structured logs within a specified time range are collected dynamically and displayed on AOM Prometheus instances, making the feature easy to operate and powerful.
Prerequisites
- A metric generation rule has been created. For details, see Generating Metrics from Logs (Beta).
- Logs have been structured. For details, see Setting Cloud Structuring Parsing.
Monitoring Center for Metric Generation Tasks
- Log in to the LTS console. In the navigation pane, choose Dashboards.
- Under Dashboard Templates, choose METRIC dashboard templates, and then click DSL processing task monitoring center to view chart details.
- Filter by rule ID. The associated query and analysis statement is as follows (the returned rule IDs can be used to narrow the other charts to one rule; see the filtering sketch after this list):
SELECT DISTINCT(task_set)
- Input Lines. The associated query and analysis statement is as follows (the unit conversion logic, shared by the similar charts below, is illustrated in the unit conversion sketch after this list):
SELECT CASE
    WHEN "input" < 1000 THEN concat(cast("input" AS VARCHAR), 'Lines')
    WHEN "input" < 1000 * 1000 THEN concat(cast(round("input" / 1000, 1) AS VARCHAR), 'Thousands lines')
    WHEN "input" < 1000000000 THEN concat(cast(round("input" / 1000000.0, 1) AS VARCHAR), 'Million lines')
    WHEN "input" / 1000.0 < 1000000000 THEN concat(cast(round("input" / 1000 / 1000000.0, 1) AS VARCHAR), 'Billion lines')
    ELSE concat(cast(round("input" / 1000.0 / 1000 / 1000 / 1000, 1) AS VARCHAR), 'Trillion lines')
  END AS "total"
FROM (SELECT sum("input") AS "input")
- Output Lines. The associated query and analysis statement is:
SELECT CASE
    WHEN "output" < 1000 THEN concat(cast("output" AS VARCHAR), 'Lines')
    WHEN "output" < 1000 * 1000 THEN concat(cast(round("output" / 1000, 1) AS VARCHAR), 'Thousands lines')
    WHEN "output" < 1000000000 THEN concat(cast(round("output" / 1000000.0, 1) AS VARCHAR), 'Million lines')
    WHEN "output" / 1000.0 < 1000000000 THEN concat(cast(round("output" / 1000 / 1000000.0, 1) AS VARCHAR), 'Billion lines')
    ELSE concat(cast(round("output" / 1000.0 / 1000 / 1000 / 1000, 1) AS VARCHAR), 'Trillion lines')
  END AS "total"
FROM (SELECT sum("output") AS "output")
- Filter Criteria Met. The associated query and analysis statement is:
SELECT CASE
    WHEN "filters" < 1000 THEN concat(cast("filters" AS VARCHAR), 'Lines')
    WHEN "filters" < 1000 * 1000 THEN concat(cast(round("filters" / 1000, 1) AS VARCHAR), 'Thousands lines')
    WHEN "filters" < 1000000000 THEN concat(cast(round("filters" / 1000000.0, 1) AS VARCHAR), 'Million lines')
    WHEN "filters" / 1000.0 < 1000000000 THEN concat(cast(round("filters" / 1000 / 1000000.0, 1) AS VARCHAR), 'Billion lines')
    ELSE concat(cast(round("filters" / 1000.0 / 1000 / 1000 / 1000, 1) AS VARCHAR), 'Trillion lines')
  END AS "total"
FROM (SELECT sum("filters") AS "filters")
- Filter Criteria Unmet. The associated query and analysis statement is:
SELECT CASE
    WHEN "filter_drops" < 1000 THEN concat(cast("filter_drops" AS VARCHAR), 'Lines')
    WHEN "filter_drops" < 1000 * 1000 THEN concat(cast(round("filter_drops" / 1000, 1) AS VARCHAR), 'Thousands lines')
    WHEN "filter_drops" < 1000000000 THEN concat(cast(round("filter_drops" / 1000000.0, 1) AS VARCHAR), 'Million lines')
    WHEN "filter_drops" / 1000.0 < 1000000000 THEN concat(cast(round("filter_drops" / 1000 / 1000000.0, 1) AS VARCHAR), 'Billion lines')
    ELSE concat(cast(round("filter_drops" / 1000.0 / 1000 / 1000 / 1000, 1) AS VARCHAR), 'Trillion lines')
  END AS "total"
FROM (SELECT sum("filter_drops") AS "filter_drops")
- Sampled Lines. The associated query and analysis statement is:
SELECT CASE
    WHEN "samples" < 1000 THEN concat(cast("samples" AS VARCHAR), 'Lines')
    WHEN "samples" < 1000 * 1000 THEN concat(cast(round("samples" / 1000, 1) AS VARCHAR), 'Thousands lines')
    WHEN "samples" < 1000000000 THEN concat(cast(round("samples" / 1000000.0, 1) AS VARCHAR), 'Million lines')
    WHEN "samples" / 1000.0 < 1000000000 THEN concat(cast(round("samples" / 1000 / 1000000.0, 1) AS VARCHAR), 'Billion lines')
    ELSE concat(cast(round("samples" / 1000.0 / 1000 / 1000 / 1000, 1) AS VARCHAR), 'Trillion lines')
  END AS "total"
FROM (SELECT sum("samples") AS "samples")
- Unsampled Lines. The associated query and analysis statement is:
SELECT CASE
    WHEN "sample_drops" < 1000 THEN concat(cast("sample_drops" AS VARCHAR), 'Lines')
    WHEN "sample_drops" < 1000 * 1000 THEN concat(cast(round("sample_drops" / 1000, 1) AS VARCHAR), 'Thousands lines')
    WHEN "sample_drops" < 1000000000 THEN concat(cast(round("sample_drops" / 1000000.0, 1) AS VARCHAR), 'Million lines')
    WHEN "sample_drops" / 1000.0 < 1000000000 THEN concat(cast(round("sample_drops" / 1000 / 1000000.0, 1) AS VARCHAR), 'Billion lines')
    ELSE concat(cast(round("sample_drops" / 1000.0 / 1000 / 1000 / 1000, 1) AS VARCHAR), 'Trillion lines')
  END AS "total"
FROM (SELECT sum("sample_drops") AS "sample_drops")
- Lines Out of Log Time Range. The associated query and analysis statement is:
SELECT CASE
    WHEN "out_of_bounds" < 1000 THEN concat(cast("out_of_bounds" AS VARCHAR), 'Lines')
    WHEN "out_of_bounds" < 1000 * 1000 THEN concat(cast(round("out_of_bounds" / 1000, 1) AS VARCHAR), 'Thousands lines')
    WHEN "out_of_bounds" < 1000000000 THEN concat(cast(round("out_of_bounds" / 1000000.0, 1) AS VARCHAR), 'Million lines')
    WHEN "out_of_bounds" / 1000.0 < 1000000000 THEN concat(cast(round("out_of_bounds" / 1000 / 1000000.0, 1) AS VARCHAR), 'Billion lines')
    ELSE concat(cast(round("out_of_bounds" / 1000.0 / 1000 / 1000 / 1000, 1) AS VARCHAR), 'Trillion lines')
  END AS "total"
FROM (SELECT sum("out_of_bounds") AS "out_of_bounds")
- Execute Records(Lines). The associated query and analysis statement is:
SELECT
  TIME_FORMAT("__time", 'yyyy-MM-dd HH:mm:ss:SSS', '+08:00') AS "Log Time",
  sum("input") AS "Input",
  sum("output") AS "Output",
  sum("filters") AS "Filter Criteria Met",
  sum("filter_drops") AS "Filter Criteria Unmet",
  sum("samples") AS "Sampled",
  sum("sample_drops") AS "Unsampled",
  sum("out_of_bounds") AS "Out of Log Time Range"
GROUP BY __time
ORDER BY __time DESC
LIMIT 1000
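The Input Lines chart and the similar line-count charts above all format their totals with the same CASE expression, which converts a raw count into a readable unit (lines, thousands, millions, billions, or trillions of lines). The statement below is a minimal sketch of that conversion only, not part of the template: it substitutes a hypothetical literal count of 2500000 for the aggregated field, and under the same rounding and division rules it would return '2.5Million lines'.
-- Minimal sketch (not shipped with the template): the unit conversion used by
-- the line-count charts, applied to a hypothetical literal value of 2500000.
-- The third branch matches, so the result is '2.5Million lines'.
SELECT CASE
    WHEN "input" < 1000 THEN concat(cast("input" AS VARCHAR), 'Lines')
    WHEN "input" < 1000 * 1000 THEN concat(cast(round("input" / 1000, 1) AS VARCHAR), 'Thousands lines')
    WHEN "input" < 1000000000 THEN concat(cast(round("input" / 1000000.0, 1) AS VARCHAR), 'Million lines')
    WHEN "input" / 1000.0 < 1000000000 THEN concat(cast(round("input" / 1000 / 1000000.0, 1) AS VARCHAR), 'Billion lines')
    ELSE concat(cast(round("input" / 1000.0 / 1000 / 1000 / 1000, 1) AS VARCHAR), 'Trillion lines')
  END AS "total"
FROM (SELECT 2500000 AS "input")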
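The rule IDs returned by the Filter by rule ID statement are what the dashboard uses to narrow the other charts to a single metric generation rule. As a minimal sketch, assuming the task logs expose the rule ID in the task_set field and that the query dialect accepts a WHERE clause in this position, a filtered variant of the execution records query could look like the following (the rule ID value is a hypothetical placeholder):
-- Minimal sketch (assumption, not shipped with the template): execution records
-- restricted to one metric generation rule by filtering on task_set.
SELECT
  TIME_FORMAT("__time", 'yyyy-MM-dd HH:mm:ss:SSS', '+08:00') AS "Log Time",
  sum("input") AS "Input",
  sum("output") AS "Output"
WHERE task_set = 'example-rule-id'
GROUP BY __time
ORDER BY __time DESC
LIMIT 100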