DSL Dashboard Template
LTS provides DSL processing for one-stop log processing. Using a domain-specific language (DSL) and more than 200 built-in functions, you can implement end-to-end log processing tasks on the LTS console, such as log normalization, enrichment, transfer, anonymization, and filtering.
LTS provides the DSL processing task monitoring center dashboard template, which displays information such as the processing task ID and name and the numbers of input and output lines.
Prerequisites
- A DSL processing task has been created.
- Logs have been structured. For details, see Setting Cloud Structuring Parsing.
DSL Processing Task Monitoring Center
- Log in to the LTS console. In the navigation pane, choose Dashboards.
- Choose DSL dashboard templates under Dashboard Templates and click DSL processing task monitoring center to view the chart details.
- Filter by processing task ID (see the sketch after this list for how the filter fields relate to the chart metrics). The associated query and analysis statement is:
select distinct(task_id)
- Filter by processing task name. The associated query and analysis statement is:
select distinct(task_name)
- Input Lines. The associated query and analysis statement is:
SELECT CASE WHEN "input" < 1000 THEN concat( cast( "input" AS VARCHAR ), 'Lines' ) WHEN "input" < 1000 * 1000 THEN concat( cast( round( "input"/ 1000, 1 ) AS VARCHAR ), 'Thousands lines' ) WHEN "input" < 1000000000 THEN concat( cast( round( "input"/ 1000000.0, 1 ) AS VARCHAR ), 'Million lines' ) WHEN "input"/ 1000.0 < 1000000000 THEN concat( cast( round( "input"/ 1000 / 1000000.0, 1 ) AS VARCHAR ), 'Billion lines' ) ELSE concat( cast( round( "input"/ 1000.0 / 1000 / 1000 / 1000, 1 ) AS VARCHAR ), 'Trillion lines' ) END AS "total" from (select sum("process.accept") as "input")
- Output Lines. The associated query and analysis statement is:
SELECT CASE WHEN "delivered" < 1000 THEN concat( cast( "delivered" AS VARCHAR ), 'Lines' ) WHEN "delivered" < 1000 * 1000 THEN concat( cast( round( "delivered"/ 1000, 1 ) AS VARCHAR ), 'Thousands lines' ) WHEN "delivered" < 1000000000 THEN concat( cast( round( "delivered"/ 1000000.0, 1 ) AS VARCHAR ), 'Million lines' ) WHEN "delivered"/ 1000.0 < 1000000000 THEN concat( cast( round( "delivered"/ 1000 / 1000000.0, 1 ) AS VARCHAR ), 'Billion lines' ) ELSE concat( cast( round( "delivered"/ 1000.0 / 1000 / 1000 / 1000, 1 ) AS VARCHAR ), 'Trillion lines' ) END AS "total" from (select sum("process.delivered") as "delivered")
- Filtered Lines. The associated query and analysis statement is:
SELECT CASE WHEN "drop" < 1000 THEN concat( cast( "drop" AS VARCHAR ), 'Lines' ) WHEN "drop" < 1000 * 1000 THEN concat( cast( round( "drop"/ 1000, 1 ) AS VARCHAR ), 'Thousands lines' ) WHEN "drop" < 1000000000 THEN concat( cast( round( "drop"/ 1000000.0, 1 ) AS VARCHAR ), 'Million lines' ) WHEN "drop"/ 1000.0 < 1000000000 THEN concat( cast( round( "drop"/ 1000 / 1000000.0, 1 ) AS VARCHAR ), 'Billion lines' ) ELSE concat( cast( round( "drop"/ 1000.0 / 1000 / 1000 / 1000, 1 ) AS VARCHAR ), 'Trillion lines' ) END AS "total" from (select sum("process.drop") as "drop")
- Failed Lines. The associated query and analysis statement is:
SELECT CASE WHEN "failed" < 1000 THEN concat( cast( "failed" AS VARCHAR ), 'Lines' ) WHEN "failed" < 1000 * 1000 THEN concat( cast( round( "failed"/ 1000, 1 ) AS VARCHAR ), 'Thousands lines' ) WHEN "failed" < 1000000000 THEN concat( cast( round( "failed"/ 1000000.0, 1 ) AS VARCHAR ), 'Million lines' ) WHEN "failed"/ 1000.0 < 1000000000 THEN concat( cast( round( "failed"/ 1000 / 1000000.0, 1 ) AS VARCHAR ), 'Billion lines' ) ELSE concat( cast( round( "failed"/ 1000.0 / 1000 / 1000 / 1000, 1 ) AS VARCHAR ), 'Trillion lines' ) END AS "total" from (select sum("process.failed") as "failed")
- Execution Records. The associated query and analysis statement is:
select TIME_FORMAT( MILLIS_TO_TIMESTAMP("start"), 'yyyy-MM-dd HH:mm:ss:SSS', '+08:00') as "Started",TIME_FORMAT( MILLIS_TO_TIMESTAMP("end"), 'yyyy-MM-dd HH:mm:ss:SSS', '+08:00') as "Ended", "process.accept" as "Input Lines", "process.delivered" as "Output Lines", "process.drop" as "Filtered Lines", "process.failed" as "Failed Lines" limit 1000