Updated on 2024-11-18 GMT+08:00

DSL Dashboard Template

LTS provides DSL processing for one-stop log processing. With a domain-specific scripting language and more than 200 built-in functions, you can implement end-to-end log processing tasks on the LTS console, such as log normalization, enrichment, transfer, anonymization, and filtering.

LTS provides the DSL processing task monitoring center dashboard template, which displays information such as the processing task ID and name, and the numbers of input and output lines.

Prerequisites

DSL Processing Task Monitoring Center

  1. Log in to the LTS console. In the navigation pane, choose Dashboards.
  2. Choose DSL dashboard templates under Dashboard Templates and click DSL processing task monitoring center to view the chart details.

    • Filter by processing task ID. The associated query and analysis statement is:
      select distinct(task_id)
    • Filter by processing task name. The associated query and analysis statement is:
      select distinct(task_name)
    • Input Lines. The associated query and analysis statement converts the raw line count into a readable unit (a worked example of this conversion follows the list). The statement is:
      SELECT
        CASE
          WHEN "input" < 1000 THEN concat( cast( "input" AS VARCHAR ), 'Lines' )
          WHEN "input" < 1000 * 1000 THEN concat( cast( round( "input"/ 1000, 1 ) AS VARCHAR ), 'Thousands lines' )
          WHEN "input" < 1000000000 THEN concat( cast( round( "input"/ 1000000.0, 1 ) AS VARCHAR ), 'Million lines' )
          WHEN "input"/ 1000.0 < 1000000000 THEN concat( cast( round( "input"/ 1000 / 1000000.0, 1 ) AS VARCHAR ), 'Billion lines' )
          ELSE concat( cast( round( "input"/ 1000.0 / 1000 / 1000 / 1000, 1 ) AS VARCHAR ), 'Trillion lines' )
        END AS "total"
      from (select sum("process.accept") as "input")
    • Output Lines. The associated query and analysis statement is:
      SELECT
        CASE
          WHEN "delivered" < 1000 THEN concat( cast( "delivered" AS VARCHAR ), 'Lines' )
          WHEN "delivered" < 1000 * 1000 THEN concat( cast( round( "delivered"/ 1000, 1 ) AS VARCHAR ), 'Thousands lines' )
          WHEN "delivered" < 1000000000 THEN concat( cast( round( "delivered"/ 1000000.0, 1 ) AS VARCHAR ), 'Million lines' )
          WHEN "delivered"/ 1000.0 < 1000000000 THEN concat( cast( round( "delivered"/ 1000 / 1000000.0, 1 ) AS VARCHAR ), 'Billion lines' )
          ELSE concat( cast( round( "delivered"/ 1000.0 / 1000 / 1000 / 1000, 1 ) AS VARCHAR ), 'Trillion lines' )
        END AS "total"
      from (select sum("process.delivered") as "delivered")
    • Filtered Lines. The associated query and analysis statement is:
      SELECT
        CASE
          WHEN "drop" < 1000 THEN concat( cast( "drop" AS VARCHAR ), 'Lines' )
          WHEN "drop" < 1000 * 1000 THEN concat( cast( round( "drop"/ 1000, 1 ) AS VARCHAR ), 'Thousands lines' )
          WHEN "drop" < 1000000000 THEN concat( cast( round( "drop"/ 1000000.0, 1 ) AS VARCHAR ), 'Million lines' )
          WHEN "drop"/ 1000.0 < 1000000000 THEN concat( cast( round( "drop"/ 1000 / 1000000.0, 1 ) AS VARCHAR ), 'Billion lines' )
          ELSE concat( cast( round( "drop"/ 1000.0 / 1000 / 1000 / 1000, 1 ) AS VARCHAR ), 'Trillion lines' )
        END AS "total"
      from (select sum("process.drop") as "drop")
    • Failed Lines. The associated query and analysis statement is:
      SELECT
        CASE
          WHEN "failed" < 1000 THEN concat( cast( "failed" AS VARCHAR ), 'Lines' )
          WHEN "failed" < 1000 * 1000 THEN concat( cast( round( "failed"/ 1000, 1 ) AS VARCHAR ), 'Thousands lines' )
          WHEN "failed" < 1000000000 THEN concat( cast( round( "failed"/ 1000000.0, 1 ) AS VARCHAR ), 'Million lines' )
          WHEN "failed"/ 1000.0 < 1000000000 THEN concat( cast( round( "failed"/ 1000 / 1000000.0, 1 ) AS VARCHAR ), 'Billion lines' )
          ELSE concat( cast( round( "failed"/ 1000.0 / 1000 / 1000 / 1000, 1 ) AS VARCHAR ), 'Trillion lines' )
        END AS "total"
      from (select sum("process.failed") as "failed")
    • Execution Records. The associated query and analysis statement (a note on the timestamp formatting follows the list) is:
      select
        TIME_FORMAT( MILLIS_TO_TIMESTAMP("start"), 'yyyy-MM-dd HH:mm:ss:SSS', '+08:00') as "Started",
        TIME_FORMAT( MILLIS_TO_TIMESTAMP("end"), 'yyyy-MM-dd HH:mm:ss:SSS', '+08:00') as "Ended",
        "process.accept" as "Input Lines",
        "process.delivered" as "Output Lines",
        "process.drop" as "Filtered Lines",
        "process.failed" as "Failed Lines"
      limit 1000
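
    The CASE expression shared by the Input Lines, Output Lines, Filtered Lines, and Failed Lines statements picks a divisor based on the magnitude of the aggregated count and appends a unit label. As a rough sketch of how a value flows through it (the literal 2500000 is a hypothetical count standing in for sum("process.accept"); only the first three buckets are shown, the rest follow the same pattern):

      -- Illustrative only: 2500000 is a made-up count, not real task data.
      SELECT
        CASE
          WHEN 2500000 < 1000 THEN concat( cast( 2500000 AS VARCHAR ), 'Lines' )
          WHEN 2500000 < 1000 * 1000 THEN concat( cast( round( 2500000 / 1000, 1 ) AS VARCHAR ), 'Thousands lines' )
          WHEN 2500000 < 1000000000 THEN concat( cast( round( 2500000 / 1000000.0, 1 ) AS VARCHAR ), 'Million lines' )
        END AS "total"
      -- 2500000 falls into the third bucket: round(2500000 / 1000000.0, 1) = 2.5,
      -- so "total" is '2.5Million lines'.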
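    In the Execution Records statement, "start" and "end" are millisecond timestamps: MILLIS_TO_TIMESTAMP converts the raw value and TIME_FORMAT renders it in UTC+08:00 with the given pattern. A minimal sketch with a hypothetical literal timestamp (not taken from any real task run):

      -- Illustrative only: 1700000000000 ms stands in for "start".
      select TIME_FORMAT( MILLIS_TO_TIMESTAMP(1700000000000), 'yyyy-MM-dd HH:mm:ss:SSS', '+08:00') as "Started"
      -- 1700000000000 ms is 2023-11-14 22:13:20 UTC, so this renders as
      -- '2023-11-15 06:13:20:000' in UTC+08:00.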